Dec 02 22:57:40 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 02 22:57:40 crc restorecon[4707]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 22:57:40 crc restorecon[4707]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 22:57:40 crc restorecon[4707]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 22:57:40 crc restorecon[4707]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 22:57:40 crc restorecon[4707]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to
system_u:object_r:container_file_t:s0:c336,c787 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 22:57:40 crc restorecon[4707]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: 
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 
22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 22:57:40 crc 
restorecon[4707]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:57:40 crc restorecon[4707]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 02 22:57:40 crc restorecon[4707]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 02 22:57:40 crc restorecon[4707]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 22:57:40 crc restorecon[4707]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 
22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:40 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 
22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc 
restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:57:41 crc restorecon[4707]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 02 22:57:41 crc restorecon[4707]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 02 22:57:41 crc restorecon[4707]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Dec 02 22:57:41 crc kubenswrapper[4903]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 02 22:57:41 crc kubenswrapper[4903]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Dec 02 22:57:41 crc kubenswrapper[4903]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 02 22:57:41 crc kubenswrapper[4903]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
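The long run of "not reset as customized by admin" messages above is restorecon skipping files whose SELinux type (container_file_t here) the policy lists as customizable, so locally assigned contexts, including the per-pod MCS category pairs such as s0:c7,c13 applied by the container runtime, survive the relabel. A minimal sketch of inspecting and, only if genuinely intended, force-resetting such paths; the target path is illustrative rather than taken from this node:

    # Dry run: report what a relabel would change while still skipping customizable types
    restorecon -Rvn /var/lib/kubelet

    # Force a reset even for customizable types such as container_file_t
    restorecon -RvF /var/lib/kubelet

Forcing the reset on live pod volumes would strip the MCS categories that isolate pods from one another, so -F is normally reserved for deliberate full-system relabels.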
Dec 02 22:57:41 crc kubenswrapper[4903]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 02 22:57:41 crc kubenswrapper[4903]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.429128 4903 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431799 4903 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431816 4903 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431822 4903 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431828 4903 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431832 4903 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431836 4903 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431840 4903 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431845 4903 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431849 4903 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431853 4903 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431857 4903 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431869 4903 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431873 4903 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431877 4903 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431881 4903 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431885 4903 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431890 4903 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431894 4903 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431898 4903 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431902 4903 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 02 22:57:41 crc kubenswrapper[4903]: 
W1202 22:57:41.431906 4903 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431910 4903 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431914 4903 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431919 4903 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431922 4903 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431927 4903 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431931 4903 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431935 4903 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431939 4903 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431944 4903 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431947 4903 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431950 4903 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431954 4903 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431958 4903 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431961 4903 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431965 4903 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431968 4903 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431972 4903 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431975 4903 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431979 4903 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431983 4903 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431986 4903 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431989 4903 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431994 4903 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
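The long run of "unrecognized feature gate" warnings above is expected on OpenShift: the cluster hands its full set of OpenShift-level gates (GatewayAPI, NewOLM, PinnedImages, and so on) to the kubelet, which only registers upstream Kubernetes gates, so it warns and skips the unknown names, while known gates that are already GA or deprecated (ValidatingAdmissionPolicy, KMSv1) are applied but draw a removal warning. A minimal Go sketch of that dispatch follows; it is not the real k8s.io/component-base/featuregate code, and the names knownGates/applyGates plus the tiny gate table are hypothetical.

```go
// Minimal sketch of the gate dispatch behind the warnings above.
// NOT the real k8s.io/component-base/featuregate implementation.
package main

import "log"

type gateSpec struct {
	def        bool
	prerelease string // "Beta", "GA", "Deprecated", ...
}

// A few upstream gates the kubelet actually knows about.
var knownGates = map[string]gateSpec{
	"CloudDualStackNodeIPs": {def: true, prerelease: "GA"},
	"KMSv1":                 {def: false, prerelease: "Deprecated"},
	"NodeSwap":              {def: false, prerelease: "Beta"},
}

func applyGates(requested map[string]bool) map[string]bool {
	enabled := make(map[string]bool, len(knownGates))
	for name, spec := range knownGates {
		enabled[name] = spec.def
	}
	for name, val := range requested {
		spec, ok := knownGates[name]
		if !ok {
			// The case logged at feature_gate.go:330 above.
			log.Printf("unrecognized feature gate: %s", name)
			continue
		}
		// The cases logged at feature_gate.go:351/353 above.
		switch spec.prerelease {
		case "GA":
			log.Printf("Setting GA feature gate %s=%t. It will be removed in a future release.", name, val)
		case "Deprecated":
			log.Printf("Setting deprecated feature gate %s=%t. It will be removed in a future release.", name, val)
		}
		enabled[name] = val
	}
	return enabled
}

func main() {
	applyGates(map[string]bool{
		"CloudDualStackNodeIPs": true, // known GA gate: applied, with a removal warning
		"GatewayAPI":            true, // OpenShift-level gate: unrecognized here, ignored
	})
}
```

Nothing fails on an unknown name; the gate is simply skipped, which is why the node keeps booting despite the seventy-odd warnings per pass.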
Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.431998 4903 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.432003 4903 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.432008 4903 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.432012 4903 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.432016 4903 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.432020 4903 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.432024 4903 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.432028 4903 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.432033 4903 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.432036 4903 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.432040 4903 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.432044 4903 feature_gate.go:330] unrecognized feature gate: Example Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.432048 4903 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.432051 4903 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.432055 4903 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.432058 4903 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.432062 4903 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.432066 4903 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.432069 4903 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.432072 4903 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.432076 4903 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.432080 4903 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.432083 4903 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.432086 4903 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.432090 4903 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.432093 
4903 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.432096 4903 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432360 4903 flags.go:64] FLAG: --address="0.0.0.0" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432371 4903 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432380 4903 flags.go:64] FLAG: --anonymous-auth="true" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432386 4903 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432392 4903 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432396 4903 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432401 4903 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432407 4903 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432411 4903 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432415 4903 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432420 4903 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432424 4903 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432430 4903 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432436 4903 flags.go:64] FLAG: --cgroup-root="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432440 4903 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432444 4903 flags.go:64] FLAG: --client-ca-file="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432448 4903 flags.go:64] FLAG: --cloud-config="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432452 4903 flags.go:64] FLAG: --cloud-provider="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432455 4903 flags.go:64] FLAG: --cluster-dns="[]" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432460 4903 flags.go:64] FLAG: --cluster-domain="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432464 4903 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432468 4903 flags.go:64] FLAG: --config-dir="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432472 4903 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432476 4903 flags.go:64] FLAG: --container-log-max-files="5" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432481 4903 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432485 4903 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432489 4903 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432493 4903 
flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432498 4903 flags.go:64] FLAG: --contention-profiling="false" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432502 4903 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432506 4903 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432510 4903 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432514 4903 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432519 4903 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432523 4903 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432527 4903 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432531 4903 flags.go:64] FLAG: --enable-load-reader="false" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432536 4903 flags.go:64] FLAG: --enable-server="true" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432540 4903 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432545 4903 flags.go:64] FLAG: --event-burst="100" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432549 4903 flags.go:64] FLAG: --event-qps="50" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432553 4903 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432557 4903 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432561 4903 flags.go:64] FLAG: --eviction-hard="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432566 4903 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432571 4903 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432575 4903 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432579 4903 flags.go:64] FLAG: --eviction-soft="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432584 4903 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432587 4903 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432591 4903 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432596 4903 flags.go:64] FLAG: --experimental-mounter-path="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432600 4903 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432604 4903 flags.go:64] FLAG: --fail-swap-on="true" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432608 4903 flags.go:64] FLAG: --feature-gates="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432613 4903 flags.go:64] FLAG: --file-check-frequency="20s" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432617 4903 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432621 4903 flags.go:64] FLAG: 
--hairpin-mode="promiscuous-bridge" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432625 4903 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432629 4903 flags.go:64] FLAG: --healthz-port="10248" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432633 4903 flags.go:64] FLAG: --help="false" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432637 4903 flags.go:64] FLAG: --hostname-override="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432641 4903 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432645 4903 flags.go:64] FLAG: --http-check-frequency="20s" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432662 4903 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432666 4903 flags.go:64] FLAG: --image-credential-provider-config="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432670 4903 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432674 4903 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432678 4903 flags.go:64] FLAG: --image-service-endpoint="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432682 4903 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432686 4903 flags.go:64] FLAG: --kube-api-burst="100" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432690 4903 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432694 4903 flags.go:64] FLAG: --kube-api-qps="50" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432698 4903 flags.go:64] FLAG: --kube-reserved="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432702 4903 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432705 4903 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432709 4903 flags.go:64] FLAG: --kubelet-cgroups="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432715 4903 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432719 4903 flags.go:64] FLAG: --lock-file="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432723 4903 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432726 4903 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432730 4903 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432741 4903 flags.go:64] FLAG: --log-json-split-stream="false" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432746 4903 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432749 4903 flags.go:64] FLAG: --log-text-split-stream="false" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432753 4903 flags.go:64] FLAG: --logging-format="text" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432757 4903 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432762 4903 flags.go:64] FLAG: 
--make-iptables-util-chains="true" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432765 4903 flags.go:64] FLAG: --manifest-url="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432769 4903 flags.go:64] FLAG: --manifest-url-header="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432774 4903 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432779 4903 flags.go:64] FLAG: --max-open-files="1000000" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432783 4903 flags.go:64] FLAG: --max-pods="110" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432788 4903 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432792 4903 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432796 4903 flags.go:64] FLAG: --memory-manager-policy="None" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432800 4903 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432804 4903 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432808 4903 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432812 4903 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432821 4903 flags.go:64] FLAG: --node-status-max-images="50" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432825 4903 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432829 4903 flags.go:64] FLAG: --oom-score-adj="-999" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432833 4903 flags.go:64] FLAG: --pod-cidr="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432837 4903 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432843 4903 flags.go:64] FLAG: --pod-manifest-path="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432847 4903 flags.go:64] FLAG: --pod-max-pids="-1" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432851 4903 flags.go:64] FLAG: --pods-per-core="0" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432855 4903 flags.go:64] FLAG: --port="10250" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432860 4903 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432864 4903 flags.go:64] FLAG: --provider-id="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432869 4903 flags.go:64] FLAG: --qos-reserved="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432873 4903 flags.go:64] FLAG: --read-only-port="10255" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432877 4903 flags.go:64] FLAG: --register-node="true" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432881 4903 flags.go:64] FLAG: --register-schedulable="true" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432886 4903 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432892 4903 flags.go:64] FLAG: 
--registry-burst="10" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432896 4903 flags.go:64] FLAG: --registry-qps="5" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432901 4903 flags.go:64] FLAG: --reserved-cpus="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432905 4903 flags.go:64] FLAG: --reserved-memory="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432910 4903 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432914 4903 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432918 4903 flags.go:64] FLAG: --rotate-certificates="false" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432923 4903 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432927 4903 flags.go:64] FLAG: --runonce="false" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432931 4903 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432935 4903 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432939 4903 flags.go:64] FLAG: --seccomp-default="false" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432944 4903 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432947 4903 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432952 4903 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432956 4903 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432960 4903 flags.go:64] FLAG: --storage-driver-password="root" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432965 4903 flags.go:64] FLAG: --storage-driver-secure="false" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432969 4903 flags.go:64] FLAG: --storage-driver-table="stats" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432973 4903 flags.go:64] FLAG: --storage-driver-user="root" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432977 4903 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432981 4903 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432985 4903 flags.go:64] FLAG: --system-cgroups="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432990 4903 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.432996 4903 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.433000 4903 flags.go:64] FLAG: --tls-cert-file="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.433004 4903 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.433009 4903 flags.go:64] FLAG: --tls-min-version="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.433013 4903 flags.go:64] FLAG: --tls-private-key-file="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.433017 4903 flags.go:64] FLAG: --topology-manager-policy="none" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.433021 4903 flags.go:64] FLAG: --topology-manager-policy-options="" 
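The FLAG: dump (which continues just below with --topology-manager-scope and the remaining entries) is the kubelet logging every parsed command-line flag at flags.go:64. The pattern is easy to reproduce with the standard library; the kubelet itself uses the pflag package, so the sketch below is an illustration of the logging pattern, not the kubelet's code.

```go
// Sketch of the `FLAG: --name="value"` dump pattern seen above,
// using the standard flag package rather than pflag.
package main

import (
	"flag"
	"log"
)

func main() {
	maxPods := flag.Int("max-pods", 110, "maximum pods per node")
	nodeIP := flag.String("node-ip", "", "node IP address")
	flag.Parse()
	_, _ = maxPods, nodeIP

	// Visit every registered flag and log it in the same
	// quoted style as the journal entries: FLAG: --max-pods="110"
	flag.VisitAll(func(f *flag.Flag) {
		log.Printf("FLAG: --%s=%q", f.Name, f.Value.String())
	})
}
```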
Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.433025 4903 flags.go:64] FLAG: --topology-manager-scope="container" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.433029 4903 flags.go:64] FLAG: --v="2" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.433035 4903 flags.go:64] FLAG: --version="false" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.433040 4903 flags.go:64] FLAG: --vmodule="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.433045 4903 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.433049 4903 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433154 4903 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433161 4903 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433165 4903 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433169 4903 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433175 4903 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433179 4903 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433183 4903 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433187 4903 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433191 4903 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433194 4903 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433198 4903 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433201 4903 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433205 4903 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433208 4903 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433212 4903 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433215 4903 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433219 4903 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433222 4903 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433226 4903 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433230 4903 feature_gate.go:330] unrecognized feature gate: 
RouteAdvertisements Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433233 4903 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433237 4903 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433241 4903 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433244 4903 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433248 4903 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433251 4903 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433255 4903 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433258 4903 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433262 4903 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433265 4903 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433268 4903 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433272 4903 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433275 4903 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433279 4903 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433283 4903 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433288 4903 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433292 4903 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433297 4903 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433301 4903 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433304 4903 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433309 4903 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
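Each warning burst repeats the same set of gate names; the kubelet applies the gate map more than once during startup (the timestamps .4318xx, .4331xx, .4456xx, .4465xx mark four passes in this boot), so the same warnings recur. A small stdin filter, purely illustrative, tallies how often each unrecognized gate appears in a dump like this one (e.g. journalctl -u kubelet | go run tally.go):

```go
// Tally "unrecognized feature gate" warnings from a journal dump on stdin.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

func main() {
	re := regexp.MustCompile(`unrecognized feature gate: (\S+)`)
	counts := map[string]int{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 1024*1024), 1024*1024) // allow very long wrapped lines
	for sc.Scan() {
		for _, m := range re.FindAllStringSubmatch(sc.Text(), -1) {
			counts[m[1]]++
		}
	}
	for gate, n := range counts {
		fmt.Printf("%-50s %d\n", gate, n)
	}
}
```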
Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433313 4903 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433316 4903 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433320 4903 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433323 4903 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433327 4903 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433331 4903 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433334 4903 feature_gate.go:330] unrecognized feature gate: Example Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433338 4903 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433341 4903 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433346 4903 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433350 4903 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433354 4903 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433358 4903 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433362 4903 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433365 4903 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433369 4903 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433373 4903 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433376 4903 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433380 4903 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433384 4903 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433387 4903 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433391 4903 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433394 4903 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433398 4903 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433401 4903 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433405 4903 feature_gate.go:330] unrecognized feature gate: 
AdminNetworkPolicy Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433409 4903 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433412 4903 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433420 4903 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.433424 4903 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.433429 4903 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.445484 4903 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.445532 4903 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.445694 4903 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.445708 4903 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.445716 4903 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.445726 4903 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.445735 4903 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.445743 4903 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.445750 4903 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.445758 4903 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.445766 4903 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.445774 4903 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.445784 4903 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.445792 4903 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.445800 4903 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.445807 4903 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.445815 4903 feature_gate.go:330] unrecognized feature gate: Example Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.445824 4903 
feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.445831 4903 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.445839 4903 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.445847 4903 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.445855 4903 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.445863 4903 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.445871 4903 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.445881 4903 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.445894 4903 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.445903 4903 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.445911 4903 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.445919 4903 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.445927 4903 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.445934 4903 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.445945 4903 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.445955 4903 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.445966 4903 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.445975 4903 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.445983 4903 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.445991 4903 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.445998 4903 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446006 4903 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446013 4903 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446021 4903 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446029 4903 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446039 4903 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446050 4903 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446058 4903 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446066 4903 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446074 4903 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446082 4903 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446090 4903 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446098 4903 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446106 4903 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446113 4903 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446121 4903 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446129 4903 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446136 4903 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446144 4903 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446152 4903 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446160 4903 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 02 22:57:41 crc 
kubenswrapper[4903]: W1202 22:57:41.446169 4903 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446178 4903 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446186 4903 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446194 4903 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446203 4903 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446211 4903 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446219 4903 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446227 4903 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446235 4903 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446243 4903 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446251 4903 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446258 4903 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446266 4903 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446273 4903 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446281 4903 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.446295 4903 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446515 4903 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446527 4903 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446535 4903 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446544 4903 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446553 4903 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446561 4903 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446570 4903 feature_gate.go:330] unrecognized 
feature gate: VSphereDriverConfiguration Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446578 4903 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446586 4903 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446595 4903 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446603 4903 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446612 4903 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446619 4903 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446627 4903 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446635 4903 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446644 4903 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446675 4903 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446683 4903 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446690 4903 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446699 4903 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446706 4903 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446714 4903 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446722 4903 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446730 4903 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446738 4903 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446746 4903 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446754 4903 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446792 4903 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446800 4903 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446808 4903 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446816 4903 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446824 4903 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446831 4903 
feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446842 4903 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446852 4903 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446861 4903 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446869 4903 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446878 4903 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446886 4903 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446894 4903 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446902 4903 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446910 4903 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446917 4903 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446925 4903 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446932 4903 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446942 4903 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446953 4903 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446963 4903 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446973 4903 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
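After each pass the kubelet logs the resolved map at feature_gate.go:386; the same fifteen-entry map appears three times in this boot. Flattened into one map[...] blob it is hard to scan, so here is the same data printed one gate per line, with the values copied verbatim from the log:

```go
// Pretty-print the resolved feature-gate map logged at feature_gate.go:386.
package main

import (
	"fmt"
	"sort"
)

func main() {
	gates := map[string]bool{ // values copied from the log above
		"CloudDualStackNodeIPs":                  true,
		"DisableKubeletCloudCredentialProviders": true,
		"DynamicResourceAllocation":              false,
		"EventedPLEG":                            false,
		"KMSv1":                                  true,
		"MaxUnavailableStatefulSet":              false,
		"NodeSwap":                               false,
		"ProcMountType":                          false,
		"RouteExternalCertificate":               false,
		"ServiceAccountTokenNodeBinding":         false,
		"TranslateStreamCloseWebsocketRequests":  false,
		"UserNamespacesPodSecurityStandards":     false,
		"UserNamespacesSupport":                  false,
		"ValidatingAdmissionPolicy":              true,
		"VolumeAttributesClass":                  false,
	}
	names := make([]string, 0, len(gates))
	for n := range gates {
		names = append(names, n)
	}
	sort.Strings(names)
	for _, n := range names {
		fmt.Printf("%-42s %t\n", n, gates[n])
	}
}
```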
Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446983 4903 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.446993 4903 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.447002 4903 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.447010 4903 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.447018 4903 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.447027 4903 feature_gate.go:330] unrecognized feature gate: Example Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.447036 4903 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.447043 4903 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.447051 4903 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.447059 4903 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.447066 4903 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.447074 4903 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.447082 4903 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.447089 4903 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.447097 4903 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.447104 4903 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.447112 4903 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.447122 4903 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.447132 4903 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.447142 4903 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.447151 4903 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.447160 4903 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.447172 4903 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.447646 4903 server.go:940] "Client rotation is on, will bootstrap in background" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.454213 4903 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.454471 4903 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.455780 4903 server.go:997] "Starting client certificate rotation" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.455815 4903 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.456030 4903 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-15 05:59:50.996061285 +0000 UTC Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.456136 4903 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.463955 4903 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 02 22:57:41 crc kubenswrapper[4903]: E1202 22:57:41.465869 4903 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.39:6443: connect: connection refused" logger="UnhandledError" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.466715 4903 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.478893 4903 log.go:25] "Validated CRI v1 runtime API" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.500598 4903 log.go:25] "Validated CRI v1 image API" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.502643 4903 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.506465 4903 
fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-02-22-52-59-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.506518 4903 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:41 fsType:tmpfs blockSize:0}] Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.532403 4903 manager.go:217] Machine: {Timestamp:2025-12-02 22:57:41.53001059 +0000 UTC m=+0.238564953 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:3787324b-4c61-413b-8321-e9e2f283e2ad BootID:5eef24b0-10c3-4ee6-bf4d-784c2d2e5050 Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:41 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:5a:21:fd Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:5a:21:fd Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:d4:e7:5f Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:18:81:6a Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:2f:80:3a Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:b6:71:f2 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:ce:2f:61:85:d7:6b Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:d6:ad:66:a3:47:62 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 
Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.532828 4903 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
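
[Annotation] The fs.go:133/134 and manager.go:217 records above are cAdvisor's one-time machine inventory, taken before any pod syncs; the per-filesystem Capacity and Inodes figures are effectively the statfs(2) view of each mountpoint. A minimal sketch of the same arithmetic, assuming Linux and hard-coding a few mountpoints copied from the fs.go:134 record (not cAdvisor's actual discovery code):

package main

import (
	"fmt"
	"syscall"
)

func main() {
	// Mountpoints taken from the fs.go:134 record above.
	for _, mp := range []string{"/var", "/boot", "/run", "/tmp"} {
		var st syscall.Statfs_t
		if err := syscall.Statfs(mp, &st); err != nil {
			fmt.Printf("%s: %v\n", mp, err)
			continue
		}
		// Capacity and inode counts as cAdvisor reports them:
		// Capacity = block size * total blocks, Inodes = total inodes.
		capacity := uint64(st.Bsize) * st.Blocks
		fmt.Printf("%s capacity=%d inodes=%d hasInodes=%v\n",
			mp, capacity, st.Files, st.Files > 0)
	}
}

Run on the node itself, /var should report roughly the 85292941312-byte capacity shown for /dev/vda4 above. The following manager_no_libpfm.go:29 record just notes that this build cannot expose perf counters.
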
Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.533016 4903 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.533800 4903 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.534114 4903 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.534168 4903 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.534491 4903 topology_manager.go:138] "Creating topology manager with none policy"
Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.534520 4903 container_manager_linux.go:303] "Creating device plugin manager"
Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.534797 4903 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.534986 4903 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.535203 4903 state_mem.go:36] "Initialized new in-memory state store"
Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.535521 4903 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.536848 4903 kubelet.go:418] "Attempting to sync node with API server"
Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.536880 4903 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
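
[Annotation] The container_manager_linux.go:272 nodeConfig record above pins SystemReserved (cpu 200m, memory and ephemeral-storage 350Mi each), leaves KubeReserved null, and sets a hard eviction threshold of memory.available < 100Mi. Under the upstream Node Allocatable formula (Allocatable = Capacity - KubeReserved - SystemReserved - HardEvictionThreshold), combining these with the MemoryCapacity from the manager.go:217 record gives the memory this node can offer to pods; a worked sketch using those logged values:

package main

import "fmt"

const Mi = 1 << 20

func main() {
	capacity := uint64(33654128640)  // MemoryCapacity from manager.go:217
	systemReserved := uint64(350 * Mi) // SystemReserved.memory in nodeConfig
	kubeReserved := uint64(0)          // KubeReserved:null in nodeConfig
	evictionHard := uint64(100 * Mi)   // memory.available hard threshold

	// Upstream Node Allocatable formula:
	// Allocatable = Capacity - KubeReserved - SystemReserved - EvictionHard.
	allocatable := capacity - kubeReserved - systemReserved - evictionHard
	fmt.Printf("allocatable memory: %d bytes (~%.2f GiB)\n",
		allocatable, float64(allocatable)/(1<<30))
}

That works out to 33182269440 bytes (about 30.9 GiB), which should match what the node later advertises in status.allocatable and explains why it is smaller than status.capacity.
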
Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.536996 4903 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.537019 4903 kubelet.go:324] "Adding apiserver pod source" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.537038 4903 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.541282 4903 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.543105 4903 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.544004 4903 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.39:6443: connect: connection refused Dec 02 22:57:41 crc kubenswrapper[4903]: E1202 22:57:41.544194 4903 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.39:6443: connect: connection refused" logger="UnhandledError" Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.544022 4903 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.39:6443: connect: connection refused Dec 02 22:57:41 crc kubenswrapper[4903]: E1202 22:57:41.544296 4903 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.39:6443: connect: connection refused" logger="UnhandledError" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.544505 4903 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.545416 4903 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.545493 4903 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.545519 4903 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.545533 4903 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.545555 4903 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.545570 4903 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.545585 4903 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.545606 4903 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/downward-api" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.545621 4903 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.545635 4903 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.545685 4903 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.545699 4903 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.546246 4903 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.546946 4903 server.go:1280] "Started kubelet" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.547220 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.39:6443: connect: connection refused Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.547367 4903 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.547366 4903 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.548382 4903 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 02 22:57:41 crc systemd[1]: Started Kubernetes Kubelet. Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.550354 4903 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.550414 4903 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.551041 4903 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 21:42:36.427894673 +0000 UTC Dec 02 22:57:41 crc kubenswrapper[4903]: E1202 22:57:41.551699 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.551725 4903 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.551746 4903 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.551836 4903 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.552396 4903 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.39:6443: connect: connection refused Dec 02 22:57:41 crc kubenswrapper[4903]: E1202 22:57:41.552523 4903 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.39:6443: connect: connection refused" logger="UnhandledError" Dec 02 22:57:41 crc 
kubenswrapper[4903]: E1202 22:57:41.549859 4903 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.39:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187d88170adbef4f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-02 22:57:41.546913615 +0000 UTC m=+0.255467938,LastTimestamp:2025-12-02 22:57:41.546913615 +0000 UTC m=+0.255467938,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 02 22:57:41 crc kubenswrapper[4903]: E1202 22:57:41.553139 4903 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" interval="200ms" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.555908 4903 server.go:460] "Adding debug handlers to kubelet server" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.556185 4903 factory.go:55] Registering systemd factory Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.556205 4903 factory.go:221] Registration of the systemd container factory successfully Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.557456 4903 factory.go:153] Registering CRI-O factory Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.557501 4903 factory.go:221] Registration of the crio container factory successfully Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.557606 4903 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.557640 4903 factory.go:103] Registering Raw factory Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.557709 4903 manager.go:1196] Started watching for new ooms in manager Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.558731 4903 manager.go:319] Starting recovery of all containers Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.568101 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.568166 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.568190 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.568211 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.568229 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.568250 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.568267 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.568285 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.568307 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.568342 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.568365 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.568390 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.568411 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.568437 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.568464 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.568490 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.568518 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.568593 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.568623 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.568647 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.568718 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.568745 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.568772 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.568798 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.568825 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.568845 4903 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.568869 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569003 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569041 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569062 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569081 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569098 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569119 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569136 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569155 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569171 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569189 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569208 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569225 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569242 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569260 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569277 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569295 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569313 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569334 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569355 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569372 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569390 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569411 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569460 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569479 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569497 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569524 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569545 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569565 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569586 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569612 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569635 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569685 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569703 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569720 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569751 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569769 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569786 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569805 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569822 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569839 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569859 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569877 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569895 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569911 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569929 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569948 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569965 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.569983 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570001 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570030 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570086 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570104 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570121 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570139 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570157 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570175 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570193 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570211 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570230 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570247 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570264 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570283 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570301 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570320 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570338 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570355 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570375 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570394 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570411 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570430 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570448 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570465 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570484 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570529 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570547 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570565 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570585 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570620 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570646 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570697 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570720 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570744 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570797 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570820 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570838 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570861 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570884 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570911 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570936 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570960 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570979 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.570997 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.571016 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.571033 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.571051 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.571071 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.571088 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.571105 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.571122 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.571143 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.571162 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.571180 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.571196 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.571216 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.571233 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.571251 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.571268 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.571286 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.571303 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.571322 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.571339 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.571360 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.571378 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.571397 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.571416 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.571433 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.571450 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.571469 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.571489 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.571507 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.571523 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.571541 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.571559 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.571578 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.571600 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.571624 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.571646 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.573133 4903 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.573779 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.573810 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.573823 4903 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.573833 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.573843 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.573857 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.573867 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.573877 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.573889 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.573901 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.573911 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.573923 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.573933 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.573945 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" 
volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.573954 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.573965 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.573974 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.574000 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.574010 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.574712 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.574793 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.574819 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.574840 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.574861 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.574881 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.574899 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.574917 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.574934 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.574951 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.574968 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.574991 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.575008 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.575026 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.575046 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.575063 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.575081 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.575100 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.575120 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.575140 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.575161 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.575180 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.575196 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.575214 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.575234 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.575252 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.575269 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.575287 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.575313 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.575336 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.575361 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.575383 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.575400 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.575417 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.575436 4903 reconstruct.go:97] "Volume reconstruction finished" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.575449 4903 reconciler.go:26] "Reconciler: start to sync state" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.587161 4903 manager.go:324] Recovery completed Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.602522 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.604200 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.604269 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.604287 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.605080 4903 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.605101 4903 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.605121 4903 state_mem.go:36] "Initialized new in-memory state store" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 
Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.611051 4903 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.611095 4903 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.611119 4903 kubelet.go:2335] "Starting kubelet main sync loop" Dec 02 22:57:41 crc kubenswrapper[4903]: E1202 22:57:41.611164 4903 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 02 22:57:41 crc kubenswrapper[4903]: W1202 22:57:41.612412 4903 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.39:6443: connect: connection refused Dec 02 22:57:41 crc kubenswrapper[4903]: E1202 22:57:41.612694 4903 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.39:6443: connect: connection refused" logger="UnhandledError" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.614466 4903 policy_none.go:49] "None policy: Start" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.616763 4903 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.616818 4903 state_mem.go:35] "Initializing new in-memory state store" Dec 02 22:57:41 crc kubenswrapper[4903]: E1202 22:57:41.652308 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.673900 4903 manager.go:334] "Starting Device Plugin manager" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.674022 4903 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.674051 4903 server.go:79] "Starting device plugin registration server" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.674796 4903 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.674834 4903 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.675050 4903 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.675203 4903 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.675213 4903 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 02 22:57:41 crc kubenswrapper[4903]: E1202 22:57:41.685359 4903 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.711472 4903 kubelet.go:2421] "SyncLoop ADD" source="file"
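pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"]

This first SyncLoop ADD comes from source="file": static pod manifests are read from disk, which is why the five control-plane pods listed above can be admitted while every connection to the API server they implement is still refused. A rough sketch of a file source, assuming the conventional manifest directory (the kubelet's real machinery watches the directory rather than polling it, and parses the manifests rather than listing filenames):

    // Sketch of a static pod "file" source: emit ADD events for manifests
    // found on disk, with no API server involved. The directory path is the
    // conventional one and is an assumption here.
    package main

    import (
        "fmt"
        "os"
        "path/filepath"
        "time"
    )

    func main() {
        manifestDir := "/etc/kubernetes/manifests"
        seen := map[string]bool{}
        for i := 0; i < 3; i++ { // a real source would loop for the process lifetime
            entries, err := os.ReadDir(manifestDir)
            if err != nil {
                fmt.Fprintln(os.Stderr, "read manifests:", err)
                return
            }
            var added []string
            for _, e := range entries {
                if e.IsDir() || seen[e.Name()] {
                    continue
                }
                seen[e.Name()] = true
                added = append(added, filepath.Join(manifestDir, e.Name()))
            }
            if len(added) > 0 {
                fmt.Printf("SyncLoop ADD source=file pods=%v\n", added)
            }
            time.Sleep(time.Second)
        }
    }

This is why the entries that follow show sandboxes being prepared for etcd-crc and kube-apiserver-crc while requests to api-int.crc.testing:6443 still fail with connection refused.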
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.711571 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.712478 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.712541 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.712560 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.712793 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.712945 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.713005 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.713807 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.713858 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.713877 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.713923 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.713949 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.713966 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.714040 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.714281 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.714350 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.715158 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.715187 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.715195 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.715293 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.715493 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.715539 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.715808 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.715856 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.715879 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.717343 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.717400 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.717418 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.717873 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.717897 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.717908 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.718030 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.718142 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.718170 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.719964 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.720001 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.720019 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.720185 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.720212 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.720230 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.720437 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.720474 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.721775 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.721812 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.721827 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:41 crc kubenswrapper[4903]: E1202 22:57:41.753730 4903 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" interval="400ms" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.775383 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.777010 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.777146 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.777262 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.777396 4903 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.777796 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.777855 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.777960 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.778040 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.778077 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.778112 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.778145 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.778178 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.778232 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.778276 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.778307 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.778361 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.778425 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.778516 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.778568 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: E1202 22:57:41.778737 4903 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.39:6443: connect: connection refused" node="crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.879972 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.880042 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.880086 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.880130 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.880178 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.880192 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.880216 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.880220 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.880301 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.880314 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.880220 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.880327 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.880415 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 
22:57:41.880480 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.880487 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.880533 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.880553 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.880599 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.880637 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.880702 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.880720 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.880734 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.880730 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: 
\"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.880781 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.880805 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.880841 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.880867 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.880915 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.880784 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.880869 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.979424 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.981512 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.981599 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.981629 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:41 crc kubenswrapper[4903]: I1202 22:57:41.981701 4903 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 22:57:41 crc kubenswrapper[4903]: E1202 22:57:41.982189 4903 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.39:6443: connect: connection refused" node="crc" Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.038568 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.057916 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.072139 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:57:42 crc kubenswrapper[4903]: W1202 22:57:42.074646 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-126946a0f207a8f0acb2f23bf854f34166ad0bd868160c7de6273ec893740238 WatchSource:0}: Error finding container 126946a0f207a8f0acb2f23bf854f34166ad0bd868160c7de6273ec893740238: Status 404 returned error can't find the container with id 126946a0f207a8f0acb2f23bf854f34166ad0bd868160c7de6273ec893740238 Dec 02 22:57:42 crc kubenswrapper[4903]: W1202 22:57:42.088038 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-f747bd1e60f16d151ed884a0f492afd895768a7761eb955440dfbb1d310d8094 WatchSource:0}: Error finding container f747bd1e60f16d151ed884a0f492afd895768a7761eb955440dfbb1d310d8094: Status 404 returned error can't find the container with id f747bd1e60f16d151ed884a0f492afd895768a7761eb955440dfbb1d310d8094 Dec 02 22:57:42 crc kubenswrapper[4903]: W1202 22:57:42.092300 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-5c143c6373d613f127c8200e469646f1824e8b07d75f41d850610bc2a5845c5b WatchSource:0}: Error finding container 5c143c6373d613f127c8200e469646f1824e8b07d75f41d850610bc2a5845c5b: Status 404 returned error can't find the container with id 5c143c6373d613f127c8200e469646f1824e8b07d75f41d850610bc2a5845c5b Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.110996 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.123637 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 22:57:42 crc kubenswrapper[4903]: W1202 22:57:42.141738 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-a82d8e19a7b60ed6aa032ac8a4c129218beb6b4ec5595e2c10a676f76acd9098 WatchSource:0}: Error finding container a82d8e19a7b60ed6aa032ac8a4c129218beb6b4ec5595e2c10a676f76acd9098: Status 404 returned error can't find the container with id a82d8e19a7b60ed6aa032ac8a4c129218beb6b4ec5595e2c10a676f76acd9098 Dec 02 22:57:42 crc kubenswrapper[4903]: W1202 22:57:42.151903 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-75f83d009e2e6e0ba1d40c7b3fbbc9c080cc335a64c3162b1f93b7fb4a15a852 WatchSource:0}: Error finding container 75f83d009e2e6e0ba1d40c7b3fbbc9c080cc335a64c3162b1f93b7fb4a15a852: Status 404 returned error can't find the container with id 75f83d009e2e6e0ba1d40c7b3fbbc9c080cc335a64c3162b1f93b7fb4a15a852 Dec 02 22:57:42 crc kubenswrapper[4903]: E1202 22:57:42.154590 4903 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" interval="800ms" Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.382470 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.384600 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.384681 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.384697 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.384731 4903 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 22:57:42 crc kubenswrapper[4903]: E1202 22:57:42.385245 4903 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.39:6443: connect: connection refused" node="crc" Dec 02 22:57:42 crc kubenswrapper[4903]: W1202 22:57:42.419182 4903 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.39:6443: connect: connection refused Dec 02 22:57:42 crc kubenswrapper[4903]: E1202 22:57:42.419279 4903 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.39:6443: connect: connection refused" logger="UnhandledError" Dec 02 22:57:42 crc kubenswrapper[4903]: W1202 22:57:42.486772 4903 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.39:6443: connect: connection refused Dec 02 22:57:42 crc kubenswrapper[4903]: E1202 22:57:42.487251 4903 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.39:6443: connect: connection refused" logger="UnhandledError" Dec 02 22:57:42 crc kubenswrapper[4903]: W1202 22:57:42.542412 4903 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.39:6443: connect: connection refused Dec 02 22:57:42 crc kubenswrapper[4903]: E1202 22:57:42.542516 4903 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.39:6443: connect: connection refused" logger="UnhandledError" Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.548112 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.39:6443: connect: connection refused Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.551162 4903 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 08:35:29.188613542 +0000 UTC Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.618511 4903 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="3c21835717fd04a4f8fbd1fce3b004c111edc3d93408e2921440877b3bd652fc" exitCode=0 Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.618606 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"3c21835717fd04a4f8fbd1fce3b004c111edc3d93408e2921440877b3bd652fc"} Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.618835 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"75f83d009e2e6e0ba1d40c7b3fbbc9c080cc335a64c3162b1f93b7fb4a15a852"} Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.618969 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.620556 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.620598 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.620610 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.623407 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c"} Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.623450 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a82d8e19a7b60ed6aa032ac8a4c129218beb6b4ec5595e2c10a676f76acd9098"} Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.625538 4903 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd" exitCode=0 Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.625579 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd"} Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.625623 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5c143c6373d613f127c8200e469646f1824e8b07d75f41d850610bc2a5845c5b"} Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.625787 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.626735 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.626760 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.626772 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.627147 4903 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="27f98db1683931c1d65960a08ef2b7fe0d6d97d362f5435dbc20346c9ee79abb" exitCode=0 Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.627225 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"27f98db1683931c1d65960a08ef2b7fe0d6d97d362f5435dbc20346c9ee79abb"} Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.627265 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f747bd1e60f16d151ed884a0f492afd895768a7761eb955440dfbb1d310d8094"} Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.627384 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.628621 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.628637 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.628702 4903 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.628721 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.629497 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.629534 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.629552 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.630157 4903 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="a76a804238f87925935809a984824635f5b4e6dba3af1297e3a80f9b0cbf5039" exitCode=0 Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.630180 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"a76a804238f87925935809a984824635f5b4e6dba3af1297e3a80f9b0cbf5039"} Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.630199 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"126946a0f207a8f0acb2f23bf854f34166ad0bd868160c7de6273ec893740238"} Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.630249 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.631097 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.631129 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:42 crc kubenswrapper[4903]: I1202 22:57:42.631148 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:42 crc kubenswrapper[4903]: E1202 22:57:42.956800 4903 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" interval="1.6s" Dec 02 22:57:43 crc kubenswrapper[4903]: W1202 22:57:43.103414 4903 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.39:6443: connect: connection refused Dec 02 22:57:43 crc kubenswrapper[4903]: E1202 22:57:43.103484 4903 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.39:6443: connect: connection refused" logger="UnhandledError" Dec 02 22:57:43 crc kubenswrapper[4903]: I1202 22:57:43.185374 4903 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Dec 02 22:57:43 crc kubenswrapper[4903]: I1202 22:57:43.186858 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:43 crc kubenswrapper[4903]: I1202 22:57:43.186908 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:43 crc kubenswrapper[4903]: I1202 22:57:43.186920 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:43 crc kubenswrapper[4903]: I1202 22:57:43.186951 4903 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 22:57:43 crc kubenswrapper[4903]: E1202 22:57:43.187944 4903 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.39:6443: connect: connection refused" node="crc" Dec 02 22:57:43 crc kubenswrapper[4903]: I1202 22:57:43.552995 4903 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 16:03:59.035018643 +0000 UTC Dec 02 22:57:43 crc kubenswrapper[4903]: I1202 22:57:43.553051 4903 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 377h6m15.481970319s for next certificate rotation Dec 02 22:57:43 crc kubenswrapper[4903]: I1202 22:57:43.635140 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"9b127904027e78efb8b5d36a8e045431aa85698b906f13c0445a7897ec658247"} Dec 02 22:57:43 crc kubenswrapper[4903]: I1202 22:57:43.635309 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:43 crc kubenswrapper[4903]: I1202 22:57:43.636738 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:43 crc kubenswrapper[4903]: I1202 22:57:43.636771 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:43 crc kubenswrapper[4903]: I1202 22:57:43.636779 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:43 crc kubenswrapper[4903]: I1202 22:57:43.639819 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"628d292a1edf1c1db86af8057d8f7c50cbd2c45477075ef846a24d7cfd444bc7"} Dec 02 22:57:43 crc kubenswrapper[4903]: I1202 22:57:43.639845 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6e8053e7bef3d1577c025c515ad48920fcb7077167f8fc98b147d6499e38a23d"} Dec 02 22:57:43 crc kubenswrapper[4903]: I1202 22:57:43.639856 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"075bd1184d0ced61c9610e5961f455627fc04f846239edd3b10032dd44fab539"} Dec 02 22:57:43 crc kubenswrapper[4903]: I1202 22:57:43.639940 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Dec 02 22:57:43 crc kubenswrapper[4903]: I1202 22:57:43.640764 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:43 crc kubenswrapper[4903]: I1202 22:57:43.640798 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:43 crc kubenswrapper[4903]: I1202 22:57:43.640811 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:43 crc kubenswrapper[4903]: I1202 22:57:43.643093 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"716e5b24fb81d3b6e884f43a9299fbe4397d133daa4727c659c633416c14bfbf"} Dec 02 22:57:43 crc kubenswrapper[4903]: I1202 22:57:43.643125 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3c7dac854f7481fe71dcc8976f658e52ad759ca0faf9205d50b1bec7a28575c6"} Dec 02 22:57:43 crc kubenswrapper[4903]: I1202 22:57:43.643139 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"242c60254f2b4a3c9e849384d813e56c75cf2f60705e0b1e7ca2707663b3f029"} Dec 02 22:57:43 crc kubenswrapper[4903]: I1202 22:57:43.643219 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:43 crc kubenswrapper[4903]: I1202 22:57:43.644364 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:43 crc kubenswrapper[4903]: I1202 22:57:43.644386 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:43 crc kubenswrapper[4903]: I1202 22:57:43.644397 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:43 crc kubenswrapper[4903]: I1202 22:57:43.653788 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35"} Dec 02 22:57:43 crc kubenswrapper[4903]: I1202 22:57:43.653866 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c"} Dec 02 22:57:43 crc kubenswrapper[4903]: I1202 22:57:43.653888 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e"} Dec 02 22:57:43 crc kubenswrapper[4903]: I1202 22:57:43.653908 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692"} Dec 02 22:57:43 crc kubenswrapper[4903]: I1202 22:57:43.657258 4903 
generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="029a3ad2cdb54925355602dbd246a088a9d5bb0b20bc6da2945cfc6843711803" exitCode=0 Dec 02 22:57:43 crc kubenswrapper[4903]: I1202 22:57:43.657306 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"029a3ad2cdb54925355602dbd246a088a9d5bb0b20bc6da2945cfc6843711803"} Dec 02 22:57:43 crc kubenswrapper[4903]: I1202 22:57:43.657454 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:43 crc kubenswrapper[4903]: I1202 22:57:43.658735 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:43 crc kubenswrapper[4903]: I1202 22:57:43.658779 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:43 crc kubenswrapper[4903]: I1202 22:57:43.658797 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:43 crc kubenswrapper[4903]: I1202 22:57:43.666415 4903 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 02 22:57:44 crc kubenswrapper[4903]: I1202 22:57:44.247019 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 22:57:44 crc kubenswrapper[4903]: I1202 22:57:44.256567 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 22:57:44 crc kubenswrapper[4903]: I1202 22:57:44.302323 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 22:57:44 crc kubenswrapper[4903]: I1202 22:57:44.663532 4903 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="07af7bc135d35fbb1e3f2e2be87aa112ceaadb2cb846b8af1583d8aabca3d83f" exitCode=0 Dec 02 22:57:44 crc kubenswrapper[4903]: I1202 22:57:44.663597 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"07af7bc135d35fbb1e3f2e2be87aa112ceaadb2cb846b8af1583d8aabca3d83f"} Dec 02 22:57:44 crc kubenswrapper[4903]: I1202 22:57:44.663812 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:44 crc kubenswrapper[4903]: I1202 22:57:44.665073 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:44 crc kubenswrapper[4903]: I1202 22:57:44.665118 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:44 crc kubenswrapper[4903]: I1202 22:57:44.665135 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:44 crc kubenswrapper[4903]: I1202 22:57:44.669121 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168"} Dec 02 22:57:44 crc kubenswrapper[4903]: I1202 22:57:44.669186 4903 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:44 crc kubenswrapper[4903]: I1202 22:57:44.669311 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:44 crc kubenswrapper[4903]: I1202 22:57:44.670471 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:44 crc kubenswrapper[4903]: I1202 22:57:44.670540 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:44 crc kubenswrapper[4903]: I1202 22:57:44.670562 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:44 crc kubenswrapper[4903]: I1202 22:57:44.670713 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:44 crc kubenswrapper[4903]: I1202 22:57:44.670808 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:44 crc kubenswrapper[4903]: I1202 22:57:44.670837 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:44 crc kubenswrapper[4903]: I1202 22:57:44.788682 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:44 crc kubenswrapper[4903]: I1202 22:57:44.790265 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:44 crc kubenswrapper[4903]: I1202 22:57:44.790324 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:44 crc kubenswrapper[4903]: I1202 22:57:44.790342 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:44 crc kubenswrapper[4903]: I1202 22:57:44.790377 4903 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 22:57:45 crc kubenswrapper[4903]: I1202 22:57:45.677232 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2e815e640c6ff8db33804d3de5d4825fde92c1200a31db299b7ab968b6a33edd"} Dec 02 22:57:45 crc kubenswrapper[4903]: I1202 22:57:45.677314 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:45 crc kubenswrapper[4903]: I1202 22:57:45.677369 4903 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 22:57:45 crc kubenswrapper[4903]: I1202 22:57:45.677316 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"bf7df453a9e604c1c1b6288f69033a838f9072c366f10b88faae5d32d1d8dcec"} Dec 02 22:57:45 crc kubenswrapper[4903]: I1202 22:57:45.677443 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:45 crc kubenswrapper[4903]: I1202 22:57:45.677467 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f9ec34d2590c36fdbe9a7a39e9149dd5f87250ac93b758d2fb0815828a44f949"} Dec 02 22:57:45 crc kubenswrapper[4903]: I1202 22:57:45.678867 4903 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:45 crc kubenswrapper[4903]: I1202 22:57:45.678914 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:45 crc kubenswrapper[4903]: I1202 22:57:45.678932 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:45 crc kubenswrapper[4903]: I1202 22:57:45.678912 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:45 crc kubenswrapper[4903]: I1202 22:57:45.679092 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:45 crc kubenswrapper[4903]: I1202 22:57:45.679117 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:45 crc kubenswrapper[4903]: I1202 22:57:45.925233 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:57:46 crc kubenswrapper[4903]: I1202 22:57:46.686951 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c7946bb5bf584893edf9a166c06fdc937e203f300b4d638b30ffcfa39b98b58c"} Dec 02 22:57:46 crc kubenswrapper[4903]: I1202 22:57:46.687018 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"767b5b251b8c5f68bf74aa9865b664293850c5f2047cba9c04c3b19f45c93265"} Dec 02 22:57:46 crc kubenswrapper[4903]: I1202 22:57:46.687061 4903 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 22:57:46 crc kubenswrapper[4903]: I1202 22:57:46.687135 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:46 crc kubenswrapper[4903]: I1202 22:57:46.688070 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:46 crc kubenswrapper[4903]: I1202 22:57:46.688555 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:46 crc kubenswrapper[4903]: I1202 22:57:46.688623 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:46 crc kubenswrapper[4903]: I1202 22:57:46.688642 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:46 crc kubenswrapper[4903]: I1202 22:57:46.689182 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:46 crc kubenswrapper[4903]: I1202 22:57:46.689222 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:46 crc kubenswrapper[4903]: I1202 22:57:46.689239 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:47 crc kubenswrapper[4903]: I1202 22:57:47.508265 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:57:47 crc kubenswrapper[4903]: I1202 22:57:47.690169 4903 prober_manager.go:312] "Failed to trigger a manual 
run" probe="Readiness" Dec 02 22:57:47 crc kubenswrapper[4903]: I1202 22:57:47.690245 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:47 crc kubenswrapper[4903]: I1202 22:57:47.690269 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:47 crc kubenswrapper[4903]: I1202 22:57:47.691961 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:47 crc kubenswrapper[4903]: I1202 22:57:47.692148 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:47 crc kubenswrapper[4903]: I1202 22:57:47.692265 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:47 crc kubenswrapper[4903]: I1202 22:57:47.692042 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:47 crc kubenswrapper[4903]: I1202 22:57:47.692479 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:47 crc kubenswrapper[4903]: I1202 22:57:47.692499 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:48 crc kubenswrapper[4903]: I1202 22:57:48.380365 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 02 22:57:48 crc kubenswrapper[4903]: I1202 22:57:48.650856 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 22:57:48 crc kubenswrapper[4903]: I1202 22:57:48.651549 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:48 crc kubenswrapper[4903]: I1202 22:57:48.653185 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:48 crc kubenswrapper[4903]: I1202 22:57:48.653245 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:48 crc kubenswrapper[4903]: I1202 22:57:48.653264 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:48 crc kubenswrapper[4903]: I1202 22:57:48.693105 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:48 crc kubenswrapper[4903]: I1202 22:57:48.694352 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:48 crc kubenswrapper[4903]: I1202 22:57:48.694394 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:48 crc kubenswrapper[4903]: I1202 22:57:48.694466 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:49 crc kubenswrapper[4903]: I1202 22:57:49.671566 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:57:49 crc kubenswrapper[4903]: I1202 22:57:49.671835 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:49 crc kubenswrapper[4903]: I1202 22:57:49.673216 4903 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:49 crc kubenswrapper[4903]: I1202 22:57:49.673370 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:49 crc kubenswrapper[4903]: I1202 22:57:49.673495 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:51 crc kubenswrapper[4903]: E1202 22:57:51.688029 4903 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 02 22:57:52 crc kubenswrapper[4903]: I1202 22:57:52.541756 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 22:57:52 crc kubenswrapper[4903]: I1202 22:57:52.541997 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:52 crc kubenswrapper[4903]: I1202 22:57:52.543452 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:52 crc kubenswrapper[4903]: I1202 22:57:52.543479 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:52 crc kubenswrapper[4903]: I1202 22:57:52.543487 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:52 crc kubenswrapper[4903]: I1202 22:57:52.547344 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 22:57:52 crc kubenswrapper[4903]: I1202 22:57:52.704285 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:52 crc kubenswrapper[4903]: I1202 22:57:52.705717 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:52 crc kubenswrapper[4903]: I1202 22:57:52.705803 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:52 crc kubenswrapper[4903]: I1202 22:57:52.705827 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:53 crc kubenswrapper[4903]: I1202 22:57:53.160421 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 22:57:53 crc kubenswrapper[4903]: I1202 22:57:53.548595 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 02 22:57:53 crc kubenswrapper[4903]: E1202 22:57:53.668782 4903 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 02 22:57:53 crc kubenswrapper[4903]: I1202 22:57:53.709114 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:53 crc kubenswrapper[4903]: I1202 22:57:53.711768 4903 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:53 crc kubenswrapper[4903]: I1202 22:57:53.711836 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:53 crc kubenswrapper[4903]: I1202 22:57:53.711855 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:54 crc kubenswrapper[4903]: W1202 22:57:54.156629 4903 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 02 22:57:54 crc kubenswrapper[4903]: I1202 22:57:54.156756 4903 trace.go:236] Trace[1878636373]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Dec-2025 22:57:44.154) (total time: 10002ms): Dec 02 22:57:54 crc kubenswrapper[4903]: Trace[1878636373]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (22:57:54.156) Dec 02 22:57:54 crc kubenswrapper[4903]: Trace[1878636373]: [10.002014358s] [10.002014358s] END Dec 02 22:57:54 crc kubenswrapper[4903]: E1202 22:57:54.156782 4903 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 02 22:57:54 crc kubenswrapper[4903]: E1202 22:57:54.557981 4903 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Dec 02 22:57:54 crc kubenswrapper[4903]: E1202 22:57:54.792229 4903 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Dec 02 22:57:54 crc kubenswrapper[4903]: I1202 22:57:54.868355 4903 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 02 22:57:54 crc kubenswrapper[4903]: I1202 22:57:54.868413 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 02 22:57:54 crc kubenswrapper[4903]: I1202 22:57:54.873412 4903 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 02 22:57:54 crc 
kubenswrapper[4903]: I1202 22:57:54.873460 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 02 22:57:55 crc kubenswrapper[4903]: I1202 22:57:55.934320 4903 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 02 22:57:55 crc kubenswrapper[4903]: [+]log ok Dec 02 22:57:55 crc kubenswrapper[4903]: [+]etcd ok Dec 02 22:57:55 crc kubenswrapper[4903]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 02 22:57:55 crc kubenswrapper[4903]: [+]poststarthook/openshift.io-api-request-count-filter ok Dec 02 22:57:55 crc kubenswrapper[4903]: [+]poststarthook/openshift.io-startkubeinformers ok Dec 02 22:57:55 crc kubenswrapper[4903]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Dec 02 22:57:55 crc kubenswrapper[4903]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Dec 02 22:57:55 crc kubenswrapper[4903]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 02 22:57:55 crc kubenswrapper[4903]: [+]poststarthook/generic-apiserver-start-informers ok Dec 02 22:57:55 crc kubenswrapper[4903]: [+]poststarthook/priority-and-fairness-config-consumer ok Dec 02 22:57:55 crc kubenswrapper[4903]: [+]poststarthook/priority-and-fairness-filter ok Dec 02 22:57:55 crc kubenswrapper[4903]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 02 22:57:55 crc kubenswrapper[4903]: [+]poststarthook/start-apiextensions-informers ok Dec 02 22:57:55 crc kubenswrapper[4903]: [+]poststarthook/start-apiextensions-controllers ok Dec 02 22:57:55 crc kubenswrapper[4903]: [+]poststarthook/crd-informer-synced ok Dec 02 22:57:55 crc kubenswrapper[4903]: [+]poststarthook/start-system-namespaces-controller ok Dec 02 22:57:55 crc kubenswrapper[4903]: [+]poststarthook/start-cluster-authentication-info-controller ok Dec 02 22:57:55 crc kubenswrapper[4903]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Dec 02 22:57:55 crc kubenswrapper[4903]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Dec 02 22:57:55 crc kubenswrapper[4903]: [+]poststarthook/start-legacy-token-tracking-controller ok Dec 02 22:57:55 crc kubenswrapper[4903]: [+]poststarthook/start-service-ip-repair-controllers ok Dec 02 22:57:55 crc kubenswrapper[4903]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Dec 02 22:57:55 crc kubenswrapper[4903]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Dec 02 22:57:55 crc kubenswrapper[4903]: [+]poststarthook/priority-and-fairness-config-producer ok Dec 02 22:57:55 crc kubenswrapper[4903]: [+]poststarthook/bootstrap-controller ok Dec 02 22:57:55 crc kubenswrapper[4903]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Dec 02 22:57:55 crc kubenswrapper[4903]: [+]poststarthook/start-kube-aggregator-informers ok Dec 02 22:57:55 crc kubenswrapper[4903]: [+]poststarthook/apiservice-status-local-available-controller ok Dec 02 22:57:55 crc kubenswrapper[4903]: [+]poststarthook/apiservice-status-remote-available-controller ok Dec 02 22:57:55 crc kubenswrapper[4903]: [+]poststarthook/apiservice-registration-controller ok Dec 02 22:57:55 crc kubenswrapper[4903]: [+]poststarthook/apiservice-wait-for-first-sync ok Dec 02 22:57:55 crc 
kubenswrapper[4903]: [+]poststarthook/apiservice-discovery-controller ok Dec 02 22:57:55 crc kubenswrapper[4903]: [+]poststarthook/kube-apiserver-autoregistration ok Dec 02 22:57:55 crc kubenswrapper[4903]: [+]autoregister-completion ok Dec 02 22:57:55 crc kubenswrapper[4903]: [+]poststarthook/apiservice-openapi-controller ok Dec 02 22:57:55 crc kubenswrapper[4903]: [+]poststarthook/apiservice-openapiv3-controller ok Dec 02 22:57:55 crc kubenswrapper[4903]: livez check failed Dec 02 22:57:55 crc kubenswrapper[4903]: I1202 22:57:55.934378 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 22:57:56 crc kubenswrapper[4903]: I1202 22:57:56.160554 4903 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 02 22:57:56 crc kubenswrapper[4903]: I1202 22:57:56.160620 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 02 22:57:56 crc kubenswrapper[4903]: I1202 22:57:56.296172 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 02 22:57:56 crc kubenswrapper[4903]: I1202 22:57:56.296477 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:56 crc kubenswrapper[4903]: I1202 22:57:56.298115 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:56 crc kubenswrapper[4903]: I1202 22:57:56.298206 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:56 crc kubenswrapper[4903]: I1202 22:57:56.298226 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:56 crc kubenswrapper[4903]: I1202 22:57:56.330574 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 02 22:57:56 crc kubenswrapper[4903]: I1202 22:57:56.718105 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:56 crc kubenswrapper[4903]: I1202 22:57:56.719599 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:56 crc kubenswrapper[4903]: I1202 22:57:56.719696 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:56 crc kubenswrapper[4903]: I1202 22:57:56.719718 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:56 crc kubenswrapper[4903]: I1202 22:57:56.741335 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 02 22:57:57 crc 
kubenswrapper[4903]: I1202 22:57:57.720730 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:57 crc kubenswrapper[4903]: I1202 22:57:57.721632 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:57 crc kubenswrapper[4903]: I1202 22:57:57.721748 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:57 crc kubenswrapper[4903]: I1202 22:57:57.721770 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:57 crc kubenswrapper[4903]: I1202 22:57:57.895742 4903 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 02 22:57:57 crc kubenswrapper[4903]: I1202 22:57:57.915863 4903 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Dec 02 22:57:57 crc kubenswrapper[4903]: I1202 22:57:57.993179 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:57:57 crc kubenswrapper[4903]: I1202 22:57:57.995020 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:57:57 crc kubenswrapper[4903]: I1202 22:57:57.995074 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:57:57 crc kubenswrapper[4903]: I1202 22:57:57.995092 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:57:57 crc kubenswrapper[4903]: I1202 22:57:57.995154 4903 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 22:57:58 crc kubenswrapper[4903]: E1202 22:57:58.000917 4903 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 02 22:57:58 crc kubenswrapper[4903]: I1202 22:57:58.452282 4903 csr.go:261] certificate signing request csr-whwc2 is approved, waiting to be issued Dec 02 22:57:58 crc kubenswrapper[4903]: I1202 22:57:58.458293 4903 csr.go:257] certificate signing request csr-whwc2 is issued Dec 02 22:57:59 crc kubenswrapper[4903]: I1202 22:57:59.459867 4903 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-12-02 22:52:58 +0000 UTC, rotation deadline is 2026-08-27 05:21:11.033452417 +0000 UTC Dec 02 22:57:59 crc kubenswrapper[4903]: I1202 22:57:59.459934 4903 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6414h23m11.573523902s for next certificate rotation Dec 02 22:57:59 crc kubenswrapper[4903]: I1202 22:57:59.857225 4903 trace.go:236] Trace[855931200]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Dec-2025 22:57:44.909) (total time: 14947ms): Dec 02 22:57:59 crc kubenswrapper[4903]: Trace[855931200]: ---"Objects listed" error: 14947ms (22:57:59.857) Dec 02 22:57:59 crc kubenswrapper[4903]: Trace[855931200]: [14.947431161s] [14.947431161s] END Dec 02 22:57:59 crc kubenswrapper[4903]: I1202 22:57:59.857270 4903 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 02 22:57:59 crc kubenswrapper[4903]: I1202 22:57:59.858069 4903 trace.go:236] Trace[1195754836]: "Reflector ListAndWatch" 
name:k8s.io/client-go/informers/factory.go:160 (02-Dec-2025 22:57:45.256) (total time: 14601ms): Dec 02 22:57:59 crc kubenswrapper[4903]: Trace[1195754836]: ---"Objects listed" error: 14601ms (22:57:59.857) Dec 02 22:57:59 crc kubenswrapper[4903]: Trace[1195754836]: [14.601863063s] [14.601863063s] END Dec 02 22:57:59 crc kubenswrapper[4903]: I1202 22:57:59.858113 4903 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 02 22:57:59 crc kubenswrapper[4903]: I1202 22:57:59.859991 4903 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 02 22:57:59 crc kubenswrapper[4903]: I1202 22:57:59.861153 4903 trace.go:236] Trace[946475643]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Dec-2025 22:57:45.844) (total time: 14016ms): Dec 02 22:57:59 crc kubenswrapper[4903]: Trace[946475643]: ---"Objects listed" error: 14016ms (22:57:59.861) Dec 02 22:57:59 crc kubenswrapper[4903]: Trace[946475643]: [14.016513651s] [14.016513651s] END Dec 02 22:57:59 crc kubenswrapper[4903]: I1202 22:57:59.861176 4903 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 02 22:57:59 crc kubenswrapper[4903]: I1202 22:57:59.914060 4903 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:51662->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 02 22:57:59 crc kubenswrapper[4903]: I1202 22:57:59.914121 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:51662->192.168.126.11:17697: read: connection reset by peer" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.427250 4903 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.548629 4903 apiserver.go:52] "Watching apiserver" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.554955 4903 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.555436 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-zf29d","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.555961 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.556067 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.556177 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.556238 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:58:00 crc kubenswrapper[4903]: E1202 22:58:00.556477 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:58:00 crc kubenswrapper[4903]: E1202 22:58:00.556604 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.557645 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.557873 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zf29d" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.558065 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:00 crc kubenswrapper[4903]: E1202 22:58:00.558197 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.564121 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.565026 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.565569 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.565815 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.565982 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.566197 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.566424 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.566677 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.566830 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.567823 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.567951 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.568572 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.587960 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.611197 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.627391 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.653071 4903 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.654216 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.665181 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
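Each of the status_manager failures above is the same bootstrap deadlock: the API server cannot admit the kubelet's status PATCH because the pod.network-node-identity.openshift.io webhook at 127.0.0.1:9743 is not serving yet, and the pod that serves it (network-node-identity-vrzqb) is itself one of the pods still being recreated. The loop resolves on its own once that pod starts. A small sketch of a reachability poll against the endpoint seen in the log, for watching the webhook come up:

package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	// Poll the webhook endpoint from the log until it accepts TCP
	// connections; until then every pod-status PATCH is rejected
	// with "connection refused".
	for {
		conn, err := net.DialTimeout("tcp", "127.0.0.1:9743", 2*time.Second)
		if err == nil {
			conn.Close()
			fmt.Println("webhook endpoint is up")
			return
		}
		fmt.Println("still down:", err)
		time.Sleep(5 * time.Second)
	}
}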
\"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668257 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668297 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668327 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668356 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668371 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668385 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668400 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668416 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668441 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668462 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668479 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668493 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668525 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668540 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668556 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668572 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668602 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668619 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668636 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: 
\"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668685 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668708 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668724 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668741 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668757 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668773 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668788 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668825 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668843 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668858 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668879 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668897 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668931 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668946 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668975 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668994 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.669009 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.669039 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.669054 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.669069 4903 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.669085 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.669101 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.669116 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.669133 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.669147 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.669162 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.669177 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.669191 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.669206 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 22:58:00 crc 
Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668458 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.669233 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668540 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668606 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668618 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668751 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668755 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668773 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668796 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668919 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668922 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.668947 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.669020 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.669061 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.669208 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.669386 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.669426 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.669487 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.669584 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.669627 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.669688 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.669737 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.669755 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.669847 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.669900 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.670031 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.670148 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.670195 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.670214 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.670294 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.670331 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.670320 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.670361 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.670444 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.670462 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.670505 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.670583 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.669221 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.670623 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.670643 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.670673 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.670672 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.670667 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.670706 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.670733 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.670730 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.670791 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.670780 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.670808 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.670831 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.670847 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.670857 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.670866 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.670886 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.670906 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.670922 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.670940 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.670956 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.670971 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.670986 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.670986 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671001 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671017 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671052 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671073 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671089 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671103 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671101 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671107 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671119 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671177 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671202 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671214 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671223 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671243 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671265 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671281 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671308 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671309 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" 
(OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671326 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671345 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671384 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671402 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671419 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671435 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671458 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671481 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671500 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671520 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671534 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671550 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671564 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671579 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671594 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671612 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671633 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671681 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671704 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671720 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" 
(UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671739 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671753 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671769 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671785 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671806 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671822 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671838 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671854 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671868 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671884 4903 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671900 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671917 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671932 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671949 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671965 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671983 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671999 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672014 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672029 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672045 4903 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672066 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672083 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672100 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672119 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672148 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672169 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672193 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672216 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672235 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672258 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672282 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672311 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672333 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672352 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672369 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672386 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672404 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672423 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672441 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672457 4903 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672475 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672494 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672513 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672531 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672549 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672564 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672579 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673038 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673063 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 22:58:00 crc kubenswrapper[4903]: 
I1202 22:58:00.673081 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673098 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673113 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673130 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673145 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673163 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673187 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673204 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673222 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673240 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 02 22:58:00 crc 
kubenswrapper[4903]: I1202 22:58:00.673256 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673273 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673308 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673326 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673343 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673365 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673386 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673407 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673430 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673446 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673462 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673477 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673493 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673508 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673524 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673540 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673556 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673574 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673612 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673635 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: 
\"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673672 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673694 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673715 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673735 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673750 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673765 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673793 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673811 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673827 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673846 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673862 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673878 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673894 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673912 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673930 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674013 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674038 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674060 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674080 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674098 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674116 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674134 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8f517677-195d-4d43-ba46-e0f0aede7011-hosts-file\") pod \"node-resolver-zf29d\" (UID: \"8f517677-195d-4d43-ba46-e0f0aede7011\") " pod="openshift-dns/node-resolver-zf29d" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674152 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674178 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674196 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674213 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674231 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674248 4903 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dt6q\" (UniqueName: \"kubernetes.io/projected/8f517677-195d-4d43-ba46-e0f0aede7011-kube-api-access-8dt6q\") pod \"node-resolver-zf29d\" (UID: \"8f517677-195d-4d43-ba46-e0f0aede7011\") " pod="openshift-dns/node-resolver-zf29d" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674268 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674287 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674306 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674361 4903 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674372 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674382 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674391 4903 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674403 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674413 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674423 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node 
\"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674433 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674444 4903 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674453 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674462 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674472 4903 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674481 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674491 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674501 4903 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674511 4903 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674521 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674530 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674539 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674552 4903 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674613 4903 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674624 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674720 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674730 4903 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674739 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674749 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674763 4903 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674771 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674781 4903 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674790 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674800 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674809 4903 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674819 4903 reconciler_common.go:293] "Volume detached for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674828 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674837 4903 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674846 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674856 4903 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674866 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674875 4903 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674884 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674894 4903 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674904 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674913 4903 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674921 4903 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674930 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 
22:58:00.674942 4903 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674953 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674966 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674978 4903 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.674989 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.675002 4903 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.675013 4903 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.675025 4903 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671416 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.682388 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671572 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671592 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671625 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671775 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671788 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671922 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.671965 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672007 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672052 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672163 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672226 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672303 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672332 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672394 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672482 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672599 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.672736 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.673184 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.675850 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.675875 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.675974 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.676367 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.676529 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.677140 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.677762 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.677910 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.677986 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.678195 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.679256 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.679308 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.679559 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.679920 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). 
InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.680061 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.680708 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.680811 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.681176 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.681186 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.681232 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.681497 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.681508 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.681543 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.681573 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.681833 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.682162 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.682320 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.682342 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.682473 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.682553 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.682606 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.682760 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.682838 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.683093 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.683107 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.683185 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.683449 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.683453 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.683458 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.683529 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.683579 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.683885 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.683903 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). 
InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.683913 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.684204 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.684723 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.684938 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.685011 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.685029 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.685067 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.685302 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.685360 4903 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.685429 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.685638 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.685700 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.685891 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.686226 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.686237 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.686309 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.686332 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.686527 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.686567 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.686646 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: E1202 22:58:00.686727 4903 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 22:58:00 crc kubenswrapper[4903]: E1202 22:58:00.686808 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:01.186787247 +0000 UTC m=+19.895341530 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.686810 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.687180 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.687240 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.687260 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.687398 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: E1202 22:58:00.687407 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:58:01.187388552 +0000 UTC m=+19.895942845 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.687508 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.687545 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.687741 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.687949 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.688087 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.688128 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.688299 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.688364 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.688373 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.688602 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: E1202 22:58:00.688685 4903 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.689010 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.689015 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: E1202 22:58:00.689022 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 22:58:00 crc kubenswrapper[4903]: E1202 22:58:00.689053 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 22:58:00 crc kubenswrapper[4903]: E1202 22:58:00.689065 4903 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.689064 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.689340 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.689981 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.690443 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.690753 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.693864 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.697484 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.700371 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.700486 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: E1202 22:58:00.700557 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:01.200537984 +0000 UTC m=+19.909092267 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.700710 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.701063 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.701073 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.701181 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.701832 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.702219 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.702342 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: E1202 22:58:00.702372 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:01.202317617 +0000 UTC m=+19.910871900 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:58:00 crc kubenswrapper[4903]: E1202 22:58:00.702413 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 22:58:00 crc kubenswrapper[4903]: E1202 22:58:00.702504 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 22:58:00 crc kubenswrapper[4903]: E1202 22:58:00.702529 4903 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:58:00 crc kubenswrapper[4903]: E1202 22:58:00.702610 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:01.202583204 +0000 UTC m=+19.911137487 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.702851 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.703057 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.703724 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.703827 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.704321 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.704676 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.704727 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.704853 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.705024 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.705027 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.705126 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.705769 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.705920 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.706365 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.714345 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.714939 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.716280 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.716469 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.716956 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.717266 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.717354 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.717405 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.717519 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.717416 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.717919 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.717971 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.718426 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.718770 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.718849 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.720016 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.720227 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.720498 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.720584 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.721240 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.721437 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.730821 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.733034 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.733490 4903 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168" exitCode=255 Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.733546 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168"} Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.746089 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.746609 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.746876 4903 scope.go:117] "RemoveContainer" containerID="8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.752708 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.758779 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.767499 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.778311 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.778731 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.778767 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8f517677-195d-4d43-ba46-e0f0aede7011-hosts-file\") pod \"node-resolver-zf29d\" (UID: \"8f517677-195d-4d43-ba46-e0f0aede7011\") " pod="openshift-dns/node-resolver-zf29d" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.778786 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.778805 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dt6q\" (UniqueName: \"kubernetes.io/projected/8f517677-195d-4d43-ba46-e0f0aede7011-kube-api-access-8dt6q\") pod \"node-resolver-zf29d\" (UID: \"8f517677-195d-4d43-ba46-e0f0aede7011\") " pod="openshift-dns/node-resolver-zf29d" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.778850 4903 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.778861 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.778871 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.778880 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 
Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.778897 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.778906 4903 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\""
Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.778915 4903 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.778924 4903 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\""
Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.778932 4903 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.778941 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.778949 4903 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\""
Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.778957 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.778966 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\""
Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.778974 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.778983 4903 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\""
Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.778993 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\""
Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.779001 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.779009 4903 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\""
Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.779017 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.779025 4903 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.779034 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\""
Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.779043 4903 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\""
Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.779051 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.779060 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.779080 4903 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.779089 4903 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.779098 4903 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\""
Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.779106 4903 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.779116 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.779124 4903 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\""
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.779133 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.779141 4903 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.779149 4903 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.779158 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.779168 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.779179 4903 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.779188 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.779197 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.779206 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.779220 4903 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.779229 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.779225 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8f517677-195d-4d43-ba46-e0f0aede7011-hosts-file\") pod \"node-resolver-zf29d\" (UID: \"8f517677-195d-4d43-ba46-e0f0aede7011\") " pod="openshift-dns/node-resolver-zf29d" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.779238 4903 reconciler_common.go:293] "Volume 
detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.779306 4903 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.779321 4903 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.779334 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.779349 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.779363 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.779375 4903 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.779407 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.779556 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.780648 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.780705 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.780721 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.780733 4903 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.780745 4903 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.780757 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.780772 4903 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.780784 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.780796 4903 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.780807 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.780818 4903 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.780829 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.780841 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.780853 4903 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.780867 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.780879 4903 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.780890 4903 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.780902 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.780914 4903 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.780925 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.780937 4903 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.780949 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.780963 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.780975 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.780987 4903 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781000 4903 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781012 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781024 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781035 4903 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781046 4903 reconciler_common.go:293] "Volume detached for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781059 4903 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781070 4903 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781082 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781094 4903 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781105 4903 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781117 4903 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781129 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781140 4903 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781152 4903 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781164 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781176 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781187 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781198 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781209 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781222 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781236 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781247 4903 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781258 4903 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781270 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781284 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781295 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781307 4903 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781402 4903 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781421 4903 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781437 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781454 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781469 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781483 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781498 4903 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781513 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781526 4903 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781539 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781552 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781690 4903 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781710 4903 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781727 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781740 4903 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781754 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781765 4903 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781801 4903 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781813 4903 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781825 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781939 4903 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.781980 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.782158 4903 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.782252 4903 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.782266 4903 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.782279 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.782327 4903 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.782376 4903 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.782390 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.782402 4903 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.782415 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.782426 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.782438 4903 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.782461 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.782472 4903 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.782485 4903 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.782496 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.782508 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.782522 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.782534 4903 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.787883 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.798301 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dt6q\" (UniqueName: \"kubernetes.io/projected/8f517677-195d-4d43-ba46-e0f0aede7011-kube-api-access-8dt6q\") pod \"node-resolver-zf29d\" (UID: \"8f517677-195d-4d43-ba46-e0f0aede7011\") " pod="openshift-dns/node-resolver-zf29d" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.803119 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.811425 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.880921 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.889200 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.899478 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.906850 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-zf29d" Dec 02 22:58:00 crc kubenswrapper[4903]: W1202 22:58:00.915674 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-bc91d61cd29a2d86d8eac3f4a50c0e0287090e84f667bb77bd0865a082c5a31f WatchSource:0}: Error finding container bc91d61cd29a2d86d8eac3f4a50c0e0287090e84f667bb77bd0865a082c5a31f: Status 404 returned error can't find the container with id bc91d61cd29a2d86d8eac3f4a50c0e0287090e84f667bb77bd0865a082c5a31f Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.933809 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:58:00 crc kubenswrapper[4903]: W1202 22:58:00.937992 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f517677_195d_4d43_ba46_e0f0aede7011.slice/crio-62102afd097069614d1afad8c1ef4b8d783f50daae8939afbcfae317ab8bf056 WatchSource:0}: Error finding container 62102afd097069614d1afad8c1ef4b8d783f50daae8939afbcfae317ab8bf056: Status 404 returned error can't find the container with id 62102afd097069614d1afad8c1ef4b8d783f50daae8939afbcfae317ab8bf056 Dec 02 22:58:00 crc kubenswrapper[4903]: W1202 22:58:00.940141 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-244627d1213f32a886012a34dc71a857b61e53b2a10b9f5de7194e45f47add56 WatchSource:0}: Error finding container 244627d1213f32a886012a34dc71a857b61e53b2a10b9f5de7194e45f47add56: Status 404 returned error can't find the container with id 244627d1213f32a886012a34dc71a857b61e53b2a10b9f5de7194e45f47add56 Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.951131 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02
T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.968164 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:58:00 crc kubenswrapper[4903]: I1202 22:58:00.993691 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.017157 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:58:01 crc 
kubenswrapper[4903]: I1202 22:58:01.035074 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.053730 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.077407 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.095085 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.286036 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.286100 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.286123 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.286145 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.286163 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:58:01 crc kubenswrapper[4903]: E1202 22:58:01.286249 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:58:02.286216646 +0000 UTC m=+20.994770939 (durationBeforeRetry 1s). 
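
[Editor's note] Every "Failed to update status for pod" record above fails at the same hop: the kubelet's status_manager sends a status patch to the apiserver, the apiserver must first consult the pod.network-node-identity.openshift.io validating webhook at https://127.0.0.1:9743/pod, and that endpoint refuses connections because the network-node-identity webhook container has not come back up yet. A minimal reachability sketch in Go, assuming it is run on the node itself; the address and timeout come from the log, everything else is illustrative:

    package main

    import (
        "fmt"
        "net"
        "time"
    )

    func main() {
        // While this dial fails, the apiserver cannot deliver the admission
        // review, so the kubelet logs the patch failure and retries later.
        conn, err := net.DialTimeout("tcp", "127.0.0.1:9743", 2*time.Second)
        if err != nil {
            fmt.Println("webhook unreachable:", err) // "connect: connection refused" in the log
            return
        }
        conn.Close()
        fmt.Println("webhook endpoint is accepting connections")
    }

Once the webhook pod's containers start (visible in the PLEG events further down), the same patches begin failing on TLS instead, which is a different problem; see the note at the end of this section.
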
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:58:01 crc kubenswrapper[4903]: E1202 22:58:01.286297 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 22:58:01 crc kubenswrapper[4903]: E1202 22:58:01.286312 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 22:58:01 crc kubenswrapper[4903]: E1202 22:58:01.286324 4903 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:58:01 crc kubenswrapper[4903]: E1202 22:58:01.286375 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:02.2863619 +0000 UTC m=+20.994916183 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:58:01 crc kubenswrapper[4903]: E1202 22:58:01.286368 4903 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 22:58:01 crc kubenswrapper[4903]: E1202 22:58:01.286406 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 22:58:01 crc kubenswrapper[4903]: E1202 22:58:01.286500 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 22:58:01 crc kubenswrapper[4903]: E1202 22:58:01.286508 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:02.286469332 +0000 UTC m=+20.995023825 (durationBeforeRetry 1s). 
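
[Editor's note] The UnmountVolume.TearDown failure above is a registration race, not data loss: the kubelet tears down a CSI volume by looking the driver name up in its table of plugins that have announced themselves over the kubelet plugin-registration socket, and kubevirt.io.hostpath-provisioner simply has not re-registered since the restart. Roughly the lookup that produces this message, as a sketch; the map stands in for the kubelet's in-memory plugin registry:

    package main

    import "fmt"

    // registered stands in for the kubelet's CSI driver registry, which is
    // repopulated as each driver re-announces itself under
    // /var/lib/kubelet/plugins_registry after a kubelet restart.
    var registered = map[string]struct{}{}

    func csiClientFor(driver string) error {
        if _, ok := registered[driver]; !ok {
            return fmt.Errorf("driver name %s not found in the list of registered CSI drivers", driver)
        }
        return nil
    }

    func main() {
        // Right after startup the registry is empty, so TearDownAt fails and
        // the operation is requeued with a backoff (durationBeforeRetry 1s above).
        fmt.Println(csiClientFor("kubevirt.io.hostpath-provisioner"))
    }
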
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 22:58:01 crc kubenswrapper[4903]: E1202 22:58:01.286517 4903 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:58:01 crc kubenswrapper[4903]: E1202 22:58:01.286602 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:02.286575525 +0000 UTC m=+20.995130008 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:58:01 crc kubenswrapper[4903]: E1202 22:58:01.286622 4903 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 22:58:01 crc kubenswrapper[4903]: E1202 22:58:01.286685 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:02.286676707 +0000 UTC m=+20.995230990 (durationBeforeRetry 1s). 
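
[Editor's note] The "object ... not registered" failures are the same startup race on the configmap/secret side: kube-api-access-* volumes are projected volumes that stitch together the bound service account token, kube-root-ca.crt, and openshift-service-ca.crt, and the kubelet refuses to build them until its informer caches have seen those objects (the "Caches populated" lines further down are exactly that happening). For reference, the shape of such a projected volume written out with the real k8s.io/api/core/v1 types; this is a sketch, not this cluster's actual pod spec, and it needs the k8s.io/api module to build:

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    func main() {
        expiry := int64(3607) // typical bound-token lifetime for kube-api-access volumes
        vol := corev1.Volume{
            Name: "kube-api-access-cqllr", // volume name taken from the log
            VolumeSource: corev1.VolumeSource{
                Projected: &corev1.ProjectedVolumeSource{
                    Sources: []corev1.VolumeProjection{
                        {ServiceAccountToken: &corev1.ServiceAccountTokenProjection{
                            Path: "token", ExpirationSeconds: &expiry,
                        }},
                        // Each source below must be present in the kubelet's
                        // cache, or SetUp fails with "not registered".
                        {ConfigMap: &corev1.ConfigMapProjection{
                            LocalObjectReference: corev1.LocalObjectReference{Name: "kube-root-ca.crt"},
                            Items:                []corev1.KeyToPath{{Key: "ca.crt", Path: "ca.crt"}},
                        }},
                        {ConfigMap: &corev1.ConfigMapProjection{
                            LocalObjectReference: corev1.LocalObjectReference{Name: "openshift-service-ca.crt"},
                            Items:                []corev1.KeyToPath{{Key: "service-ca.crt", Path: "service-ca.crt"}},
                        }},
                    },
                },
            },
        }
        fmt.Println("projected volume:", vol.Name)
    }
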
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.457275 4903 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Dec 02 22:58:01 crc kubenswrapper[4903]: W1202 22:58:01.457778 4903 reflector.go:484] object-"openshift-dns"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-dns"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 02 22:58:01 crc kubenswrapper[4903]: W1202 22:58:01.457821 4903 reflector.go:484] object-"openshift-network-node-identity"/"network-node-identity-cert": watch of *v1.Secret ended with: very short watch: object-"openshift-network-node-identity"/"network-node-identity-cert": Unexpected watch close - watch lasted less than a second and no items received Dec 02 22:58:01 crc kubenswrapper[4903]: W1202 22:58:01.457854 4903 reflector.go:484] object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": watch of *v1.Secret ended with: very short watch: object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": Unexpected watch close - watch lasted less than a second and no items received Dec 02 22:58:01 crc kubenswrapper[4903]: W1202 22:58:01.457899 4903 reflector.go:484] object-"openshift-network-node-identity"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 02 22:58:01 crc kubenswrapper[4903]: W1202 22:58:01.457922 4903 reflector.go:484] object-"openshift-network-node-identity"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 02 22:58:01 crc kubenswrapper[4903]: W1202 22:58:01.457932 4903 reflector.go:484] object-"openshift-network-operator"/"iptables-alerter-script": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"iptables-alerter-script": Unexpected watch close - watch lasted less than a second and no items received Dec 02 22:58:01 crc kubenswrapper[4903]: W1202 22:58:01.457936 4903 reflector.go:484] object-"openshift-network-node-identity"/"ovnkube-identity-cm": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"ovnkube-identity-cm": Unexpected watch close - watch lasted less than a second and no items received Dec 02 22:58:01 crc kubenswrapper[4903]: W1202 22:58:01.457855 4903 reflector.go:484] object-"openshift-dns"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-dns"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 02 22:58:01 crc kubenswrapper[4903]: W1202 22:58:01.457872 4903 reflector.go:484] object-"openshift-network-node-identity"/"env-overrides": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"env-overrides": Unexpected watch close - watch lasted less than a second and 
no items received Dec 02 22:58:01 crc kubenswrapper[4903]: W1202 22:58:01.457970 4903 reflector.go:484] object-"openshift-network-operator"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 02 22:58:01 crc kubenswrapper[4903]: W1202 22:58:01.457984 4903 reflector.go:484] object-"openshift-network-operator"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 02 22:58:01 crc kubenswrapper[4903]: W1202 22:58:01.457983 4903 reflector.go:484] object-"openshift-network-operator"/"metrics-tls": watch of *v1.Secret ended with: very short watch: object-"openshift-network-operator"/"metrics-tls": Unexpected watch close - watch lasted less than a second and no items received Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.616076 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.616761 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.617721 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.618459 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.619172 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.619773 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.620471 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.621134 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.621958 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.622582 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: 
I1202 22:58:01.623203 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.624018 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.624647 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.628261 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.628903 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.630178 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.631886 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.636681 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.637646 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.638372 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.639434 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.640172 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.640700 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.642461 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.643408 4903 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.645483 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.646599 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.647411 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.647942 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.648429 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.648775 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.649209 4903 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.649303 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.651303 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.651838 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.652251 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.653637 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.654588 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.655102 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.656054 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.656752 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.657516 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.658093 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.659066 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.659610 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.660023 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.660413 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.660926 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.662435 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.664486 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.666498 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.667009 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" 
path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.667848 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.668343 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.669006 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.669863 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.677221 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.702967 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.721293 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-s4nbg"] Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.721872 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.721910 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-snl4q"] Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.722592 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.726147 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.726463 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.726642 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.726751 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.727001 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.727120 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.727380 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.729313 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.730800 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.731282 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.736667 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-tjcvg"] Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.736833 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.741664 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.745565 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"bc91d61cd29a2d86d8eac3f4a50c0e0287090e84f667bb77bd0865a082c5a31f"} Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.747680 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.747934 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.769024 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.770110 4903 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.771373 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e"} Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.771702 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.772930 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95"} Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.772958 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"80d7b82750ed839ba2ada35b4b8a534bcb71fd1a05c25040e85477cf1405e42f"} Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.774280 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zf29d" event={"ID":"8f517677-195d-4d43-ba46-e0f0aede7011","Type":"ContainerStarted","Data":"76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7"} Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.774322 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zf29d" event={"ID":"8f517677-195d-4d43-ba46-e0f0aede7011","Type":"ContainerStarted","Data":"62102afd097069614d1afad8c1ef4b8d783f50daae8939afbcfae317ab8bf056"} Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.776158 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af"} Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.776203 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957"} Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.776215 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"244627d1213f32a886012a34dc71a857b61e53b2a10b9f5de7194e45f47add56"} Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.777420 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.783962 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\
\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.789491 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-host-var-lib-cni-bin\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.789538 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3ef11e3b-7757-4286-9684-6d4cd3bf924f-proxy-tls\") pod \"machine-config-daemon-snl4q\" (UID: \"3ef11e3b-7757-4286-9684-6d4cd3bf924f\") " pod="openshift-machine-config-operator/machine-config-daemon-snl4q" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.789565 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/6b5d599c-d246-4f24-93ea-ace730325f84-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tjcvg\" (UID: \"6b5d599c-d246-4f24-93ea-ace730325f84\") " pod="openshift-multus/multus-additional-cni-plugins-tjcvg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.789592 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-host-var-lib-kubelet\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.789615 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-hostroot\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.789675 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-cnibin\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.789702 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-host-var-lib-cni-multus\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.789757 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-etc-kubernetes\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.789794 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3ef11e3b-7757-4286-9684-6d4cd3bf924f-mcd-auth-proxy-config\") pod \"machine-config-daemon-snl4q\" (UID: \"3ef11e3b-7757-4286-9684-6d4cd3bf924f\") " pod="openshift-machine-config-operator/machine-config-daemon-snl4q" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.789825 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6b5d599c-d246-4f24-93ea-ace730325f84-os-release\") pod \"multus-additional-cni-plugins-tjcvg\" (UID: \"6b5d599c-d246-4f24-93ea-ace730325f84\") " pod="openshift-multus/multus-additional-cni-plugins-tjcvg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.789859 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8jjp\" (UniqueName: \"kubernetes.io/projected/6b5d599c-d246-4f24-93ea-ace730325f84-kube-api-access-f8jjp\") pod \"multus-additional-cni-plugins-tjcvg\" (UID: \"6b5d599c-d246-4f24-93ea-ace730325f84\") " pod="openshift-multus/multus-additional-cni-plugins-tjcvg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.789908 4903 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6b5d599c-d246-4f24-93ea-ace730325f84-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tjcvg\" (UID: \"6b5d599c-d246-4f24-93ea-ace730325f84\") " pod="openshift-multus/multus-additional-cni-plugins-tjcvg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.789935 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-host-run-multus-certs\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.789959 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf98z\" (UniqueName: \"kubernetes.io/projected/3ef11e3b-7757-4286-9684-6d4cd3bf924f-kube-api-access-rf98z\") pod \"machine-config-daemon-snl4q\" (UID: \"3ef11e3b-7757-4286-9684-6d4cd3bf924f\") " pod="openshift-machine-config-operator/machine-config-daemon-snl4q" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.790042 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-cni-binary-copy\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.790077 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-host-run-k8s-cni-cncf-io\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.790140 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6b5d599c-d246-4f24-93ea-ace730325f84-cnibin\") pod \"multus-additional-cni-plugins-tjcvg\" (UID: \"6b5d599c-d246-4f24-93ea-ace730325f84\") " pod="openshift-multus/multus-additional-cni-plugins-tjcvg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.790174 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-os-release\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.790206 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-multus-daemon-config\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.790224 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz2mm\" (UniqueName: \"kubernetes.io/projected/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-kube-api-access-cz2mm\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " 
pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.790245 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-multus-socket-dir-parent\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.790263 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-host-run-netns\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.790283 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3ef11e3b-7757-4286-9684-6d4cd3bf924f-rootfs\") pod \"machine-config-daemon-snl4q\" (UID: \"3ef11e3b-7757-4286-9684-6d4cd3bf924f\") " pod="openshift-machine-config-operator/machine-config-daemon-snl4q" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.790300 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6b5d599c-d246-4f24-93ea-ace730325f84-cni-binary-copy\") pod \"multus-additional-cni-plugins-tjcvg\" (UID: \"6b5d599c-d246-4f24-93ea-ace730325f84\") " pod="openshift-multus/multus-additional-cni-plugins-tjcvg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.790319 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-system-cni-dir\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.790334 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-multus-conf-dir\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.790354 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6b5d599c-d246-4f24-93ea-ace730325f84-system-cni-dir\") pod \"multus-additional-cni-plugins-tjcvg\" (UID: \"6b5d599c-d246-4f24-93ea-ace730325f84\") " pod="openshift-multus/multus-additional-cni-plugins-tjcvg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.790372 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-multus-cni-dir\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.795085 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.805769 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.817866 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.829384 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.840025 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.855813 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.866848 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.881253 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.891143 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-cni-binary-copy\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.891191 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-host-run-k8s-cni-cncf-io\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.891226 
4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6b5d599c-d246-4f24-93ea-ace730325f84-cnibin\") pod \"multus-additional-cni-plugins-tjcvg\" (UID: \"6b5d599c-d246-4f24-93ea-ace730325f84\") " pod="openshift-multus/multus-additional-cni-plugins-tjcvg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.891268 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-os-release\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.891296 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-multus-daemon-config\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.891319 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz2mm\" (UniqueName: \"kubernetes.io/projected/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-kube-api-access-cz2mm\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.891343 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-multus-socket-dir-parent\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.891351 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-host-run-k8s-cni-cncf-io\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.891422 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-host-run-netns\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.891363 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6b5d599c-d246-4f24-93ea-ace730325f84-cnibin\") pod \"multus-additional-cni-plugins-tjcvg\" (UID: \"6b5d599c-d246-4f24-93ea-ace730325f84\") " pod="openshift-multus/multus-additional-cni-plugins-tjcvg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.891368 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-host-run-netns\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.891449 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-multus-socket-dir-parent\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.891480 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3ef11e3b-7757-4286-9684-6d4cd3bf924f-rootfs\") pod \"machine-config-daemon-snl4q\" (UID: \"3ef11e3b-7757-4286-9684-6d4cd3bf924f\") " pod="openshift-machine-config-operator/machine-config-daemon-snl4q" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.891499 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6b5d599c-d246-4f24-93ea-ace730325f84-cni-binary-copy\") pod \"multus-additional-cni-plugins-tjcvg\" (UID: \"6b5d599c-d246-4f24-93ea-ace730325f84\") " pod="openshift-multus/multus-additional-cni-plugins-tjcvg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.891520 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-system-cni-dir\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.891536 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-multus-conf-dir\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.891551 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6b5d599c-d246-4f24-93ea-ace730325f84-system-cni-dir\") pod \"multus-additional-cni-plugins-tjcvg\" (UID: \"6b5d599c-d246-4f24-93ea-ace730325f84\") " pod="openshift-multus/multus-additional-cni-plugins-tjcvg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.891575 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-multus-cni-dir\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.891597 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-host-var-lib-cni-bin\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.891617 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3ef11e3b-7757-4286-9684-6d4cd3bf924f-proxy-tls\") pod \"machine-config-daemon-snl4q\" (UID: \"3ef11e3b-7757-4286-9684-6d4cd3bf924f\") " pod="openshift-machine-config-operator/machine-config-daemon-snl4q" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.891681 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6b5d599c-d246-4f24-93ea-ace730325f84-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-tjcvg\" (UID: \"6b5d599c-d246-4f24-93ea-ace730325f84\") " pod="openshift-multus/multus-additional-cni-plugins-tjcvg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.891678 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-os-release\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.891699 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-host-var-lib-kubelet\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.891741 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3ef11e3b-7757-4286-9684-6d4cd3bf924f-rootfs\") pod \"machine-config-daemon-snl4q\" (UID: \"3ef11e3b-7757-4286-9684-6d4cd3bf924f\") " pod="openshift-machine-config-operator/machine-config-daemon-snl4q" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.891776 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-hostroot\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.891838 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-cnibin\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.891861 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-host-var-lib-cni-multus\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.891884 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-etc-kubernetes\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.891906 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3ef11e3b-7757-4286-9684-6d4cd3bf924f-mcd-auth-proxy-config\") pod \"machine-config-daemon-snl4q\" (UID: \"3ef11e3b-7757-4286-9684-6d4cd3bf924f\") " pod="openshift-machine-config-operator/machine-config-daemon-snl4q" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.891926 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6b5d599c-d246-4f24-93ea-ace730325f84-os-release\") pod \"multus-additional-cni-plugins-tjcvg\" (UID: \"6b5d599c-d246-4f24-93ea-ace730325f84\") " pod="openshift-multus/multus-additional-cni-plugins-tjcvg" Dec 02 22:58:01 crc 
kubenswrapper[4903]: I1202 22:58:01.891969 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8jjp\" (UniqueName: \"kubernetes.io/projected/6b5d599c-d246-4f24-93ea-ace730325f84-kube-api-access-f8jjp\") pod \"multus-additional-cni-plugins-tjcvg\" (UID: \"6b5d599c-d246-4f24-93ea-ace730325f84\") " pod="openshift-multus/multus-additional-cni-plugins-tjcvg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.892016 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6b5d599c-d246-4f24-93ea-ace730325f84-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tjcvg\" (UID: \"6b5d599c-d246-4f24-93ea-ace730325f84\") " pod="openshift-multus/multus-additional-cni-plugins-tjcvg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.892039 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-host-run-multus-certs\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.892059 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf98z\" (UniqueName: \"kubernetes.io/projected/3ef11e3b-7757-4286-9684-6d4cd3bf924f-kube-api-access-rf98z\") pod \"machine-config-daemon-snl4q\" (UID: \"3ef11e3b-7757-4286-9684-6d4cd3bf924f\") " pod="openshift-machine-config-operator/machine-config-daemon-snl4q" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.892175 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-multus-cni-dir\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.892224 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-system-cni-dir\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.892255 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-multus-conf-dir\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.892285 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6b5d599c-d246-4f24-93ea-ace730325f84-system-cni-dir\") pod \"multus-additional-cni-plugins-tjcvg\" (UID: \"6b5d599c-d246-4f24-93ea-ace730325f84\") " pod="openshift-multus/multus-additional-cni-plugins-tjcvg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.891725 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-host-var-lib-kubelet\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.892333 4903 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6b5d599c-d246-4f24-93ea-ace730325f84-os-release\") pod \"multus-additional-cni-plugins-tjcvg\" (UID: \"6b5d599c-d246-4f24-93ea-ace730325f84\") " pod="openshift-multus/multus-additional-cni-plugins-tjcvg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.892529 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-hostroot\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.892753 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6b5d599c-d246-4f24-93ea-ace730325f84-cni-binary-copy\") pod \"multus-additional-cni-plugins-tjcvg\" (UID: \"6b5d599c-d246-4f24-93ea-ace730325f84\") " pod="openshift-multus/multus-additional-cni-plugins-tjcvg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.892818 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-host-var-lib-cni-multus\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.892825 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-etc-kubernetes\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.892867 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-cnibin\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.893017 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-host-var-lib-cni-bin\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.893141 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-multus-daemon-config\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.893234 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-host-run-multus-certs\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.893676 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6b5d599c-d246-4f24-93ea-ace730325f84-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tjcvg\" 
(UID: \"6b5d599c-d246-4f24-93ea-ace730325f84\") " pod="openshift-multus/multus-additional-cni-plugins-tjcvg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.893764 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3ef11e3b-7757-4286-9684-6d4cd3bf924f-mcd-auth-proxy-config\") pod \"machine-config-daemon-snl4q\" (UID: \"3ef11e3b-7757-4286-9684-6d4cd3bf924f\") " pod="openshift-machine-config-operator/machine-config-daemon-snl4q" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.893781 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6b5d599c-d246-4f24-93ea-ace730325f84-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tjcvg\" (UID: \"6b5d599c-d246-4f24-93ea-ace730325f84\") " pod="openshift-multus/multus-additional-cni-plugins-tjcvg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.894328 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-cni-binary-copy\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.905505 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3ef11e3b-7757-4286-9684-6d4cd3bf924f-proxy-tls\") pod \"machine-config-daemon-snl4q\" (UID: \"3ef11e3b-7757-4286-9684-6d4cd3bf924f\") " pod="openshift-machine-config-operator/machine-config-daemon-snl4q" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.905812 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.908755 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8jjp\" (UniqueName: \"kubernetes.io/projected/6b5d599c-d246-4f24-93ea-ace730325f84-kube-api-access-f8jjp\") pod \"multus-additional-cni-plugins-tjcvg\" (UID: \"6b5d599c-d246-4f24-93ea-ace730325f84\") " pod="openshift-multus/multus-additional-cni-plugins-tjcvg" Dec 02 22:58:01 crc 
kubenswrapper[4903]: I1202 22:58:01.911488 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf98z\" (UniqueName: \"kubernetes.io/projected/3ef11e3b-7757-4286-9684-6d4cd3bf924f-kube-api-access-rf98z\") pod \"machine-config-daemon-snl4q\" (UID: \"3ef11e3b-7757-4286-9684-6d4cd3bf924f\") " pod="openshift-machine-config-operator/machine-config-daemon-snl4q" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.914628 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz2mm\" (UniqueName: \"kubernetes.io/projected/a689512c-b6fd-4ffe-af54-dbb8f45ab9e5-kube-api-access-cz2mm\") pod \"multus-s4nbg\" (UID: \"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\") " pod="openshift-multus/multus-s4nbg" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.918632 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.929998 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:01 crc kubenswrapper[4903]: I1202 22:58:01.947883 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.037941 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-s4nbg" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.087882 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.093299 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" Dec 02 22:58:02 crc kubenswrapper[4903]: W1202 22:58:02.118388 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b5d599c_d246_4f24_93ea_ace730325f84.slice/crio-8d0ad6232416a74ae882430bb999ee7e074ee8862acf53c749ab28a0d4810e6d WatchSource:0}: Error finding container 8d0ad6232416a74ae882430bb999ee7e074ee8862acf53c749ab28a0d4810e6d: Status 404 returned error can't find the container with id 8d0ad6232416a74ae882430bb999ee7e074ee8862acf53c749ab28a0d4810e6d Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.125042 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pz9ff"] Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.129304 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.133983 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.134177 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.134264 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.134334 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.134954 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.135172 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.136108 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.147362 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.164986 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.185808 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.195134 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-kubelet\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.195176 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-run-openvswitch\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 
22:58:02.195194 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-cni-bin\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.195211 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/99ab90b8-4bb9-418c-8b55-19c4c10edec7-ovn-node-metrics-cert\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.195231 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/99ab90b8-4bb9-418c-8b55-19c4c10edec7-ovnkube-script-lib\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.195330 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-run-ovn\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.195381 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/99ab90b8-4bb9-418c-8b55-19c4c10edec7-ovnkube-config\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.195475 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-cni-netd\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.195495 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.195529 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-log-socket\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.195547 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8lp5\" (UniqueName: \"kubernetes.io/projected/99ab90b8-4bb9-418c-8b55-19c4c10edec7-kube-api-access-l8lp5\") pod \"ovnkube-node-pz9ff\" (UID: 
\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.195567 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-var-lib-openvswitch\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.195587 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-systemd-units\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.195620 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-etc-openvswitch\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.195694 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-node-log\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.195711 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/99ab90b8-4bb9-418c-8b55-19c4c10edec7-env-overrides\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.195739 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-run-netns\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.195763 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-run-systemd\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.195780 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-run-ovn-kubernetes\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.195802 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-slash\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.205481 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.220880 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.242065 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.258127 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.269979 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.273605 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.288425 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\
\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.296294 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.296402 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/99ab90b8-4bb9-418c-8b55-19c4c10edec7-ovnkube-script-lib\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.296438 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-run-ovn\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.296465 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/99ab90b8-4bb9-418c-8b55-19c4c10edec7-ovnkube-config\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.296487 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-cni-netd\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.296511 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.296536 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.296562 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.296587 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-log-socket\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.296607 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8lp5\" (UniqueName: \"kubernetes.io/projected/99ab90b8-4bb9-418c-8b55-19c4c10edec7-kube-api-access-l8lp5\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.296632 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-var-lib-openvswitch\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.296675 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-systemd-units\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.296697 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-etc-openvswitch\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.296719 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.296730 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-cni-netd\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.296742 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-node-log\") pod 
\"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.296785 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-node-log\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.296803 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/99ab90b8-4bb9-418c-8b55-19c4c10edec7-env-overrides\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.296831 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.296849 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-run-netns\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.296870 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-run-systemd\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.296886 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-run-ovn-kubernetes\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: E1202 22:58:02.296907 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 22:58:02 crc kubenswrapper[4903]: E1202 22:58:02.296931 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.296934 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-slash\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: E1202 22:58:02.296946 4903 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.296958 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-run-ovn\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: E1202 22:58:02.296994 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:04.296976181 +0000 UTC m=+23.005530524 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.297145 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-var-lib-openvswitch\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: E1202 22:58:02.297244 4903 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.297292 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-systemd-units\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: E1202 22:58:02.297299 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:04.297277298 +0000 UTC m=+23.005831581 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.297323 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.297342 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-etc-openvswitch\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.297352 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-log-socket\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: E1202 22:58:02.297403 4903 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 22:58:02 crc kubenswrapper[4903]: E1202 22:58:02.297450 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:04.297438772 +0000 UTC m=+23.005993105 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.296913 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-slash\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.297489 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/99ab90b8-4bb9-418c-8b55-19c4c10edec7-ovn-node-metrics-cert\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.297513 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-kubelet\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.297531 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/99ab90b8-4bb9-418c-8b55-19c4c10edec7-ovnkube-config\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.297537 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-run-openvswitch\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.297561 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-cni-bin\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.297625 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-cni-bin\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.297625 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-kubelet\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.297638 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-run-systemd\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.297672 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-run-ovn-kubernetes\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: E1202 22:58:02.297788 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 22:58:02 crc kubenswrapper[4903]: E1202 22:58:02.297803 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 22:58:02 crc kubenswrapper[4903]: E1202 22:58:02.297813 4903 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:58:02 crc kubenswrapper[4903]: E1202 22:58:02.297849 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:04.297837902 +0000 UTC m=+23.006392185 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:58:02 crc kubenswrapper[4903]: E1202 22:58:02.297879 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:58:04.297873883 +0000 UTC m=+23.006428166 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.297895 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-run-netns\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.297943 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-run-openvswitch\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.298531 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/99ab90b8-4bb9-418c-8b55-19c4c10edec7-ovnkube-script-lib\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.298727 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/99ab90b8-4bb9-418c-8b55-19c4c10edec7-env-overrides\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.304468 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/99ab90b8-4bb9-418c-8b55-19c4c10edec7-ovn-node-metrics-cert\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.318853 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb 
sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath
\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz9ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:02 crc 
kubenswrapper[4903]: I1202 22:58:02.320703 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8lp5\" (UniqueName: \"kubernetes.io/projected/99ab90b8-4bb9-418c-8b55-19c4c10edec7-kube-api-access-l8lp5\") pod \"ovnkube-node-pz9ff\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.339600 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.360393 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.406987 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.411824 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.442562 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.444908 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 02 22:58:02 crc kubenswrapper[4903]: W1202 22:58:02.452400 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99ab90b8_4bb9_418c_8b55_19c4c10edec7.slice/crio-2e8025f3ad329533b26e7396411910b27aa262df5ca5ebfc9375e593401aa1d4 WatchSource:0}: Error finding container 2e8025f3ad329533b26e7396411910b27aa262df5ca5ebfc9375e593401aa1d4: Status 404 returned error can't find the container with id 2e8025f3ad329533b26e7396411910b27aa262df5ca5ebfc9375e593401aa1d4 Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.511592 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.573555 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.585159 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.610390 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.611781 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:58:02 crc kubenswrapper[4903]: E1202 22:58:02.611934 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.612365 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:02 crc kubenswrapper[4903]: E1202 22:58:02.612448 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.612519 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:58:02 crc kubenswrapper[4903]: E1202 22:58:02.612586 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.779690 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.780221 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s4nbg" event={"ID":"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5","Type":"ContainerStarted","Data":"4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b"} Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.780255 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s4nbg" event={"ID":"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5","Type":"ContainerStarted","Data":"f8dc55e038c502b3a1ca3b0d1e45941293c6acf9f76802e163b57e2aa98007f2"} Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.782908 4903 generic.go:334] "Generic (PLEG): container finished" podID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerID="94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78" exitCode=0 Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.782959 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" event={"ID":"99ab90b8-4bb9-418c-8b55-19c4c10edec7","Type":"ContainerDied","Data":"94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78"} Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.782979 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" event={"ID":"99ab90b8-4bb9-418c-8b55-19c4c10edec7","Type":"ContainerStarted","Data":"2e8025f3ad329533b26e7396411910b27aa262df5ca5ebfc9375e593401aa1d4"} Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.784809 4903 generic.go:334] "Generic (PLEG): container finished" podID="6b5d599c-d246-4f24-93ea-ace730325f84" containerID="b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b" exitCode=0 Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.784853 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" event={"ID":"6b5d599c-d246-4f24-93ea-ace730325f84","Type":"ContainerDied","Data":"b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b"} Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.784869 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" event={"ID":"6b5d599c-d246-4f24-93ea-ace730325f84","Type":"ContainerStarted","Data":"8d0ad6232416a74ae882430bb999ee7e074ee8862acf53c749ab28a0d4810e6d"} Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.788310 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerStarted","Data":"97f98df328fe250afd95449707372ef1aea43e355ee20edf8366f510a888b9a1"} Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.788339 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerStarted","Data":"717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757"} Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.788359 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerStarted","Data":"2cfb2d88b43ac9422870078fe3a5825bd7109d9e3f0f5d2234f89e0122bd0122"} Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.795739 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.805976 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.837181 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.859691 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.895768 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz9ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.913240 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.931012 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.946889 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.959447 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.972609 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:02 crc kubenswrapper[4903]: I1202 22:58:02.985876 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.003784 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.016866 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.017400 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.031873 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.043184 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.044816 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f98df328fe250afd95449707372ef1aea43e355ee20edf8366f510a888b9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.058207 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.071189 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.092793 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz9ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z 
is after 2025-08-24T17:21:41Z" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.105789 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.118764 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.133342 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.148241 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.166735 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.169764 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:03 crc 
kubenswrapper[4903]: I1202 22:58:03.172593 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.181442 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.189091 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.204249 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.226624 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.242062 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.264645 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.278149 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-vhm2r"] Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.278704 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vhm2r" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.281790 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.283009 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.283009 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.284305 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.285701 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f98df328fe250afd95449707372ef1aea43e355ee20edf8366f510a888b9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.312766 4903 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.323708 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a9f99205-0a32-4a74-ad9e-c0a79aa66d1b-host\") pod \"node-ca-vhm2r\" (UID: \"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\") " pod="openshift-image-registry/node-ca-vhm2r" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.323811 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a9f99205-0a32-4a74-ad9e-c0a79aa66d1b-serviceca\") pod \"node-ca-vhm2r\" (UID: \"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\") " pod="openshift-image-registry/node-ca-vhm2r" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.323910 4903 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv57c\" (UniqueName: \"kubernetes.io/projected/a9f99205-0a32-4a74-ad9e-c0a79aa66d1b-kube-api-access-xv57c\") pod \"node-ca-vhm2r\" (UID: \"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\") " pod="openshift-image-registry/node-ca-vhm2r" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.340942 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.364863 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.392357 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.416780 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.424831 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv57c\" (UniqueName: \"kubernetes.io/projected/a9f99205-0a32-4a74-ad9e-c0a79aa66d1b-kube-api-access-xv57c\") pod \"node-ca-vhm2r\" (UID: \"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\") " pod="openshift-image-registry/node-ca-vhm2r" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.424878 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a9f99205-0a32-4a74-ad9e-c0a79aa66d1b-host\") pod \"node-ca-vhm2r\" (UID: \"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\") " pod="openshift-image-registry/node-ca-vhm2r" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.424931 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a9f99205-0a32-4a74-ad9e-c0a79aa66d1b-serviceca\") pod \"node-ca-vhm2r\" (UID: \"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\") " pod="openshift-image-registry/node-ca-vhm2r" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.426020 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a9f99205-0a32-4a74-ad9e-c0a79aa66d1b-serviceca\") pod \"node-ca-vhm2r\" (UID: \"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\") " pod="openshift-image-registry/node-ca-vhm2r" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.426368 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a9f99205-0a32-4a74-ad9e-c0a79aa66d1b-host\") pod \"node-ca-vhm2r\" (UID: \"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\") " pod="openshift-image-registry/node-ca-vhm2r" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.443733 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:03 crc 
kubenswrapper[4903]: I1202 22:58:03.465996 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv57c\" (UniqueName: \"kubernetes.io/projected/a9f99205-0a32-4a74-ad9e-c0a79aa66d1b-kube-api-access-xv57c\") pod \"node-ca-vhm2r\" (UID: \"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\") " pod="openshift-image-registry/node-ca-vhm2r" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.477896 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz9ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z 
is after 2025-08-24T17:21:41Z" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.499311 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"823a3cb9-fe97-46de-8791-375103e2ec0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242c60254f2b4a3c9e849384d813e56c75cf2f60705e0b1e7ca2707663b3f029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7dac854f7481fe71dcc8976f658e52ad759ca0faf9205d50b1bec7a28575c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://716e5b24fb81d3b6e884f43a9299fbe4397d133daa4727c659c633416c14bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.523614 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.541992 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-api
server-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.560889 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.578192 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:03 crc 
kubenswrapper[4903]: I1202 22:58:03.593324 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vhm2r" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.602789 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"im
ageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\
\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.
126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz9ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:03 crc kubenswrapper[4903]: W1202 22:58:03.614241 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9f99205_0a32_4a74_ad9e_c0a79aa66d1b.slice/crio-216faf41ae18eff2ad1f9fc1a7a502311a9898d4d9a07e73049928f4845ae80e WatchSource:0}: Error finding container 216faf41ae18eff2ad1f9fc1a7a502311a9898d4d9a07e73049928f4845ae80e: Status 404 returned error can't find the container with id 216faf41ae18eff2ad1f9fc1a7a502311a9898d4d9a07e73049928f4845ae80e Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.627053 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"823a3cb9-fe97-46de-8791-375103e2ec0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242c60254f2b4a3c9e849384d813e56c75cf2f60705e0b1e7ca2707663b3f029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7dac854f7481fe71dcc8976f658e52ad759ca0faf9205d50b1bec7a28575c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://716e5b24fb81d3b6e884f43a9299fbe4397d133daa4727c659c633416c14bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.668716 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z"
Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.701366 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z"
Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.744924 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z"
Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.781559 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z"
Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.801471 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vhm2r" event={"ID":"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b","Type":"ContainerStarted","Data":"216faf41ae18eff2ad1f9fc1a7a502311a9898d4d9a07e73049928f4845ae80e"}
Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.805008 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" event={"ID":"99ab90b8-4bb9-418c-8b55-19c4c10edec7","Type":"ContainerStarted","Data":"f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a"}
Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.805049 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" event={"ID":"99ab90b8-4bb9-418c-8b55-19c4c10edec7","Type":"ContainerStarted","Data":"88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648"}
Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.805058 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" event={"ID":"99ab90b8-4bb9-418c-8b55-19c4c10edec7","Type":"ContainerStarted","Data":"fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b"}
Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.805067 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" event={"ID":"99ab90b8-4bb9-418c-8b55-19c4c10edec7","Type":"ContainerStarted","Data":"ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f"}
Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.806513 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"71a8643ea97f6edef5a93aa0369656bd7d630d555bd3a7c891b71305353faeff"}
Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.808450 4903 generic.go:334] "Generic (PLEG): container finished" podID="6b5d599c-d246-4f24-93ea-ace730325f84" containerID="19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894" exitCode=0
Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.808494 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" event={"ID":"6b5d599c-d246-4f24-93ea-ace730325f84","Type":"ContainerDied","Data":"19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894"}
Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.819952 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vhm2r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv57c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vhm2r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z"
Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.862584 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z"
Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.898310 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z"
Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.943271 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z"
Dec 02 22:58:03 crc kubenswrapper[4903]: I1202 22:58:03.981432 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f98df328fe250afd95449707372ef1aea43e355ee20edf8366f510a888b9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:03Z is after 2025-08-24T17:21:41Z"
Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.027918 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:04Z is after 2025-08-24T17:21:41Z"
Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.062065 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a8643ea97f6edef5a93aa0369656bd7d630d555bd3a7c891b71305353faeff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:04Z is after 2025-08-24T17:21:41Z"
Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.104189 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:04Z is after 2025-08-24T17:21:41Z"
Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.144488 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:04Z is after 2025-08-24T17:21:41Z"
Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.183355 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:04Z is after 2025-08-24T17:21:41Z"
Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.222934 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:04Z is after 2025-08-24T17:21:41Z"
Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.267706 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:04Z is after 2025-08-24T17:21:41Z"
Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.317311 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"
host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz9ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-02T22:58:04Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.337127 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.337260 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.337290 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:58:04 crc kubenswrapper[4903]: E1202 22:58:04.337380 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:58:08.337341571 +0000 UTC m=+27.045895884 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:58:04 crc kubenswrapper[4903]: E1202 22:58:04.337400 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 22:58:04 crc kubenswrapper[4903]: E1202 22:58:04.337421 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 22:58:04 crc kubenswrapper[4903]: E1202 22:58:04.337434 4903 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:58:04 crc kubenswrapper[4903]: E1202 22:58:04.337482 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:08.337465964 +0000 UTC m=+27.046020257 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:58:04 crc kubenswrapper[4903]: E1202 22:58:04.337486 4903 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 22:58:04 crc kubenswrapper[4903]: E1202 22:58:04.337538 4903 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.337503 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:04 crc kubenswrapper[4903]: E1202 22:58:04.337576 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:08.337556256 +0000 UTC m=+27.046110549 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.337626 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:58:04 crc kubenswrapper[4903]: E1202 22:58:04.337682 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:08.337672789 +0000 UTC m=+27.046227082 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 22:58:04 crc kubenswrapper[4903]: E1202 22:58:04.337725 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 22:58:04 crc kubenswrapper[4903]: E1202 22:58:04.337741 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 22:58:04 crc kubenswrapper[4903]: E1202 22:58:04.337756 4903 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:58:04 crc kubenswrapper[4903]: E1202 22:58:04.337796 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:08.337783202 +0000 UTC m=+27.046337615 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.346918 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"823a3cb9-fe97-46de-8791-375103e2ec0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242c60254f2b4a3c9e849384d813e56c75cf2f60705e0b1e7ca2707663b3f029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7dac854f7481fe71dcc8976f658e52ad759ca0faf9205d50b1bec7a28575c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://716e5b24fb81d3b6e884f43a9299fbe4397d133daa4727c659c633416c14bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:04Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.380786 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vhm2r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv57c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vhm2r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:04Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.401203 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.403345 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.403407 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.403425 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.403536 4903 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.418448 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:04Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.471779 4903 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.472107 4903 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.473635 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.473718 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.473735 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.473758 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.473776 4903 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:04Z","lastTransitionTime":"2025-12-02T22:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:04 crc kubenswrapper[4903]: E1202 22:58:04.497447 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5eef24b0-10c3-4ee6-bf4d-784c2d2e5050\\\",\\\"systemUUID\\\":\\\"3787324b-4c61-413b-8321-e9e2f283e2ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:04Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.503142 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.503200 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.503220 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.503246 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.503265 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:04Z","lastTransitionTime":"2025-12-02T22:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.506560 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:04Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:04 crc kubenswrapper[4903]: E1202 22:58:04.523378 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5eef24b0-10c3-4ee6-bf4d-784c2d2e5050\\\",\\\"systemUUID\\\":\\\"3787324b-4c61-413b-8321-e9e2f283e2ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:04Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.529179 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.529236 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.529255 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.529282 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.529304 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:04Z","lastTransitionTime":"2025-12-02T22:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.544683 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f98df328fe250afd95449707372ef1aea43e355ee20edf8366f510a888b9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running
\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:04Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:04 crc kubenswrapper[4903]: E1202 22:58:04.548400 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5eef24b0-10c3-4ee6-bf4d-784c2d2e5050\\\",\\\"systemUUID\\\":\\\"3787324b-4c61-413b-8321-e9e2f283e2ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:04Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.554171 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.554230 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.554249 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.554276 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.554294 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:04Z","lastTransitionTime":"2025-12-02T22:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:04 crc kubenswrapper[4903]: E1202 22:58:04.576503 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5eef24b0-10c3-4ee6-bf4d-784c2d2e5050\\\",\\\"systemUUID\\\":\\\"3787324b-4c61-413b-8321-e9e2f283e2ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:04Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.581039 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.581101 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.581118 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.581148 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.581167 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:04Z","lastTransitionTime":"2025-12-02T22:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.581617 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:04Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:04 crc kubenswrapper[4903]: E1202 22:58:04.601128 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5eef24b0-10c3-4ee6-bf4d-784c2d2e5050\\\",\\\"systemUUID\\\":\\\"3787324b-4c61-413b-8321-e9e2f283e2ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:04Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:04 crc kubenswrapper[4903]: E1202 22:58:04.601283 4903 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.604271 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.604303 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.604355 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.604375 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.604385 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:04Z","lastTransitionTime":"2025-12-02T22:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.611556 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.611790 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.611896 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:04 crc kubenswrapper[4903]: E1202 22:58:04.612260 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:58:04 crc kubenswrapper[4903]: E1202 22:58:04.612388 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:58:04 crc kubenswrapper[4903]: E1202 22:58:04.612527 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.711315 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.711376 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.711393 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.711423 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.711440 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:04Z","lastTransitionTime":"2025-12-02T22:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.814722 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.814774 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.814791 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.814816 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.814833 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:04Z","lastTransitionTime":"2025-12-02T22:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.819392 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" event={"ID":"99ab90b8-4bb9-418c-8b55-19c4c10edec7","Type":"ContainerStarted","Data":"27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70"} Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.819452 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" event={"ID":"99ab90b8-4bb9-418c-8b55-19c4c10edec7","Type":"ContainerStarted","Data":"6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402"} Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.827959 4903 generic.go:334] "Generic (PLEG): container finished" podID="6b5d599c-d246-4f24-93ea-ace730325f84" containerID="8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2" exitCode=0 Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.828039 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" event={"ID":"6b5d599c-d246-4f24-93ea-ace730325f84","Type":"ContainerDied","Data":"8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2"} Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.831568 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vhm2r" event={"ID":"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b","Type":"ContainerStarted","Data":"5d0ec8e3f725a9f00d1ee57d57d56f537077b5004dbd9314aae3ec85e3a09287"} Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.849609 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:04Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.871027 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:04Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.892594 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:04Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.913265 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:04Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.919462 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.919553 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.919590 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.919626 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.919649 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:04Z","lastTransitionTime":"2025-12-02T22:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.945200 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:04Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:04 crc kubenswrapper[4903]: I1202 22:58:04.979682 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz9ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:04Z 
is after 2025-08-24T17:21:41Z" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.001124 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"823a3cb9-fe97-46de-8791-375103e2ec0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242c60254f2b4a3c9e849384d813e56c75cf2f60705e0b1e7ca2707663b3f029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7dac854f7481fe71dcc8976f658e52ad759ca0faf9205d50b1bec7a28575c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://716e5b24fb81d3b6e884f43a9299fbe4397d133daa4727c659c633416c14bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:04Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.015389 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vhm2r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv57c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vhm2r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.023303 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.023352 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.023362 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.023380 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.023394 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:05Z","lastTransitionTime":"2025-12-02T22:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.033049 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.081117 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.097451 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.110132 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f98df328fe250afd95449707372ef1aea43e355ee20edf8366f510a888b9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.125440 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.125482 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.125497 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.125518 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.125534 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:05Z","lastTransitionTime":"2025-12-02T22:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.129643 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a8643ea97f6edef5a93aa0369656bd7d630d555bd3a7c891b71305353faeff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.146736 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.184284 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.222687 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a8643ea97f6edef5a93aa0369656bd7d630d555bd3a7c891b71305353faeff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.228898 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.228940 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.228952 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.228972 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.228985 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:05Z","lastTransitionTime":"2025-12-02T22:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.265934 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.304031 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.332051 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.332130 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.332144 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.332169 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.332188 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:05Z","lastTransitionTime":"2025-12-02T22:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.355836 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f7735
6dd8f6a9f05e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz9ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.382473 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"823a3cb9-fe97-46de-8791-375103e2ec0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242c60254f2b4a3c9e849384d813e56c75cf2f60705e0b1e7ca2707663b3f029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7dac854f7481fe71dcc8976f658e52ad759ca0faf9205d50b1bec7a28575c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://716e5b24fb81d3b6e884f43a9299fbe4397d133daa4727c659c633416c14bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.427495 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.435399 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.435476 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.435500 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.435532 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.435553 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:05Z","lastTransitionTime":"2025-12-02T22:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.466842 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.503122 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.538997 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.539060 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.539078 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.539102 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.539120 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:05Z","lastTransitionTime":"2025-12-02T22:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.542961 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vhm2r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ec8e3f725a9f00d1ee57d57d56f537077b5004dbd9314aae3ec85e3a09287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv57c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vhm2r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.584809 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.623497 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.642438 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.642498 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.642515 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.642544 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.642563 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:05Z","lastTransitionTime":"2025-12-02T22:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.664563 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.702700 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f98df328fe250afd95449707372ef1aea43e355ee20edf8366f510a888b9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.745452 4903 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.745607 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.745637 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.745700 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.745725 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:05Z","lastTransitionTime":"2025-12-02T22:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.840710 4903 generic.go:334] "Generic (PLEG): container finished" podID="6b5d599c-d246-4f24-93ea-ace730325f84" containerID="d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58" exitCode=0 Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.840806 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" event={"ID":"6b5d599c-d246-4f24-93ea-ace730325f84","Type":"ContainerDied","Data":"d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58"} Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.848685 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.848737 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.848757 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.848783 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.848802 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:05Z","lastTransitionTime":"2025-12-02T22:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.864915 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.881370 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.901532 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.919553 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f98df328fe250afd95449707372ef1aea43e355ee20edf8366f510a888b9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.943991 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.951209 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.951419 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.951550 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.951759 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.951896 4903 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:05Z","lastTransitionTime":"2025-12-02T22:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.960861 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a8643ea97f6edef5a93aa0369656bd7d630d555bd3a7c891b71305353faeff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:05 crc kubenswrapper[4903]: I1202 22:58:05.986376 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.029470 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},
{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:06Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.055718 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.055764 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.055780 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.055805 4903 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.055827 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:06Z","lastTransitionTime":"2025-12-02T22:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.075956 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz9ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:06Z 
is after 2025-08-24T17:21:41Z" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.105594 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"823a3cb9-fe97-46de-8791-375103e2ec0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242c60254f2b4a3c9e849384d813e56c75cf2f60705e0b1e7ca2707663b3f029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7dac854f7481fe71dcc8976f658e52ad759ca0faf9205d50b1bec7a28575c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://716e5b24fb81d3b6e884f43a9299fbe4397d133daa4727c659c633416c14bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:06Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.142446 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:06Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.186358 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:06Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.220462 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.220531 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.220553 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.220580 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.220597 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:06Z","lastTransitionTime":"2025-12-02T22:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.225945 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:06Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.262244 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vhm2r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ec8e3f725a9f00d1ee57d57d56f537077b5004dbd9314aae3ec85e3a09287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv57c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vhm2r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:06Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.323912 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.323945 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.323955 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.323971 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.323982 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:06Z","lastTransitionTime":"2025-12-02T22:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.427348 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.427393 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.427404 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.427426 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.427438 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:06Z","lastTransitionTime":"2025-12-02T22:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.531227 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.531292 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.531309 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.531336 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.531354 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:06Z","lastTransitionTime":"2025-12-02T22:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.611410 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.611492 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.611440 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:06 crc kubenswrapper[4903]: E1202 22:58:06.611698 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:58:06 crc kubenswrapper[4903]: E1202 22:58:06.611937 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:58:06 crc kubenswrapper[4903]: E1202 22:58:06.612168 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.633857 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.633901 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.633912 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.633930 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.633943 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:06Z","lastTransitionTime":"2025-12-02T22:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.736717 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.736777 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.736794 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.736819 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.736837 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:06Z","lastTransitionTime":"2025-12-02T22:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.839820 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.839866 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.839877 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.839894 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.839906 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:06Z","lastTransitionTime":"2025-12-02T22:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.852583 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" event={"ID":"99ab90b8-4bb9-418c-8b55-19c4c10edec7","Type":"ContainerStarted","Data":"e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f"} Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.861495 4903 generic.go:334] "Generic (PLEG): container finished" podID="6b5d599c-d246-4f24-93ea-ace730325f84" containerID="fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e" exitCode=0 Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.861541 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" event={"ID":"6b5d599c-d246-4f24-93ea-ace730325f84","Type":"ContainerDied","Data":"fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e"} Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.880870 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:06Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.898715 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:06Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.914275 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f98df328fe250afd95449707372ef1aea43e355ee20edf8366f510a888b9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:06Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.934853 4903 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:06Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.942849 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.942896 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.942912 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.942940 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.942957 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:06Z","lastTransitionTime":"2025-12-02T22:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.957104 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:06Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:06 crc kubenswrapper[4903]: I1202 22:58:06.979739 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a8643ea97f6edef5a93aa0369656bd7d630d555bd3a7c891b71305353faeff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:06Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.005346 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.023952 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.043507 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.044971 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.045008 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.045043 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.045064 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.045075 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:07Z","lastTransitionTime":"2025-12-02T22:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.059486 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.078433 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.109482 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz9ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:07Z 
is after 2025-08-24T17:21:41Z" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.128465 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"823a3cb9-fe97-46de-8791-375103e2ec0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242c60254f2b4a3c9e849384d813e56c75cf2f60705e0b1e7ca2707663b3f029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7dac854f7481fe71dcc8976f658e52ad759ca0faf9205d50b1bec7a28575c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://716e5b24fb81d3b6e884f43a9299fbe4397d133daa4727c659c633416c14bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.142526 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vhm2r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ec8e3f725a9f00d1ee57d57d56f537077b5004dbd9314aae3ec85e3a09287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv57c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vhm2r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.147533 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.147592 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.147610 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.147641 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.147693 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:07Z","lastTransitionTime":"2025-12-02T22:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.250946 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.250999 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.251015 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.251038 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.251054 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:07Z","lastTransitionTime":"2025-12-02T22:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.354335 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.354397 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.354414 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.354443 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.354459 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:07Z","lastTransitionTime":"2025-12-02T22:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.457459 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.457514 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.457526 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.457542 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.457553 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:07Z","lastTransitionTime":"2025-12-02T22:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.560125 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.560182 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.560195 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.560217 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.560233 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:07Z","lastTransitionTime":"2025-12-02T22:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.663700 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.663753 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.663770 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.663794 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.663812 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:07Z","lastTransitionTime":"2025-12-02T22:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.767443 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.767923 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.767942 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.767968 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.767989 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:07Z","lastTransitionTime":"2025-12-02T22:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.870137 4903 generic.go:334] "Generic (PLEG): container finished" podID="6b5d599c-d246-4f24-93ea-ace730325f84" containerID="17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf" exitCode=0 Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.870238 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" event={"ID":"6b5d599c-d246-4f24-93ea-ace730325f84","Type":"ContainerDied","Data":"17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf"} Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.870298 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.870345 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.870370 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.870400 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.870420 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:07Z","lastTransitionTime":"2025-12-02T22:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.893442 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.915158 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.935371 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.956250 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f98df328fe250afd95449707372ef1aea43e355ee20edf8366f510a888b9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.976387 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.976437 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.976453 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.976478 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.976506 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:07Z","lastTransitionTime":"2025-12-02T22:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.979566 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:07 crc kubenswrapper[4903]: I1202 22:58:07.997098 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a8643ea97f6edef5a93aa0369656bd7d630d555bd3a7c891b71305353faeff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.018578 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.038696 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\
"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5
d57f33fc9fb4d3b92d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.072385 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz9ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.078690 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.078729 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.078742 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.078763 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.078777 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:08Z","lastTransitionTime":"2025-12-02T22:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.089592 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"823a3cb9-fe97-46de-8791-375103e2ec0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242c60254f2b4a3c9e849384d813e56c75cf2f60705e0b1e7ca2707663b3f029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7dac854f7481fe71dcc8976f658e52ad759ca0faf9205d50b1bec7a28575c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://716e5b24fb81d3b6e884f43a9299fbe4397d133daa4727c659c633416c14bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.110768 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.126796 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.143538 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.155584 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vhm2r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ec8e3f725a9f00d1ee57d57d56f537077b5004dbd9314aae3ec85e3a09287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv57c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vhm2r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.181894 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.181935 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.181947 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.181965 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.181976 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:08Z","lastTransitionTime":"2025-12-02T22:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.300967 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.301019 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.301030 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.301048 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.301060 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:08Z","lastTransitionTime":"2025-12-02T22:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.348307 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:58:08 crc kubenswrapper[4903]: E1202 22:58:08.348426 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:58:16.348408027 +0000 UTC m=+35.056962320 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.348530 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:08 crc kubenswrapper[4903]: E1202 22:58:08.348631 4903 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 22:58:08 crc kubenswrapper[4903]: E1202 22:58:08.348700 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-02 22:58:16.348687533 +0000 UTC m=+35.057241836 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 22:58:08 crc kubenswrapper[4903]: E1202 22:58:08.348927 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 22:58:08 crc kubenswrapper[4903]: E1202 22:58:08.349043 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 22:58:08 crc kubenswrapper[4903]: E1202 22:58:08.349102 4903 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.348558 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.349188 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:58:08 crc kubenswrapper[4903]: E1202 22:58:08.349208 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:16.349188426 +0000 UTC m=+35.057742749 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.349242 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:08 crc kubenswrapper[4903]: E1202 22:58:08.349326 4903 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 22:58:08 crc kubenswrapper[4903]: E1202 22:58:08.349388 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:16.3493559 +0000 UTC m=+35.057910193 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 22:58:08 crc kubenswrapper[4903]: E1202 22:58:08.349443 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 22:58:08 crc kubenswrapper[4903]: E1202 22:58:08.349489 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 22:58:08 crc kubenswrapper[4903]: E1202 22:58:08.349515 4903 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:58:08 crc kubenswrapper[4903]: E1202 22:58:08.349618 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:16.349593366 +0000 UTC m=+35.058147699 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.403994 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.404061 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.404088 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.404122 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.404145 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:08Z","lastTransitionTime":"2025-12-02T22:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.507525 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.507590 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.507614 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.507646 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.507699 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:08Z","lastTransitionTime":"2025-12-02T22:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.610838 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.610894 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.610912 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.610935 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.610956 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:08Z","lastTransitionTime":"2025-12-02T22:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.611321 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:58:08 crc kubenswrapper[4903]: E1202 22:58:08.611462 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.611537 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.611550 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:58:08 crc kubenswrapper[4903]: E1202 22:58:08.611781 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:58:08 crc kubenswrapper[4903]: E1202 22:58:08.611885 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.714560 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.714618 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.714639 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.714723 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.714747 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:08Z","lastTransitionTime":"2025-12-02T22:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.817446 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.817535 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.817587 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.817614 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.817632 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:08Z","lastTransitionTime":"2025-12-02T22:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.882352 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" event={"ID":"99ab90b8-4bb9-418c-8b55-19c4c10edec7","Type":"ContainerStarted","Data":"d3df78f2baa0b295a61e6ca8dad2e257292caf09f1fdd90fc80f0647e9900aeb"} Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.882747 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.882772 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.890520 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" event={"ID":"6b5d599c-d246-4f24-93ea-ace730325f84","Type":"ContainerStarted","Data":"629037fd709058c15b3b4a4560aaecb34e6368ab243dcf573001b483a3033ce2"} Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.904997 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.923835 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.925113 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.925177 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.925200 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.925229 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.925252 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:08Z","lastTransitionTime":"2025-12-02T22:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.928826 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.945210 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.963063 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f98df328fe250afd95449707372ef1aea43e355ee20edf8366f510a888b9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:08 crc kubenswrapper[4903]: I1202 22:58:08.988842 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.010149 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a8643ea97f6edef5a93aa0369656bd7d630d555bd3a7c891b71305353faeff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.028947 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.028989 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.029005 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.029028 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.029044 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:09Z","lastTransitionTime":"2025-12-02T22:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.035036 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.057571 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.077848 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.099474 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\
"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5
d57f33fc9fb4d3b92d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.132486 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3df78f2baa0b295a61e6ca8dad2e257292caf09f1fdd90fc80f0647e9900aeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib
-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz9ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.133132 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.133154 4903 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.133163 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.133180 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.133191 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:09Z","lastTransitionTime":"2025-12-02T22:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.153002 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"823a3cb9-fe97-46de-8791-375103e2ec0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242c60254f2b4a3c9e849384d813e56c75cf2f60705e0b1e7ca2707663b3f029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/s
tatic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7dac854f7481fe71dcc8976f658e52ad759ca0faf9205d50b1bec7a28575c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://716e5b24fb81d3b6e884f43a9299fbe4397d133daa4727c659c633416c14bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.169975 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.185534 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vhm2r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ec8e3f725a9f00d1ee57d57d56f537077b5004dbd9314aae3ec85e3a09287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv57c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vhm2r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.208525 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.227006 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a8643ea97f6edef5a93aa0369656bd7d630d555bd3a7c891b71305353faeff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.236116 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.236147 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.236156 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.236171 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.236182 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:09Z","lastTransitionTime":"2025-12-02T22:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.239561 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.257209 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.282068 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629037fd709058c15b3b4a4560aaecb34e6368ab243dcf573001b483a3033ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-12-02T22:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T22:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.313626 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3df78f2baa0b295a61e6ca8dad2e257292caf09f1fdd90fc80f0647e9900aeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath
\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz9ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.332779 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"823a3cb9-fe97-46de-8791-375103e2ec0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242c60254f2b4a3c9e849384d813e56c75cf2f60705e0b1e7ca2707663b3f029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7dac854f7481fe71dcc8976f658e52ad759ca0faf9205d50b1bec7a28575c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://716e5b24fb81d3b6e884f43a9299fbe4397d133daa4727c659c633416c14bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.338719 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.338770 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.338786 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.338809 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.338828 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:09Z","lastTransitionTime":"2025-12-02T22:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.354543 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.375179 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.394538 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vhm2r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ec8e3f725a9f00d1ee57d57d56f537077b5004dbd9314aae3ec85e3a09287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv57c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vhm2r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.413538 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.433930 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.441869 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.441934 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.441950 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.441976 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.441993 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:09Z","lastTransitionTime":"2025-12-02T22:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.455268 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.473056 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f98df328fe250afd95449707372ef1aea43e355ee20edf8366f510a888b9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.545003 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.545063 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.545080 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.545151 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.545180 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:09Z","lastTransitionTime":"2025-12-02T22:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.649257 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.649337 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.649356 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.649380 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.649397 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:09Z","lastTransitionTime":"2025-12-02T22:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.752146 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.752211 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.752227 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.752253 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.752270 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:09Z","lastTransitionTime":"2025-12-02T22:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.854926 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.854980 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.854996 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.855021 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.855041 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:09Z","lastTransitionTime":"2025-12-02T22:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.895073 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.930115 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.947451 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.957839 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.957912 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.957937 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.957969 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.957994 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:09Z","lastTransitionTime":"2025-12-02T22:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.966803 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:09 crc kubenswrapper[4903]: I1202 22:58:09.983017 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f98df328fe250afd95449707372ef1aea43e355ee20edf8366f510a888b9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.003739 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:10Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.026726 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:10Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.045375 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a8643ea97f6edef5a93aa0369656bd7d630d555bd3a7c891b71305353faeff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:10Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.061250 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.061357 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.061375 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.061399 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.061417 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:10Z","lastTransitionTime":"2025-12-02T22:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.067602 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:10Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.087909 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:10Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.112712 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:10Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.133184 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:10Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.155200 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629037fd709058c15b3b4a4560aaecb34e6368ab243dcf573001b483a3033ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-12-02T22:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T22:58:10Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.165022 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.165075 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.165095 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.165122 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.165140 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:10Z","lastTransitionTime":"2025-12-02T22:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.190509 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3df78f2baa0b295a61e6ca8dad2e257292caf09
f1fdd90fc80f0647e9900aeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz9ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:10Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.212613 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"823a3cb9-fe97-46de-8791-375103e2ec0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242c60254f2b4a3c9e849384d813e56c75cf2f60705e0b1e7ca2707663b3f029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7dac854f7481fe71dcc8976f658e52ad759ca0faf9205d50b1bec7a28575c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://716e5b24fb81d3b6e884f43a9299fbe4397d133daa4727c659c633416c14bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:10Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.231831 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vhm2r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ec8e3f725a9f00d1ee57d57d56f537077b5004dbd9314aae3ec85e3a09287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv57c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vhm2r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:10Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.268434 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.268502 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.268520 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.268545 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.268562 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:10Z","lastTransitionTime":"2025-12-02T22:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.372441 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.372488 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.372499 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.372520 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.372534 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:10Z","lastTransitionTime":"2025-12-02T22:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.474887 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.474935 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.474943 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.474962 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.474972 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:10Z","lastTransitionTime":"2025-12-02T22:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.578779 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.578831 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.578843 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.578869 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.578883 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:10Z","lastTransitionTime":"2025-12-02T22:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.612085 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.612131 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.612193 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:10 crc kubenswrapper[4903]: E1202 22:58:10.612255 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:58:10 crc kubenswrapper[4903]: E1202 22:58:10.612356 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:58:10 crc kubenswrapper[4903]: E1202 22:58:10.612464 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.682947 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.683005 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.683018 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.683037 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.683049 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:10Z","lastTransitionTime":"2025-12-02T22:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.785611 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.785681 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.785698 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.785717 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.785728 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:10Z","lastTransitionTime":"2025-12-02T22:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.888024 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.888064 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.888074 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.888089 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.888101 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:10Z","lastTransitionTime":"2025-12-02T22:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.990458 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.990506 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.990517 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.990532 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:10 crc kubenswrapper[4903]: I1202 22:58:10.990543 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:10Z","lastTransitionTime":"2025-12-02T22:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.094045 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.094114 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.094136 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.094164 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.094187 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:11Z","lastTransitionTime":"2025-12-02T22:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.197266 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.197332 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.197358 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.197389 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.197413 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:11Z","lastTransitionTime":"2025-12-02T22:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.301577 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.301634 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.301676 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.301698 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.301716 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:11Z","lastTransitionTime":"2025-12-02T22:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.404964 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.405023 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.405041 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.405068 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.405086 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:11Z","lastTransitionTime":"2025-12-02T22:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.509324 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.509384 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.509401 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.509424 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.509442 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:11Z","lastTransitionTime":"2025-12-02T22:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.612722 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.612803 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.612826 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.612858 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.612881 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:11Z","lastTransitionTime":"2025-12-02T22:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.633840 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.653081 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f98df328fe250afd95449707372ef1aea43e355ee20edf8366f510a888b9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.675544 4903 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.698942 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a8643ea97f6edef5a93aa0369656bd7d630d555bd3a7c891b71305353faeff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.716298 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.716345 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.716356 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.716374 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.716386 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:11Z","lastTransitionTime":"2025-12-02T22:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.727971 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3df78f2baa0b295a61e6ca8dad2e257292caf09f1fdd90fc80f0647e9900aeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\
"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz9ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.746958 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"823a3cb9-fe97-46de-8791-375103e2ec0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242c60254f2b4a3c9e849384d813e56c75cf2f60705e0b1e7ca2707663b3f029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7dac854f7481fe71dcc8976f658e52ad759ca0faf9205d50b1bec7a28575c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://716e5b24fb81d3b6e884f43a9299fbe4397d133daa4727c659c633416c14bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.762940 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.780493 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.797352 4903 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.816389 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.819148 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.819243 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.819300 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.819399 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.819457 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:11Z","lastTransitionTime":"2025-12-02T22:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.836953 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629037fd709058c15b3b4a4560aaecb34e6368ab243dcf573001b483a3033ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSta
tuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:
58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.851767 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vhm2r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ec8e3f725a9f00d1ee57d57d56f537077b5004dbd9314aae3ec85e3a09287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv57c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\
"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vhm2r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.872085 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.889707 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.922813 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.922870 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.922886 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.922910 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:11 crc kubenswrapper[4903]: I1202 22:58:11.922927 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:11Z","lastTransitionTime":"2025-12-02T22:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.025549 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.025607 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.025624 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.025677 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.025696 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:12Z","lastTransitionTime":"2025-12-02T22:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.128169 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.128248 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.128265 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.128291 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.128311 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:12Z","lastTransitionTime":"2025-12-02T22:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.231717 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.232157 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.232365 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.232545 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.232922 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:12Z","lastTransitionTime":"2025-12-02T22:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.335909 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.335967 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.335983 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.336007 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.336025 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:12Z","lastTransitionTime":"2025-12-02T22:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.439320 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.439383 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.439401 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.439437 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.439478 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:12Z","lastTransitionTime":"2025-12-02T22:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.543810 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.543907 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.543927 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.543992 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.544011 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:12Z","lastTransitionTime":"2025-12-02T22:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.612333 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.612352 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.612468 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:58:12 crc kubenswrapper[4903]: E1202 22:58:12.612702 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:58:12 crc kubenswrapper[4903]: E1202 22:58:12.613290 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:58:12 crc kubenswrapper[4903]: E1202 22:58:12.613399 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.646804 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.646866 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.646883 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.646910 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.646928 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:12Z","lastTransitionTime":"2025-12-02T22:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.750313 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.750379 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.750396 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.750422 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.750440 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:12Z","lastTransitionTime":"2025-12-02T22:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.853168 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.853232 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.853252 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.853280 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.853299 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:12Z","lastTransitionTime":"2025-12-02T22:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.909167 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz9ff_99ab90b8-4bb9-418c-8b55-19c4c10edec7/ovnkube-controller/0.log" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.913723 4903 generic.go:334] "Generic (PLEG): container finished" podID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerID="d3df78f2baa0b295a61e6ca8dad2e257292caf09f1fdd90fc80f0647e9900aeb" exitCode=1 Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.913732 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" event={"ID":"99ab90b8-4bb9-418c-8b55-19c4c10edec7","Type":"ContainerDied","Data":"d3df78f2baa0b295a61e6ca8dad2e257292caf09f1fdd90fc80f0647e9900aeb"} Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.915026 4903 scope.go:117] "RemoveContainer" containerID="d3df78f2baa0b295a61e6ca8dad2e257292caf09f1fdd90fc80f0647e9900aeb" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.935131 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:12Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.957039 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:12Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.958067 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.958125 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.958147 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.958177 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.958199 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:12Z","lastTransitionTime":"2025-12-02T22:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:12 crc kubenswrapper[4903]: I1202 22:58:12.982458 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:12Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.002427 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f98df328fe250afd95449707372ef1aea43e355ee20edf8366f510a888b9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:12Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.022956 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.036603 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a8643ea97f6edef5a93aa0369656bd7d630d555bd3a7c891b71305353faeff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.053379 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629037fd709058c15b3b4a4560aaecb34e6368ab243dcf573001b483a3033ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.061220 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.061259 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:13 crc 
kubenswrapper[4903]: I1202 22:58:13.061270 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.061287 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.061300 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:13Z","lastTransitionTime":"2025-12-02T22:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.071156 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3df78f2baa0b295a61e6ca8dad2e257292caf09
f1fdd90fc80f0647e9900aeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3df78f2baa0b295a61e6ca8dad2e257292caf09f1fdd90fc80f0647e9900aeb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:58:12Z\\\",\\\"message\\\":\\\"mers/factory.go:160\\\\nI1202 22:58:11.491947 6205 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:58:11.492037 6205 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:58:11.492109 6205 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:58:11.492813 6205 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 22:58:11.492870 6205 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 22:58:11.492883 6205 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 22:58:11.492907 6205 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 22:58:11.492918 6205 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:58:11.492960 6205 factory.go:656] Stopping watch factory\\\\nI1202 22:58:11.492996 6205 ovnkube.go:599] Stopped ovnkube\\\\nI1202 22:58:11.492990 6205 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 22:58:11.493028 6205 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:58:11.493032 6205 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 22:58:11.493052 6205 handler.go:208] Removed *v1.Namespace event handler 
5\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz9ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.103933 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"823a3cb9-fe97-46de-8791-375103e2ec0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242c60254f2b4a3c9e849384d813e56c75cf2f60705e0b1e7ca2707663b3f029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472024
3b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7dac854f7481fe71dcc8976f658e52ad759ca0faf9205d50b1bec7a28575c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://716e5b24fb81d3b6e884f43a9299fbe4397d133daa4727c659c633416c14bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.126026 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.150675 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.162866 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.163441 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.163488 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.163503 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.163524 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.163539 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:13Z","lastTransitionTime":"2025-12-02T22:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.174753 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.182729 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vhm2r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ec8e3f725a9f00d1ee57d57d56f537077b5004dbd9314aae3ec85e3a09287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv57c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vhm2r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.266545 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.266603 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.266619 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.266687 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.266714 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:13Z","lastTransitionTime":"2025-12-02T22:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.369439 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.369489 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.369509 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.369540 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.369561 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:13Z","lastTransitionTime":"2025-12-02T22:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.473203 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.473272 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.473289 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.473317 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.473335 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:13Z","lastTransitionTime":"2025-12-02T22:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.576999 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.577057 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.577073 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.577129 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.577148 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:13Z","lastTransitionTime":"2025-12-02T22:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.645722 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr"] Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.646591 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.650153 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.650184 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.666839 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.680384 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.680441 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.680458 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.680485 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.680503 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:13Z","lastTransitionTime":"2025-12-02T22:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.687993 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.703544 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f6f86958-173b-4746-8493-f8fe5f70a897-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-xqdfr\" (UID: \"f6f86958-173b-4746-8493-f8fe5f70a897\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.703710 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f6f86958-173b-4746-8493-f8fe5f70a897-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-xqdfr\" (UID: \"f6f86958-173b-4746-8493-f8fe5f70a897\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" Dec 02 
22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.703753 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f6f86958-173b-4746-8493-f8fe5f70a897-env-overrides\") pod \"ovnkube-control-plane-749d76644c-xqdfr\" (UID: \"f6f86958-173b-4746-8493-f8fe5f70a897\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.703789 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5s5t\" (UniqueName: \"kubernetes.io/projected/f6f86958-173b-4746-8493-f8fe5f70a897-kube-api-access-g5s5t\") pod \"ovnkube-control-plane-749d76644c-xqdfr\" (UID: \"f6f86958-173b-4746-8493-f8fe5f70a897\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.706553 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f98df328fe250afd95449707372ef1aea43e355ee20edf8366f510a888b9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.724921 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6f86958-173b-4746-8493-f8fe5f70a897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xqdfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.746194 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.768831 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.783022 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.783075 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.783092 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.783117 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.783133 4903 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:13Z","lastTransitionTime":"2025-12-02T22:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.789936 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a8643ea97f6edef5a93aa0369656bd7d630d555bd3a7c891b71305353faeff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.805091 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f6f86958-173b-4746-8493-f8fe5f70a897-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-xqdfr\" (UID: \"f6f86958-173b-4746-8493-f8fe5f70a897\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.805149 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f6f86958-173b-4746-8493-f8fe5f70a897-env-overrides\") pod \"ovnkube-control-plane-749d76644c-xqdfr\" (UID: \"f6f86958-173b-4746-8493-f8fe5f70a897\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.805251 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5s5t\" (UniqueName: \"kubernetes.io/projected/f6f86958-173b-4746-8493-f8fe5f70a897-kube-api-access-g5s5t\") pod \"ovnkube-control-plane-749d76644c-xqdfr\" (UID: \"f6f86958-173b-4746-8493-f8fe5f70a897\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.805344 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f6f86958-173b-4746-8493-f8fe5f70a897-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-xqdfr\" (UID: \"f6f86958-173b-4746-8493-f8fe5f70a897\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.806308 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f6f86958-173b-4746-8493-f8fe5f70a897-env-overrides\") pod \"ovnkube-control-plane-749d76644c-xqdfr\" (UID: \"f6f86958-173b-4746-8493-f8fe5f70a897\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.806576 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f6f86958-173b-4746-8493-f8fe5f70a897-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-xqdfr\" (UID: \"f6f86958-173b-4746-8493-f8fe5f70a897\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.808516 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.818376 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f6f86958-173b-4746-8493-f8fe5f70a897-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-xqdfr\" (UID: \"f6f86958-173b-4746-8493-f8fe5f70a897\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.830637 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.831240 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5s5t\" (UniqueName: \"kubernetes.io/projected/f6f86958-173b-4746-8493-f8fe5f70a897-kube-api-access-g5s5t\") pod \"ovnkube-control-plane-749d76644c-xqdfr\" (UID: \"f6f86958-173b-4746-8493-f8fe5f70a897\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 
22:58:13.850272 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.868946 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.886140 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.886175 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.886186 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.886204 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.886227 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:13Z","lastTransitionTime":"2025-12-02T22:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.893198 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629037fd709058c15b3b4a4560aaecb34e6368ab243dcf573001b483a3033ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSta
tuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:
58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.921550 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz9ff_99ab90b8-4bb9-418c-8b55-19c4c10edec7/ovnkube-controller/0.log" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.926068 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" event={"ID":"99ab90b8-4bb9-418c-8b55-19c4c10edec7","Type":"ContainerStarted","Data":"bd7d0b3f9b9d6e630d5d840adc10f068ec60ca8d91cf9774f1b39ab12af64042"} Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.926721 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.933109 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3df78f2baa0b295a61e6ca8dad2e257292caf09
f1fdd90fc80f0647e9900aeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3df78f2baa0b295a61e6ca8dad2e257292caf09f1fdd90fc80f0647e9900aeb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:58:12Z\\\",\\\"message\\\":\\\"mers/factory.go:160\\\\nI1202 22:58:11.491947 6205 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:58:11.492037 6205 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:58:11.492109 6205 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:58:11.492813 6205 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 22:58:11.492870 6205 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 22:58:11.492883 6205 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 22:58:11.492907 6205 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 22:58:11.492918 6205 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:58:11.492960 6205 factory.go:656] Stopping watch factory\\\\nI1202 22:58:11.492996 6205 ovnkube.go:599] Stopped ovnkube\\\\nI1202 22:58:11.492990 6205 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 22:58:11.493028 6205 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:58:11.493032 6205 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 22:58:11.493052 6205 handler.go:208] Removed *v1.Namespace event handler 
5\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz9ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.952272 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"823a3cb9-fe97-46de-8791-375103e2ec0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242c60254f2b4a3c9e849384d813e56c75cf2f60705e0b1e7ca2707663b3f029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472024
3b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7dac854f7481fe71dcc8976f658e52ad759ca0faf9205d50b1bec7a28575c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://716e5b24fb81d3b6e884f43a9299fbe4397d133daa4727c659c633416c14bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.967791 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.967792 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vhm2r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ec8e3f725a9f00d1ee57d57d56f537077b5004dbd9314aae3ec85e3a09287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv57c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vhm2r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.989216 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.989272 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.989290 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.989317 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 
22:58:13.989337 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:13Z","lastTransitionTime":"2025-12-02T22:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:13 crc kubenswrapper[4903]: W1202 22:58:13.990842 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6f86958_173b_4746_8493_f8fe5f70a897.slice/crio-fc89d8184a0317f1cde2aca0c9949fa808f8b0f8a97443e808d3b5124712b53c WatchSource:0}: Error finding container fc89d8184a0317f1cde2aca0c9949fa808f8b0f8a97443e808d3b5124712b53c: Status 404 returned error can't find the container with id fc89d8184a0317f1cde2aca0c9949fa808f8b0f8a97443e808d3b5124712b53c Dec 02 22:58:13 crc kubenswrapper[4903]: I1202 22:58:13.995273 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.015577 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f98df328fe250afd95449707372ef1aea43e355ee20edf8366f510a888b9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:14Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.040918 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6f86958-173b-4746-8493-f8fe5f70a897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xqdfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:14Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.071024 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:14Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.088144 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a8643ea97f6edef5a93aa0369656bd7d630d555bd3a7c891b71305353faeff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:14Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.092317 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.092368 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.092380 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.092401 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.092413 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:14Z","lastTransitionTime":"2025-12-02T22:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.105937 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:14Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.139678 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629037fd709058c15b3b4a4560aaecb34e6368ab243dcf573001b483a3033ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cn
ibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:14Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.179730 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"
containerID\\\":\\\"cri-o://ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd7d0b3f9b9d6e630d5d840adc10f068ec60ca8d91cf9774f1b39ab12af64042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3df78f2baa0b295a61e6ca8dad2e257292caf09f1fdd90fc80f0647e9900aeb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:58:12Z\\\",\\\"message\\\":\\\"mers/factory.go:160\\\\nI1202 22:58:11.491947 6205 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:58:11.492037 6205 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:58:11.492109 6205 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:58:11.492813 6205 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 22:58:11.492870 6205 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 22:58:11.492883 6205 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 22:58:11.492907 6205 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 22:58:11.492918 6205 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:58:11.492960 6205 factory.go:656] Stopping watch factory\\\\nI1202 22:58:11.492996 6205 ovnkube.go:599] Stopped ovnkube\\\\nI1202 22:58:11.492990 6205 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 22:58:11.493028 6205 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:58:11.493032 6205 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 22:58:11.493052 6205 handler.go:208] Removed *v1.Namespace event handler 
5\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz9ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:14Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.194424 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.194465 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.194473 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.194491 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.194500 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:14Z","lastTransitionTime":"2025-12-02T22:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.197934 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"823a3cb9-fe97-46de-8791-375103e2ec0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242c60254f2b4a3c9e849384d813e56c75cf2f60705e0b1e7ca2707663b3f029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7dac854f7481fe71dcc8976f658e52ad759ca0faf9205d50b1bec7a28575c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://716e5b24fb81d3b6e884f43a9299fbe4397d133daa4727c659c633416c14bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:14Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.209869 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:14Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.225373 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:14Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.240866 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:14Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.250385 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vhm2r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ec8e3f725a9f00d1ee57d57d56f537077b5004dbd9314aae3ec85e3a09287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv57c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vhm2r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:14Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.263700 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:14Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.275862 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:14Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.296885 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.296960 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.296980 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.297006 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.297024 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:14Z","lastTransitionTime":"2025-12-02T22:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.399057 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.399109 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.399121 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.399140 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.399152 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:14Z","lastTransitionTime":"2025-12-02T22:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.501743 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.501817 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.501839 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.501872 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.501895 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:14Z","lastTransitionTime":"2025-12-02T22:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.604107 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.604165 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.604187 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.604212 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.604229 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:14Z","lastTransitionTime":"2025-12-02T22:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.611757 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.611811 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.611765 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:58:14 crc kubenswrapper[4903]: E1202 22:58:14.611915 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:58:14 crc kubenswrapper[4903]: E1202 22:58:14.612037 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:58:14 crc kubenswrapper[4903]: E1202 22:58:14.612166 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.707278 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.707332 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.707354 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.707376 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.707395 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:14Z","lastTransitionTime":"2025-12-02T22:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.800905 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.800983 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.801048 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.801083 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.801105 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:14Z","lastTransitionTime":"2025-12-02T22:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:14 crc kubenswrapper[4903]: E1202 22:58:14.826886 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5eef24b0-10c3-4ee6-bf4d-784c2d2e5050\\\",\\\"systemUUID\\\":\\\"3787324b-4c61-413b-8321-e9e2f283e2ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:14Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.832758 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.832820 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.832837 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.832862 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.832880 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:14Z","lastTransitionTime":"2025-12-02T22:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:14 crc kubenswrapper[4903]: E1202 22:58:14.853175 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5eef24b0-10c3-4ee6-bf4d-784c2d2e5050\\\",\\\"systemUUID\\\":\\\"3787324b-4c61-413b-8321-e9e2f283e2ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:14Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.858501 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.858561 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
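Every "Error updating node status, will retry" entry above fails the same way: the kubelet's POST to the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 is rejected during the TLS handshake because the serving certificate expired on 2025-08-24T17:21:41Z, months before the node's current clock of 2025-12-02T22:58:14Z. The identical patch payload was retried at 22:58:14.853175, 22:58:14.886886, and 22:58:14.914795, each failing with the same certificate error, and again at 22:58:14.944248. A minimal Go sketch along these lines (a hypothetical diagnostic, not part of the cluster tooling; only the endpoint address and the error wording are taken from the log) can confirm the certificate's validity window directly from the node:

```go
// checkwebhookcert.go — hypothetical diagnostic sketch: dial the webhook
// endpoint seen in the errors above and print the serving certificate's
// validity window, to confirm the "certificate has expired" failure.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	// Skip chain verification on purpose: the goal is to inspect the
	// certificate the server presents, even though it is expired.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatalf("dial webhook endpoint: %v", err)
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	now := time.Now().UTC()
	fmt.Printf("subject:   %s\n", cert.Subject)
	fmt.Printf("notBefore: %s\n", cert.NotBefore.UTC().Format(time.RFC3339))
	fmt.Printf("notAfter:  %s\n", cert.NotAfter.UTC().Format(time.RFC3339))
	if now.After(cert.NotAfter) {
		// Mirrors the x509 error in the log: "current time ... is after ...".
		fmt.Printf("EXPIRED: current time %s is after %s\n",
			now.Format(time.RFC3339), cert.NotAfter.UTC().Format(time.RFC3339))
	}
}
```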
event="NodeHasNoDiskPressure" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.858584 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.858619 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.858642 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:14Z","lastTransitionTime":"2025-12-02T22:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:14 crc kubenswrapper[4903]: E1202 22:58:14.886886 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5eef24b0-10c3-4ee6-bf4d-784c2d2e5050\\\",\\\"systemUUID\\\":\\\"3787324b-4c61-413b-8321-e9e2f283e2ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:14Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.892413 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.892471 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.892489 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.892515 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.892533 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:14Z","lastTransitionTime":"2025-12-02T22:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:14 crc kubenswrapper[4903]: E1202 22:58:14.914795 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5eef24b0-10c3-4ee6-bf4d-784c2d2e5050\\\",\\\"systemUUID\\\":\\\"3787324b-4c61-413b-8321-e9e2f283e2ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:14Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.920135 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.920199 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
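The other condition keeping the node NotReady is independent of the webhook: the runtime reports NetworkReady=false because no CNI configuration file exists in /etc/kubernetes/cni/net.d/, which is also why the sandbox-less pods above (networking-console-plugin, network-check-source, network-check-target) fail to sync. The check the runtime is effectively making can be approximated by the sketch below; the directory comes from the log message, while the accepted file extensions and the rest are assumptions, not CRI-O's actual logic:

```go
// cnicheck.go — rough approximation of the readiness check implied by the
// "no CNI configuration file in /etc/kubernetes/cni/net.d/" message above.
// Treating .conf/.conflist/.json as CNI configs is an assumption.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	const confDir = "/etc/kubernetes/cni/net.d"
	entries, err := os.ReadDir(confDir)
	if err != nil {
		fmt.Printf("NetworkReady=false: cannot read %s: %v\n", confDir, err)
		return
	}
	var configs []string
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			configs = append(configs, e.Name())
		}
	}
	if len(configs) == 0 {
		fmt.Printf("NetworkReady=false: no CNI configuration file in %s\n", confDir)
		return
	}
	fmt.Printf("NetworkReady=true: found %v\n", configs)
}
```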
event="NodeHasNoDiskPressure" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.920218 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.920242 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.920260 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:14Z","lastTransitionTime":"2025-12-02T22:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.931639 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" event={"ID":"f6f86958-173b-4746-8493-f8fe5f70a897","Type":"ContainerStarted","Data":"fc89d8184a0317f1cde2aca0c9949fa808f8b0f8a97443e808d3b5124712b53c"} Dec 02 22:58:14 crc kubenswrapper[4903]: E1202 22:58:14.944248 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5eef24b0-10c3-4ee6-bf4d-784c2d2e5050\\\",\\\"systemUUID\\\":\\\"3787324b-4c61-413b-8321-e9e2f283e2ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:14Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:14 crc kubenswrapper[4903]: E1202 22:58:14.944587 4903 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.946512 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.946552 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.946569 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.946590 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:14 crc kubenswrapper[4903]: I1202 22:58:14.946608 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:14Z","lastTransitionTime":"2025-12-02T22:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.049854 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.049929 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.049954 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.049985 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.050009 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:15Z","lastTransitionTime":"2025-12-02T22:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.147018 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-8vx6p"] Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.147928 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:58:15 crc kubenswrapper[4903]: E1202 22:58:15.148063 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.153120 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.153188 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.153210 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.153240 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.153263 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:15Z","lastTransitionTime":"2025-12-02T22:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.170025 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.191293 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.213354 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629037fd709058c15b3b4a4560aaecb34e6368ab243dcf573001b483a3033ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e
5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\
"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.231079 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx478\" (UniqueName: \"kubernetes.io/projected/e7bdaec4-1392-4f87-ba0b-f53c76e47cf4-kube-api-access-zx478\") pod \"network-metrics-daemon-8vx6p\" (UID: \"e7bdaec4-1392-4f87-ba0b-f53c76e47cf4\") " pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.231172 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7bdaec4-1392-4f87-ba0b-f53c76e47cf4-metrics-certs\") pod \"network-metrics-daemon-8vx6p\" (UID: \"e7bdaec4-1392-4f87-ba0b-f53c76e47cf4\") " pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.256484 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.256526 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.256539 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.256558 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.256570 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:15Z","lastTransitionTime":"2025-12-02T22:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.258462 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd7d0b3f9b9d6e630d5d840adc10f068ec60ca8d91cf9774f1b39ab12af64042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3df78f2baa0b295a61e6ca8dad2e257292caf09f1fdd90fc80f0647e9900aeb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:58:12Z\\\",\\\"message\\\":\\\"mers/factory.go:160\\\\nI1202 22:58:11.491947 6205 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:58:11.492037 6205 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:58:11.492109 6205 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:58:11.492813 6205 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 22:58:11.492870 6205 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 22:58:11.492883 6205 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 22:58:11.492907 6205 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 22:58:11.492918 6205 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:58:11.492960 6205 factory.go:656] Stopping watch factory\\\\nI1202 22:58:11.492996 6205 ovnkube.go:599] Stopped ovnkube\\\\nI1202 22:58:11.492990 6205 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 22:58:11.493028 6205 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:58:11.493032 6205 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 22:58:11.493052 6205 handler.go:208] Removed *v1.Namespace event handler 
5\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz9ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.276094 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"823a3cb9-fe97-46de-8791-375103e2ec0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242c60254f2b4a3c9e849384d813e56c75cf2f60705e0b1e7ca2707663b3f029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7dac854f7481fe71dcc8976f658e52ad759ca0faf9205d50b1bec7a28575c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://716e5b24fb81d3b6e884f43a9299fbe4397d133daa4727c659c633416c14bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.301530 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.322057 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.332249 4903 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-zx478\" (UniqueName: \"kubernetes.io/projected/e7bdaec4-1392-4f87-ba0b-f53c76e47cf4-kube-api-access-zx478\") pod \"network-metrics-daemon-8vx6p\" (UID: \"e7bdaec4-1392-4f87-ba0b-f53c76e47cf4\") " pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.332368 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7bdaec4-1392-4f87-ba0b-f53c76e47cf4-metrics-certs\") pod \"network-metrics-daemon-8vx6p\" (UID: \"e7bdaec4-1392-4f87-ba0b-f53c76e47cf4\") " pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:58:15 crc kubenswrapper[4903]: E1202 22:58:15.332682 4903 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 22:58:15 crc kubenswrapper[4903]: E1202 22:58:15.332790 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7bdaec4-1392-4f87-ba0b-f53c76e47cf4-metrics-certs podName:e7bdaec4-1392-4f87-ba0b-f53c76e47cf4 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:15.832760662 +0000 UTC m=+34.541314985 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7bdaec4-1392-4f87-ba0b-f53c76e47cf4-metrics-certs") pod "network-metrics-daemon-8vx6p" (UID: "e7bdaec4-1392-4f87-ba0b-f53c76e47cf4") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.338951 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vhm2r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ec8e3f725a9f00d1ee57d57d56f537077b5004dbd9314aae3ec85e3a09287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-xv57c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vhm2r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.355888 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8vx6p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7bdaec4-1392-4f87-ba0b-f53c76e47cf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8vx6p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.360885 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.360929 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.360939 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.360958 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.360970 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:15Z","lastTransitionTime":"2025-12-02T22:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.362040 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx478\" (UniqueName: \"kubernetes.io/projected/e7bdaec4-1392-4f87-ba0b-f53c76e47cf4-kube-api-access-zx478\") pod \"network-metrics-daemon-8vx6p\" (UID: \"e7bdaec4-1392-4f87-ba0b-f53c76e47cf4\") " pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.376281 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.392053 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.408302 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.426347 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f98df328fe250afd95449707372ef1aea43e355ee20edf8366f510a888b9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.444847 4903 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6f86958-173b-4746-8493-f8fe5f70a897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xqdfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.464019 4903 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.464071 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.464087 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.464113 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.464130 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:15Z","lastTransitionTime":"2025-12-02T22:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.464413 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.490415 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a8643ea97f6edef5a93aa0369656bd7d630d555bd3a7c891b71305353faeff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.568098 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.568143 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.568154 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.568173 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.568186 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:15Z","lastTransitionTime":"2025-12-02T22:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.670755 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.670811 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.670828 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.670852 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.670871 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:15Z","lastTransitionTime":"2025-12-02T22:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.774316 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.774374 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.774395 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.774421 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.774439 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:15Z","lastTransitionTime":"2025-12-02T22:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.838494 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7bdaec4-1392-4f87-ba0b-f53c76e47cf4-metrics-certs\") pod \"network-metrics-daemon-8vx6p\" (UID: \"e7bdaec4-1392-4f87-ba0b-f53c76e47cf4\") " pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:58:15 crc kubenswrapper[4903]: E1202 22:58:15.838750 4903 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 22:58:15 crc kubenswrapper[4903]: E1202 22:58:15.839121 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7bdaec4-1392-4f87-ba0b-f53c76e47cf4-metrics-certs podName:e7bdaec4-1392-4f87-ba0b-f53c76e47cf4 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:16.839086463 +0000 UTC m=+35.547640786 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7bdaec4-1392-4f87-ba0b-f53c76e47cf4-metrics-certs") pod "network-metrics-daemon-8vx6p" (UID: "e7bdaec4-1392-4f87-ba0b-f53c76e47cf4") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.878496 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.878557 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.878581 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.878610 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.878632 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:15Z","lastTransitionTime":"2025-12-02T22:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.938279 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" event={"ID":"f6f86958-173b-4746-8493-f8fe5f70a897","Type":"ContainerStarted","Data":"e5db0a9a2093fb2426b4354d9b9ad46f0e83f73472c38ff37ee159dcaea383fb"} Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.938333 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" event={"ID":"f6f86958-173b-4746-8493-f8fe5f70a897","Type":"ContainerStarted","Data":"4eb794b37fcff5149e08f759cd8d4ad6b06be8e9a29cf0e42d04a963533a552a"} Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.941588 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz9ff_99ab90b8-4bb9-418c-8b55-19c4c10edec7/ovnkube-controller/1.log" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.942625 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz9ff_99ab90b8-4bb9-418c-8b55-19c4c10edec7/ovnkube-controller/0.log" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.947475 4903 generic.go:334] "Generic (PLEG): container finished" podID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerID="bd7d0b3f9b9d6e630d5d840adc10f068ec60ca8d91cf9774f1b39ab12af64042" exitCode=1 Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.947555 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" event={"ID":"99ab90b8-4bb9-418c-8b55-19c4c10edec7","Type":"ContainerDied","Data":"bd7d0b3f9b9d6e630d5d840adc10f068ec60ca8d91cf9774f1b39ab12af64042"} Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.947694 4903 scope.go:117] "RemoveContainer" containerID="d3df78f2baa0b295a61e6ca8dad2e257292caf09f1fdd90fc80f0647e9900aeb" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.948808 4903 scope.go:117] "RemoveContainer" 
containerID="bd7d0b3f9b9d6e630d5d840adc10f068ec60ca8d91cf9774f1b39ab12af64042" Dec 02 22:58:15 crc kubenswrapper[4903]: E1202 22:58:15.949102 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-pz9ff_openshift-ovn-kubernetes(99ab90b8-4bb9-418c-8b55-19c4c10edec7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.965199 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\
":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.982241 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.982333 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.982362 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.982398 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.982423 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:15Z","lastTransitionTime":"2025-12-02T22:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:15 crc kubenswrapper[4903]: I1202 22:58:15.990084 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629037fd709058c15b3b4a4560aaecb34e6368ab243dcf573001b483a3033ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.022376 4903 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd7d0b3f9b9d6e630d5d840adc10f068ec60ca8d91cf9774f1b39ab12af64042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3df78f2baa0b295a61e6ca8dad2e257292caf09f1fdd90fc80f0647e9900aeb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:58:12Z\\\",\\\"message\\\":\\\"mers/factory.go:160\\\\nI1202 22:58:11.491947 6205 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:58:11.492037 6205 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:58:11.492109 6205 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:58:11.492813 6205 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 22:58:11.492870 6205 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 22:58:11.492883 6205 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 22:58:11.492907 6205 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 22:58:11.492918 6205 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:58:11.492960 6205 factory.go:656] Stopping watch factory\\\\nI1202 22:58:11.492996 6205 ovnkube.go:599] Stopped ovnkube\\\\nI1202 22:58:11.492990 6205 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 22:58:11.493028 6205 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:58:11.493032 6205 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 22:58:11.493052 6205 handler.go:208] Removed *v1.Namespace event handler 
5\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz9ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.044158 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"823a3cb9-fe97-46de-8791-375103e2ec0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242c60254f2b4a3c9e849384d813e56c75cf2f60705e0b1e7ca2707663b3f029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7dac854f7481fe71dcc8976f658e52ad759ca0faf9205d50b1bec7a28575c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://716e5b24fb81d3b6e884f43a9299fbe4397d133daa4727c659c633416c14bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.064137 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.087574 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.087625 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.087641 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.087718 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.087744 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:16Z","lastTransitionTime":"2025-12-02T22:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.087916 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa4
1ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.109100 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.124487 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vhm2r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ec8e3f725a9f00d1ee57d57d56f537077b5004dbd9314aae3ec85e3a09287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv57c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vhm2r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.143741 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8vx6p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7bdaec4-1392-4f87-ba0b-f53c76e47cf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8vx6p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.163359 4903 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.180475 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.190595 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.190687 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.190709 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.190739 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.190759 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:16Z","lastTransitionTime":"2025-12-02T22:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.200362 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.217532 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f98df328fe250afd95449707372ef1aea43e355ee20edf8366f510a888b9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.235267 4903 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6f86958-173b-4746-8493-f8fe5f70a897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eb794b37fcff5149e08f759cd8d4ad6b06be8e9a29cf0e42d04a963533a552a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5db0a9a2093fb2426b4354d9b9ad46f0e83f73472c38ff37ee159dcaea383fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xqdfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.262949 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.281100 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a8643ea97f6edef5a93aa0369656bd7d630d555bd3a7c891b71305353faeff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.294213 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.294280 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.294302 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.294333 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.294355 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:16Z","lastTransitionTime":"2025-12-02T22:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.296311 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.311369 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status 
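The setters.go:603 entry above is the other recurring failure: the node is marked NotReady because no CNI configuration file exists in /etc/kubernetes/cni/net.d/, so the container runtime reports NetworkReady=false. The check amounts to scanning that directory for config files; a stdlib-Go sketch of the same test follows, with the accepted extensions (.conf, .conflist, .json) assumed from the upstream libcni convention rather than read from this log.

    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    func main() {
        // The directory named in the NetworkPluginNotReady message above.
        dir := "/etc/kubernetes/cni/net.d"
        entries, err := os.ReadDir(dir)
        if err != nil {
            panic(err)
        }
        found := 0
        for _, e := range entries {
            switch filepath.Ext(e.Name()) {
            case ".conf", ".conflist", ".json":
                fmt.Println("CNI config found:", filepath.Join(dir, e.Name()))
                found++
            }
        }
        if found == 0 {
            fmt.Println("no CNI configuration files: node stays NotReady")
        }
    }

The directory presumably stays empty because the network plugin has not finished starting after the kubelet restart; the kubelet re-records the NodeNotReady condition on each sync until a config appears, which is why the same NodeHasSufficientMemory/NodeNotReady event block recurs below.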
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.330310 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.347400 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f98df328fe250afd95449707372ef1aea43e355ee20edf8366f510a888b9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.365137 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6f86958-173b-4746-8493-f8fe5f70a897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eb794b37fcff5149e08f759cd8d4ad6b06be8e9a29cf0e42d04a963533a552a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5db0a9a2093fb2426b4354d9b9ad46f0e83f73472c38ff37ee159dcaea383fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xqdfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.385220 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a8643ea97f6edef5a93aa0369656bd7d630d555bd3a7c891b71305353faeff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.396874 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.396904 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.396917 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.396946 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.396955 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:16Z","lastTransitionTime":"2025-12-02T22:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.407219 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.425558 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.443708 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.444191 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:58:16 crc kubenswrapper[4903]: E1202 22:58:16.444325 4903 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:58:32.444307773 +0000 UTC m=+51.152862056 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.444384 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.444427 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.444502 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:16 crc kubenswrapper[4903]: E1202 22:58:16.444538 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 22:58:16 crc kubenswrapper[4903]: E1202 22:58:16.444554 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 22:58:16 crc kubenswrapper[4903]: E1202 22:58:16.444565 4903 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:58:16 crc kubenswrapper[4903]: E1202 22:58:16.444595 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:32.44458821 +0000 UTC m=+51.153142493 (durationBeforeRetry 16s). 
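The nestedpendingoperations.go:348 entries show how the kubelet paces these retries: each failed volume operation is retried with exponential backoff, and the "durationBeforeRetry 16s" recorded here fits a doubling sequence starting at 500ms (0.5s, 1s, 2s, 4s, 8s, 16s). The sketch below illustrates that schedule; the 500ms initial delay and the cap are assumed from upstream kubelet defaults, since this log only evidences the 16s step. The underlying UnmountVolume failure is a registration problem: the kubevirt.io.hostpath-provisioner CSI driver has not (re)registered with this kubelet, so no CSI client exists to perform TearDownAt.

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Assumed parameters: initial delay 500ms, doubling per failure,
        // capped a little above two minutes (upstream kubelet defaults).
        delay := 500 * time.Millisecond
        maxDelay := 2*time.Minute + 2*time.Second
        for failure := 1; failure <= 9; failure++ {
            fmt.Printf("failure %d: wait %s before next retry\n", failure, delay)
            delay *= 2
            if delay > maxDelay {
                delay = maxDelay
            }
        }
    }

The MountVolume.SetUp failures that follow are paced the same way: the projected service-account volumes cannot be assembled until the kube-root-ca.crt and openshift-service-ca.crt ConfigMaps are registered in the kubelet's object cache, so each attempt fails immediately and is pushed out by another backoff interval.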
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.444557 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:58:16 crc kubenswrapper[4903]: E1202 22:58:16.444640 4903 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 22:58:16 crc kubenswrapper[4903]: E1202 22:58:16.444765 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 22:58:16 crc kubenswrapper[4903]: E1202 22:58:16.444690 4903 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 22:58:16 crc kubenswrapper[4903]: E1202 22:58:16.444810 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:32.444787075 +0000 UTC m=+51.153341388 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 22:58:16 crc kubenswrapper[4903]: E1202 22:58:16.444807 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 22:58:16 crc kubenswrapper[4903]: E1202 22:58:16.444843 4903 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:58:16 crc kubenswrapper[4903]: E1202 22:58:16.444876 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:32.444850947 +0000 UTC m=+51.153405270 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 22:58:16 crc kubenswrapper[4903]: E1202 22:58:16.444913 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:32.444889358 +0000 UTC m=+51.153443701 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.465063 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.481029 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.500136 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.500206 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.500223 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.500249 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.500269 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:16Z","lastTransitionTime":"2025-12-02T22:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.506340 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629037fd709058c15b3b4a4560aaecb34e6368ab243dcf573001b483a3033ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.541165 4903 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd7d0b3f9b9d6e630d5d840adc10f068ec60ca8d91cf9774f1b39ab12af64042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3df78f2baa0b295a61e6ca8dad2e257292caf09f1fdd90fc80f0647e9900aeb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:58:12Z\\\",\\\"message\\\":\\\"mers/factory.go:160\\\\nI1202 22:58:11.491947 6205 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:58:11.492037 6205 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:58:11.492109 6205 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:58:11.492813 6205 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 22:58:11.492870 6205 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 22:58:11.492883 6205 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 22:58:11.492907 6205 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 22:58:11.492918 6205 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:58:11.492960 6205 factory.go:656] Stopping watch factory\\\\nI1202 22:58:11.492996 6205 ovnkube.go:599] Stopped ovnkube\\\\nI1202 22:58:11.492990 6205 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 22:58:11.493028 6205 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:58:11.493032 6205 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 22:58:11.493052 6205 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd7d0b3f9b9d6e630d5d840adc10f068ec60ca8d91cf9774f1b39ab12af64042\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\" at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1202 22:58:14.558878 6343 
services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.139\\\\\\\", Port:17698, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1202 22:58:14.558941 6343 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz9ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.561550 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"823a3cb9-fe97-46de-8791-375103e2ec0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242c60254f2b4a3c9e849384d813e56c75cf2f60705e0b1e7ca2707663b3f029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7dac854f7481fe71dcc8976f658e52ad759ca0faf9205d50b1bec7a28575c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://716e5b24fb81d3b6e884f43a9299fbe4397d133daa4727c659c633416c14bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.579407 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8vx6p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7bdaec4-1392-4f87-ba0b-f53c76e47cf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8vx6p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.595340 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vhm2r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ec8e3f725a9f00d1ee57d57d56f537077b5004dbd9314aae3ec85e3a09287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv57c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vhm2r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.602564 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.602689 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.602705 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.602726 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.602739 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:16Z","lastTransitionTime":"2025-12-02T22:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.612166 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.612193 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:58:16 crc kubenswrapper[4903]: E1202 22:58:16.612281 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.612292 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.612319 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:58:16 crc kubenswrapper[4903]: E1202 22:58:16.612518 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:58:16 crc kubenswrapper[4903]: E1202 22:58:16.612611 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:58:16 crc kubenswrapper[4903]: E1202 22:58:16.612806 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.705971 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.706034 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.706056 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.706085 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.706107 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:16Z","lastTransitionTime":"2025-12-02T22:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.808479 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.808553 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.808576 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.808604 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.808626 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:16Z","lastTransitionTime":"2025-12-02T22:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.848853 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7bdaec4-1392-4f87-ba0b-f53c76e47cf4-metrics-certs\") pod \"network-metrics-daemon-8vx6p\" (UID: \"e7bdaec4-1392-4f87-ba0b-f53c76e47cf4\") " pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:58:16 crc kubenswrapper[4903]: E1202 22:58:16.849037 4903 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 22:58:16 crc kubenswrapper[4903]: E1202 22:58:16.849120 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7bdaec4-1392-4f87-ba0b-f53c76e47cf4-metrics-certs podName:e7bdaec4-1392-4f87-ba0b-f53c76e47cf4 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:18.849096489 +0000 UTC m=+37.557650802 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7bdaec4-1392-4f87-ba0b-f53c76e47cf4-metrics-certs") pod "network-metrics-daemon-8vx6p" (UID: "e7bdaec4-1392-4f87-ba0b-f53c76e47cf4") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.911722 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.911838 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.911856 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.911880 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.911898 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:16Z","lastTransitionTime":"2025-12-02T22:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:16 crc kubenswrapper[4903]: I1202 22:58:16.951347 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz9ff_99ab90b8-4bb9-418c-8b55-19c4c10edec7/ovnkube-controller/1.log" Dec 02 22:58:17 crc kubenswrapper[4903]: I1202 22:58:17.013933 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:17 crc kubenswrapper[4903]: I1202 22:58:17.013987 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:17 crc kubenswrapper[4903]: I1202 22:58:17.014004 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:17 crc kubenswrapper[4903]: I1202 22:58:17.014029 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:17 crc kubenswrapper[4903]: I1202 22:58:17.014046 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:17Z","lastTransitionTime":"2025-12-02T22:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:17 crc kubenswrapper[4903]: I1202 22:58:17.116092 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:17 crc kubenswrapper[4903]: I1202 22:58:17.116169 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:17 crc kubenswrapper[4903]: I1202 22:58:17.116183 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:17 crc kubenswrapper[4903]: I1202 22:58:17.116203 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:17 crc kubenswrapper[4903]: I1202 22:58:17.116218 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:17Z","lastTransitionTime":"2025-12-02T22:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
[... the same five-entry cycle (NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady, "Node became not ready") repeats at roughly 100 ms intervals from 22:58:17.219 through 22:58:18.561, differing only in timestamps ...]
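The block above has been condensed: the kubelet re-records the same node conditions on every status-manager sync, about every 100 ms here, for as long as the container runtime reports NetworkReady=false, so only the first cycle is shown. The resulting Ready=False condition is also visible from the API. A minimal client-go sketch for reading it back, assuming k8s.io/client-go is available and a kubeconfig at the default location points at this cluster (the node name "crc" is taken from the log):

package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumes ~/.kube/config points at this cluster.
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// Node name "crc" comes from the log lines above.
	node, err := cs.CoreV1().Nodes().Get(context.TODO(), "crc", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	// While the cycle above is running, Ready prints False with
	// reason KubeletNotReady and the same CNI message seen in the log.
	for _, c := range node.Status.Conditions {
		fmt.Printf("%-22s %-6s %s: %s\n", c.Type, c.Status, c.Reason, c.Message)
	}
}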
Dec 02 22:58:18 crc kubenswrapper[4903]: I1202 22:58:18.612131 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:18 crc kubenswrapper[4903]: I1202 22:58:18.612451 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:58:18 crc kubenswrapper[4903]: I1202 22:58:18.612527 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:58:18 crc kubenswrapper[4903]: E1202 22:58:18.612570 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:58:18 crc kubenswrapper[4903]: I1202 22:58:18.612640 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:58:18 crc kubenswrapper[4903]: E1202 22:58:18.612752 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:58:18 crc kubenswrapper[4903]: E1202 22:58:18.612756 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:58:18 crc kubenswrapper[4903]: E1202 22:58:18.612929 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4"
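Every pod sync above fails for the same reason the node is NotReady: there is nothing under /etc/kubernetes/cni/net.d/ yet. A quick way to confirm that on the node is to scan the conf dir the way libcni does. This is a sketch of that lookup, assuming the directory from the log message and libcni's default candidate extensions (.conf, .conflist, .json):

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// Directory and failure mode taken from the log message:
	// "no CNI configuration file in /etc/kubernetes/cni/net.d/".
	confDir := "/etc/kubernetes/cni/net.d"
	entries, err := os.ReadDir(confDir)
	if err != nil {
		fmt.Println("cannot read conf dir:", err)
		return
	}
	var configs []string
	for _, e := range entries {
		// libcni's default candidate extensions.
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			configs = append(configs, e.Name())
		}
	}
	if len(configs) == 0 {
		// The state this node is in: ovn-kubernetes has not yet
		// written its network config, so every pod sync is skipped.
		fmt.Println("no CNI configuration file in", confDir)
		return
	}
	fmt.Println("CNI configs:", configs)
}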
pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:58:18 crc kubenswrapper[4903]: I1202 22:58:18.664548 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:18 crc kubenswrapper[4903]: I1202 22:58:18.664612 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:18 crc kubenswrapper[4903]: I1202 22:58:18.664632 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:18 crc kubenswrapper[4903]: I1202 22:58:18.664689 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:18 crc kubenswrapper[4903]: I1202 22:58:18.664718 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:18Z","lastTransitionTime":"2025-12-02T22:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:18 crc kubenswrapper[4903]: I1202 22:58:18.767944 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:18 crc kubenswrapper[4903]: I1202 22:58:18.768034 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:18 crc kubenswrapper[4903]: I1202 22:58:18.768051 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:18 crc kubenswrapper[4903]: I1202 22:58:18.768106 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:18 crc kubenswrapper[4903]: I1202 22:58:18.768123 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:18Z","lastTransitionTime":"2025-12-02T22:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:18 crc kubenswrapper[4903]: I1202 22:58:18.871619 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:18 crc kubenswrapper[4903]: I1202 22:58:18.871776 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:18 crc kubenswrapper[4903]: I1202 22:58:18.871796 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:18 crc kubenswrapper[4903]: I1202 22:58:18.871821 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:18 crc kubenswrapper[4903]: I1202 22:58:18.871837 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:18Z","lastTransitionTime":"2025-12-02T22:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:18 crc kubenswrapper[4903]: I1202 22:58:18.872155 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7bdaec4-1392-4f87-ba0b-f53c76e47cf4-metrics-certs\") pod \"network-metrics-daemon-8vx6p\" (UID: \"e7bdaec4-1392-4f87-ba0b-f53c76e47cf4\") " pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:58:18 crc kubenswrapper[4903]: E1202 22:58:18.872913 4903 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 22:58:18 crc kubenswrapper[4903]: E1202 22:58:18.873045 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7bdaec4-1392-4f87-ba0b-f53c76e47cf4-metrics-certs podName:e7bdaec4-1392-4f87-ba0b-f53c76e47cf4 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:22.873004438 +0000 UTC m=+41.581558751 (durationBeforeRetry 4s). 
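The mount retry above is throttled: nestedpendingoperations refuses retries until 22:58:22.87, i.e. durationBeforeRetry 4s. The kubelet's volume operation executor backs off exponentially per volume, and a 4 s wait is consistent with roughly the fourth consecutive failure if the upstream defaults apply (500 ms initial delay, factor 2, cap a little over two minutes); those constants are an assumption here, not something this log states. A sketch of that schedule:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Assumed upstream defaults for the kubelet's per-volume
	// exponential backoff; the log itself only shows the 4s step.
	const (
		initialDelay = 500 * time.Millisecond
		factor       = 2
		maxDelay     = 2*time.Minute + 2*time.Second
	)
	delay := initialDelay
	for attempt := 1; attempt <= 10; attempt++ {
		fmt.Printf("failure %2d -> next retry in %v\n", attempt, delay)
		delay *= factor
		if delay > maxDelay {
			delay = maxDelay // retries continue, just capped
		}
	}
}

Under these assumptions the printed schedule runs 500ms, 1s, 2s, 4s, ..., which lines up with the 4 s wait recorded above.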
[... the status cycle continues at roughly 100 ms intervals from 22:58:18.974 through 22:58:19.595 ...]
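The elided spans can be verified rather than trusted: feeding the raw journal through a small counter shows how many times each node event was recorded and makes the ~100 ms cadence plain. A sketch, assuming the excerpt arrives on stdin (for example via journalctl --no-pager for the kubelet unit started at the top of this log):

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"strings"
)

func main() {
	eventRe := regexp.MustCompile(`event="([A-Za-z]+)"`)
	counts := map[string]int{}
	sc := bufio.NewScanner(os.Stdin)
	// Status-patch entries in this journal run to tens of kilobytes.
	sc.Buffer(make([]byte, 1<<20), 1<<20)
	for sc.Scan() {
		line := sc.Text()
		if !strings.Contains(line, "Recording event message for node") {
			continue
		}
		if m := eventRe.FindStringSubmatch(line); m != nil {
			counts[m[1]]++
		}
	}
	for event, n := range counts {
		fmt.Printf("%-25s %d\n", event, n)
	}
}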
Dec 02 22:58:19 crc kubenswrapper[4903]: I1202 22:58:19.678141 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 22:58:19 crc kubenswrapper[4903]: I1202 22:58:19.693037 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:19Z is after 2025-08-24T17:21:41Z"
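From here on, every status patch fails for a second, unrelated reason: the network-node-identity webhook at 127.0.0.1:9743 serves a certificate that expired on 2025-08-24, months before the node's current clock of 2025-12-02. This can be confirmed directly on the node by reading the serving certificate without verifying it; a minimal sketch (the address comes from the log, and InsecureSkipVerify is deliberate because verification is exactly what fails):

package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

func main() {
	// Address of the network-node-identity webhook, from the log.
	// InsecureSkipVerify is deliberate: we want to read the expired
	// certificate, not verify it.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		fmt.Println("dial:", err)
		return
	}
	defer conn.Close()
	cert := conn.ConnectionState().PeerCertificates[0]
	fmt.Println("subject:  ", cert.Subject)
	fmt.Println("notBefore:", cert.NotBefore.Format(time.RFC3339))
	fmt.Println("notAfter: ", cert.NotAfter.Format(time.RFC3339))
	if time.Now().After(cert.NotAfter) {
		// Matches the log: current time 2025-12-02T22:58:19Z is
		// after the certificate's notAfter of 2025-08-24T17:21:41Z.
		fmt.Println("certificate has expired")
	}
}

The node-resolver, network-check-source, machine-config-daemon, ovnkube and kube-apiserver patches that follow all fail against this same endpoint with the same x509 error.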
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:19Z","lastTransitionTime":"2025-12-02T22:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:19 crc kubenswrapper[4903]: I1202 22:58:19.704608 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:19Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:19 crc kubenswrapper[4903]: I1202 22:58:19.714877 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:19Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:19 crc kubenswrapper[4903]: I1202 22:58:19.724557 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f98df328fe250afd95449707372ef1aea43e355ee20edf8366f510a888b9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:19Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:19 crc kubenswrapper[4903]: I1202 22:58:19.734932 4903 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6f86958-173b-4746-8493-f8fe5f70a897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eb794b37fcff5149e08f759cd8d4ad6b06be8e9a29cf0e42d04a963533a552a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5db0a9a2093fb2426b4354d9b9ad46f0e83f73472c38ff37ee159dcaea383fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xqdfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:19Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:19 crc kubenswrapper[4903]: I1202 22:58:19.750043 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:19Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:19 crc kubenswrapper[4903]: I1202 22:58:19.762399 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a8643ea97f6edef5a93aa0369656bd7d630d555bd3a7c891b71305353faeff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:19Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:19 crc kubenswrapper[4903]: I1202 22:58:19.787413 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd7d0b3f9b9d6e630d5d840adc10f068ec60ca8d
91cf9774f1b39ab12af64042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3df78f2baa0b295a61e6ca8dad2e257292caf09f1fdd90fc80f0647e9900aeb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:58:12Z\\\",\\\"message\\\":\\\"mers/factory.go:160\\\\nI1202 22:58:11.491947 6205 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:58:11.492037 6205 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:58:11.492109 6205 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:58:11.492813 6205 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 22:58:11.492870 6205 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 22:58:11.492883 6205 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 22:58:11.492907 6205 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 22:58:11.492918 6205 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:58:11.492960 6205 factory.go:656] Stopping watch factory\\\\nI1202 22:58:11.492996 6205 ovnkube.go:599] Stopped ovnkube\\\\nI1202 22:58:11.492990 6205 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 22:58:11.493028 6205 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:58:11.493032 6205 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 22:58:11.493052 6205 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd7d0b3f9b9d6e630d5d840adc10f068ec60ca8d91cf9774f1b39ab12af64042\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\" at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1202 22:58:14.558878 6343 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, 
AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.139\\\\\\\", Port:17698, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1202 22:58:14.558941 6343 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"
name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz9ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:19Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:19 crc kubenswrapper[4903]: I1202 22:58:19.801460 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"823a3cb9-fe97-46de-8791-375103e2ec0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242c60254f2b4a3c9e849384d813e56c75cf2f60705e0b1e7ca2707663b3f029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7dac854f7481fe71dcc8976f658e52ad759ca0faf9205d50b1bec7a28575c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://716e5b24fb81d3b6e884f43a9299fbe4397d133daa4727c659c633416c14bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:19Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:19 crc kubenswrapper[4903]: I1202 22:58:19.804372 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:19 crc kubenswrapper[4903]: I1202 22:58:19.804436 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:19 crc kubenswrapper[4903]: I1202 22:58:19.804459 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:19 crc kubenswrapper[4903]: I1202 22:58:19.804490 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:19 crc kubenswrapper[4903]: I1202 22:58:19.804513 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:19Z","lastTransitionTime":"2025-12-02T22:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:19 crc kubenswrapper[4903]: I1202 22:58:19.818451 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:19Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:19 crc kubenswrapper[4903]: I1202 22:58:19.832994 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:19Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:19 crc kubenswrapper[4903]: I1202 22:58:19.847721 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:19Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:19 crc kubenswrapper[4903]: I1202 22:58:19.861207 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:19Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:19 crc kubenswrapper[4903]: I1202 22:58:19.881825 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629037fd709058c15b3b4a4560aaecb34e6368ab243dcf573001b483a3033ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-12-02T22:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T22:58:19Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:19 crc kubenswrapper[4903]: I1202 22:58:19.894440 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vhm2r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ec8e3f725a9f00d1ee57d57d56f537077b5004dbd9314aae3ec85e3a09287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv57c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vhm2r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:19Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:19 crc kubenswrapper[4903]: I1202 22:58:19.908146 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8vx6p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7bdaec4-1392-4f87-ba0b-f53c76e47cf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8vx6p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:19Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:19 crc kubenswrapper[4903]: I1202 22:58:19.908321 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:19 crc kubenswrapper[4903]: I1202 22:58:19.908354 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:19 crc kubenswrapper[4903]: I1202 22:58:19.908362 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 02 22:58:19 crc kubenswrapper[4903]: I1202 22:58:19.908378 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:19 crc kubenswrapper[4903]: I1202 22:58:19.908387 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:19Z","lastTransitionTime":"2025-12-02T22:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.011727 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.011791 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.011814 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.011843 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.011864 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:20Z","lastTransitionTime":"2025-12-02T22:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.114867 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.114943 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.114972 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.115006 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.115032 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:20Z","lastTransitionTime":"2025-12-02T22:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.217836 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.217892 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.217908 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.217933 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.217949 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:20Z","lastTransitionTime":"2025-12-02T22:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.320411 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.320469 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.320488 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.320511 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.320527 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:20Z","lastTransitionTime":"2025-12-02T22:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.424050 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.424115 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.424131 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.424157 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.424175 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:20Z","lastTransitionTime":"2025-12-02T22:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.526921 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.526986 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.527004 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.527029 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.527049 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:20Z","lastTransitionTime":"2025-12-02T22:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.611804 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.611929 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.611965 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.611840 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:58:20 crc kubenswrapper[4903]: E1202 22:58:20.612170 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:58:20 crc kubenswrapper[4903]: E1202 22:58:20.612325 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:58:20 crc kubenswrapper[4903]: E1202 22:58:20.612473 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:58:20 crc kubenswrapper[4903]: E1202 22:58:20.612689 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.631432 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.631486 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.631503 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.631532 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.631552 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:20Z","lastTransitionTime":"2025-12-02T22:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.735197 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.735264 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.735286 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.735314 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.735334 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:20Z","lastTransitionTime":"2025-12-02T22:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.838838 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.838900 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.838916 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.838942 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.838959 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:20Z","lastTransitionTime":"2025-12-02T22:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.942012 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.942146 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.942172 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.942236 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:20 crc kubenswrapper[4903]: I1202 22:58:20.942256 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:20Z","lastTransitionTime":"2025-12-02T22:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.045211 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.045278 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.045296 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.045320 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.045340 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:21Z","lastTransitionTime":"2025-12-02T22:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.148008 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.148056 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.148070 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.148091 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.148108 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:21Z","lastTransitionTime":"2025-12-02T22:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.251225 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.251274 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.251286 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.251305 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.251317 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:21Z","lastTransitionTime":"2025-12-02T22:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.354394 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.354454 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.354472 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.354497 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.354516 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:21Z","lastTransitionTime":"2025-12-02T22:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.457855 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.457916 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.457931 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.457952 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.457966 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:21Z","lastTransitionTime":"2025-12-02T22:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.560450 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.560520 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.560532 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.560550 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.560562 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:21Z","lastTransitionTime":"2025-12-02T22:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.635874 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:21Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.656759 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:21Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.663292 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.663357 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.663371 4903 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.663395 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.663407 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:21Z","lastTransitionTime":"2025-12-02T22:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.673219 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:21Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.697754 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:21Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.719596 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629037fd709058c15b3b4a4560aaecb34e6368ab243dcf573001b483a3033ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e
5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\
"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:21Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.758733 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd7d0b3f9b9d6e630d5d840adc10f068ec60ca8d
91cf9774f1b39ab12af64042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3df78f2baa0b295a61e6ca8dad2e257292caf09f1fdd90fc80f0647e9900aeb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:58:12Z\\\",\\\"message\\\":\\\"mers/factory.go:160\\\\nI1202 22:58:11.491947 6205 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:58:11.492037 6205 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:58:11.492109 6205 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:58:11.492813 6205 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 22:58:11.492870 6205 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 22:58:11.492883 6205 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 22:58:11.492907 6205 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 22:58:11.492918 6205 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:58:11.492960 6205 factory.go:656] Stopping watch factory\\\\nI1202 22:58:11.492996 6205 ovnkube.go:599] Stopped ovnkube\\\\nI1202 22:58:11.492990 6205 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 22:58:11.493028 6205 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:58:11.493032 6205 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 22:58:11.493052 6205 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd7d0b3f9b9d6e630d5d840adc10f068ec60ca8d91cf9774f1b39ab12af64042\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\" at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1202 22:58:14.558878 6343 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, 
AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.139\\\\\\\", Port:17698, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1202 22:58:14.558941 6343 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"
name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz9ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:21Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.765836 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.765888 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.765911 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.765944 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.765958 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:21Z","lastTransitionTime":"2025-12-02T22:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.776559 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"823a3cb9-fe97-46de-8791-375103e2ec0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242c60254f2b4a3c9e849384d813e56c75cf2f60705e0b1e7ca2707663b3f029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7dac854f7481fe71dcc8976f658e52ad759ca0faf9205d50b1bec7a28575c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://716e5b24fb81d3b6e884f43a9299fbe4397d133daa4727c659c633416c14bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:21Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.791750 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vhm2r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ec8e3f725a9f00d1ee57d57d56f537077b5004dbd9314aae3ec85e3a09287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv57c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vhm2r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:21Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.803141 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8vx6p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7bdaec4-1392-4f87-ba0b-f53c76e47cf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8vx6p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:21Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.814303 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:21Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.824330 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:21Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.839558 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f98df328fe250afd95449707372ef1aea43e355ee20edf8366f510a888b9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:21Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.854850 4903 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6f86958-173b-4746-8493-f8fe5f70a897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eb794b37fcff5149e08f759cd8d4ad6b06be8e9a29cf0e42d04a963533a552a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5db0a9a2093fb2426b4354d9b9ad46f0e83f73472c38ff37ee159dcaea383fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xqdfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:21Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.868024 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.868058 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.868070 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.868089 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.868100 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:21Z","lastTransitionTime":"2025-12-02T22:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.871822 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:21Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.892559 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 
1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:21Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.905447 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a8643ea97f6edef5a93aa0369656bd7d630d555bd3a7c891b71305353faeff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:21Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.972423 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.972482 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.972498 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.972525 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:21 crc kubenswrapper[4903]: I1202 22:58:21.972543 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:21Z","lastTransitionTime":"2025-12-02T22:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.075280 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.075418 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.075438 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.075467 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.075486 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:22Z","lastTransitionTime":"2025-12-02T22:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.178372 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.178428 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.178438 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.178467 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.178484 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:22Z","lastTransitionTime":"2025-12-02T22:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.281634 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.282559 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.282789 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.283001 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.283147 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:22Z","lastTransitionTime":"2025-12-02T22:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.385530 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.385897 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.386060 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.386207 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.386352 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:22Z","lastTransitionTime":"2025-12-02T22:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.490097 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.490191 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.490211 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.490242 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.490269 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:22Z","lastTransitionTime":"2025-12-02T22:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.593055 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.593120 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.593137 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.593162 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.593180 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:22Z","lastTransitionTime":"2025-12-02T22:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.611588 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.611687 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.611727 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.611823 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:58:22 crc kubenswrapper[4903]: E1202 22:58:22.612031 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:58:22 crc kubenswrapper[4903]: E1202 22:58:22.612208 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:58:22 crc kubenswrapper[4903]: E1202 22:58:22.612380 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:58:22 crc kubenswrapper[4903]: E1202 22:58:22.612738 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.696427 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.696487 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.696503 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.696528 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.696545 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:22Z","lastTransitionTime":"2025-12-02T22:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.799303 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.799377 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.799395 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.799421 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.799493 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:22Z","lastTransitionTime":"2025-12-02T22:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.903680 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.903738 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.903750 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.903783 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.903799 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:22Z","lastTransitionTime":"2025-12-02T22:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:22 crc kubenswrapper[4903]: I1202 22:58:22.918725 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7bdaec4-1392-4f87-ba0b-f53c76e47cf4-metrics-certs\") pod \"network-metrics-daemon-8vx6p\" (UID: \"e7bdaec4-1392-4f87-ba0b-f53c76e47cf4\") " pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:58:22 crc kubenswrapper[4903]: E1202 22:58:22.918914 4903 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 22:58:22 crc kubenswrapper[4903]: E1202 22:58:22.919307 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7bdaec4-1392-4f87-ba0b-f53c76e47cf4-metrics-certs podName:e7bdaec4-1392-4f87-ba0b-f53c76e47cf4 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:30.918974818 +0000 UTC m=+49.627529101 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7bdaec4-1392-4f87-ba0b-f53c76e47cf4-metrics-certs") pod "network-metrics-daemon-8vx6p" (UID: "e7bdaec4-1392-4f87-ba0b-f53c76e47cf4") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.007028 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.007096 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.007113 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.007141 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.007157 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:23Z","lastTransitionTime":"2025-12-02T22:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.110229 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.110308 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.110329 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.110359 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.110381 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:23Z","lastTransitionTime":"2025-12-02T22:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.214050 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.214110 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.214126 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.214152 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.214171 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:23Z","lastTransitionTime":"2025-12-02T22:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.316475 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.316532 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.316551 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.316603 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.316620 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:23Z","lastTransitionTime":"2025-12-02T22:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.419174 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.419242 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.419265 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.419290 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.419308 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:23Z","lastTransitionTime":"2025-12-02T22:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.522735 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.522809 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.522849 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.522880 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.522902 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:23Z","lastTransitionTime":"2025-12-02T22:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.625895 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.625957 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.625978 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.626006 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.626025 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:23Z","lastTransitionTime":"2025-12-02T22:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.728886 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.728951 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.728968 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.728994 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.729011 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:23Z","lastTransitionTime":"2025-12-02T22:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.832793 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.832849 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.832860 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.832895 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.832915 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:23Z","lastTransitionTime":"2025-12-02T22:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.937390 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.937448 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.937458 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.937479 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:23 crc kubenswrapper[4903]: I1202 22:58:23.937491 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:23Z","lastTransitionTime":"2025-12-02T22:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.041298 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.041905 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.042208 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.042509 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.042836 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:24Z","lastTransitionTime":"2025-12-02T22:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.146364 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.146431 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.146450 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.146477 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.146494 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:24Z","lastTransitionTime":"2025-12-02T22:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.249902 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.249981 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.250006 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.250036 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.250053 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:24Z","lastTransitionTime":"2025-12-02T22:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.353211 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.353304 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.353321 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.353346 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.353365 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:24Z","lastTransitionTime":"2025-12-02T22:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.456401 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.456465 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.456482 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.456541 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.456559 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:24Z","lastTransitionTime":"2025-12-02T22:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.559921 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.560009 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.560030 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.560071 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.560093 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:24Z","lastTransitionTime":"2025-12-02T22:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.612112 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.612272 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.612296 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:58:24 crc kubenswrapper[4903]: E1202 22:58:24.612519 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.612698 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:58:24 crc kubenswrapper[4903]: E1202 22:58:24.612808 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:58:24 crc kubenswrapper[4903]: E1202 22:58:24.612920 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:58:24 crc kubenswrapper[4903]: E1202 22:58:24.613116 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.663902 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.663986 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.664010 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.664044 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.664066 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:24Z","lastTransitionTime":"2025-12-02T22:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.767180 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.767241 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.767258 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.767292 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.767316 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:24Z","lastTransitionTime":"2025-12-02T22:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.870642 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.870797 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.870818 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.870844 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.870860 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:24Z","lastTransitionTime":"2025-12-02T22:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.973629 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.973727 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.973746 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.973773 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:24 crc kubenswrapper[4903]: I1202 22:58:24.973790 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:24Z","lastTransitionTime":"2025-12-02T22:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.030235 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.030292 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.030310 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.030336 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.030354 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:25Z","lastTransitionTime":"2025-12-02T22:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:25 crc kubenswrapper[4903]: E1202 22:58:25.051894 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5eef24b0-10c3-4ee6-bf4d-784c2d2e5050\\\",\\\"systemUUID\\\":\\\"3787324b-4c61-413b-8321-e9e2f283e2ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:25Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.057534 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.057610 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.057628 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.057680 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.057699 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:25Z","lastTransitionTime":"2025-12-02T22:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:25 crc kubenswrapper[4903]: E1202 22:58:25.076692 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5eef24b0-10c3-4ee6-bf4d-784c2d2e5050\\\",\\\"systemUUID\\\":\\\"3787324b-4c61-413b-8321-e9e2f283e2ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:25Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.082031 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.082103 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
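The failed node-status PATCH recorded above is not malformed; the API server rejects it because it cannot complete the call to the validating webhook node.network-node-identity.openshift.io at https://127.0.0.1:9743: the webhook's serving certificate expired on 2025-08-24T17:21:41Z, while the node clock reads 2025-12-02T22:58:25Z, a familiar symptom of starting a CRC VM long after its embedded certificates lapsed. The failing validity check can be reproduced in isolation; a minimal Go sketch, assuming a PEM copy of the webhook's serving certificate on disk (the file name is hypothetical):

```go
// The webhook rejection is a plain certificate-validity failure: the current
// time (2025-12-02T22:58:25Z) falls after the certificate's NotAfter
// (2025-08-24T17:21:41Z). Illustration only.
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

func main() {
	pemBytes, err := os.ReadFile("webhook-serving-cert.pem") // hypothetical path
	if err != nil {
		panic(err)
	}
	block, _ := pem.Decode(pemBytes)
	if block == nil {
		panic("no PEM block found")
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		panic(err)
	}
	now := time.Date(2025, 12, 2, 22, 58, 25, 0, time.UTC) // clock from the log
	if now.After(cert.NotAfter) {
		// Matches the logged error: "certificate has expired or is not yet valid"
		fmt.Printf("certificate expired: current time %s is after %s\n",
			now.Format(time.RFC3339), cert.NotAfter.Format(time.RFC3339))
	}
}
```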
event="NodeHasNoDiskPressure" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.082130 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.082167 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.082187 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:25Z","lastTransitionTime":"2025-12-02T22:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:25 crc kubenswrapper[4903]: E1202 22:58:25.100369 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5eef24b0-10c3-4ee6-bf4d-784c2d2e5050\\\",\\\"systemUUID\\\":\\\"3787324b-4c61-413b-8321-e9e2f283e2ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:25Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.105287 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.105344 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
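Each status sync attempts this PATCH several times before giving up: kubelet's update loop retries up to a fixed count (nodeStatusUpdateRetry, 5 by default), logging "Error updating node status, will retry" on every failure, which is why the same rejected payload reappears below within the same second. A simplified sketch of that loop's shape, an illustration rather than the actual kubelet source:

```go
// Simplified shape of kubelet's bounded retry loop around the node-status
// PATCH (the real code lives in kubelet's kubelet_node_status.go).
package main

import (
	"errors"
	"fmt"
)

const nodeStatusUpdateRetry = 5 // kubelet's default retry count

func tryUpdateNodeStatus(attempt int) error {
	// Stand-in for the PATCH that the API server rejects while the
	// node.network-node-identity.openshift.io webhook cert is expired.
	return errors.New("failed to call webhook: tls: failed to verify certificate")
}

func updateNodeStatus() error {
	for i := 0; i < nodeStatusUpdateRetry; i++ {
		if err := tryUpdateNodeStatus(i); err != nil {
			fmt.Println("Error updating node status, will retry:", err)
			continue
		}
		return nil
	}
	return errors.New("update node status exceeds retry count")
}

func main() {
	fmt.Println(updateNodeStatus())
}
```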
event="NodeHasNoDiskPressure" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.105360 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.105384 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.105402 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:25Z","lastTransitionTime":"2025-12-02T22:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:25 crc kubenswrapper[4903]: E1202 22:58:25.125755 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5eef24b0-10c3-4ee6-bf4d-784c2d2e5050\\\",\\\"systemUUID\\\":\\\"3787324b-4c61-413b-8321-e9e2f283e2ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:25Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.130851 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.130917 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.130940 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.130971 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.130994 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:25Z","lastTransitionTime":"2025-12-02T22:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:25 crc kubenswrapper[4903]: E1202 22:58:25.151626 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5eef24b0-10c3-4ee6-bf4d-784c2d2e5050\\\",\\\"systemUUID\\\":\\\"3787324b-4c61-413b-8321-e9e2f283e2ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:25Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:25 crc kubenswrapper[4903]: E1202 22:58:25.151869 4903 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.154072 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.154123 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.154141 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.154167 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.154186 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:25Z","lastTransitionTime":"2025-12-02T22:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.257597 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.257695 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.257715 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.257743 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.257761 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:25Z","lastTransitionTime":"2025-12-02T22:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.361521 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.361589 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.361632 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.361696 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.361726 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:25Z","lastTransitionTime":"2025-12-02T22:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.465359 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.465784 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.465928 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.466219 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.466371 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:25Z","lastTransitionTime":"2025-12-02T22:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.569991 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.570318 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.570463 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.570604 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.570743 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:25Z","lastTransitionTime":"2025-12-02T22:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.673848 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.674114 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.674177 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.674250 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.674308 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:25Z","lastTransitionTime":"2025-12-02T22:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.777632 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.778037 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.778177 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.778333 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.778595 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:25Z","lastTransitionTime":"2025-12-02T22:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.882011 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.882069 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.882089 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.882113 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.882130 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:25Z","lastTransitionTime":"2025-12-02T22:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.985254 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.985368 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.985443 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.985527 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:25 crc kubenswrapper[4903]: I1202 22:58:25.985555 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:25Z","lastTransitionTime":"2025-12-02T22:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.089117 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.089186 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.089210 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.089240 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.089262 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:26Z","lastTransitionTime":"2025-12-02T22:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.192527 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.192594 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.192613 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.192685 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.192705 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:26Z","lastTransitionTime":"2025-12-02T22:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.296048 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.296745 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.296917 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.296965 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.296993 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:26Z","lastTransitionTime":"2025-12-02T22:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.400088 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.400220 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.400237 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.400261 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.400277 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:26Z","lastTransitionTime":"2025-12-02T22:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.503136 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.503196 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.503213 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.503242 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.503262 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:26Z","lastTransitionTime":"2025-12-02T22:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.606206 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.606266 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.606286 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.606309 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.606327 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:26Z","lastTransitionTime":"2025-12-02T22:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.611638 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.611756 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.611694 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.611689 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:58:26 crc kubenswrapper[4903]: E1202 22:58:26.611868 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:58:26 crc kubenswrapper[4903]: E1202 22:58:26.612035 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:58:26 crc kubenswrapper[4903]: E1202 22:58:26.612449 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:58:26 crc kubenswrapper[4903]: E1202 22:58:26.612232 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.708963 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.709024 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.709042 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.709072 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.709093 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:26Z","lastTransitionTime":"2025-12-02T22:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.812240 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.812314 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.812338 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.812371 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.812511 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:26Z","lastTransitionTime":"2025-12-02T22:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.915543 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.915642 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.915706 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.915734 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:26 crc kubenswrapper[4903]: I1202 22:58:26.915756 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:26Z","lastTransitionTime":"2025-12-02T22:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.019183 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.019273 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.019292 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.019322 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.019344 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:27Z","lastTransitionTime":"2025-12-02T22:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.122932 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.123015 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.123039 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.123076 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.123100 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:27Z","lastTransitionTime":"2025-12-02T22:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.226445 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.226794 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.226885 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.226935 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.226960 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:27Z","lastTransitionTime":"2025-12-02T22:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.331478 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.331536 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.331552 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.331577 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.331597 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:27Z","lastTransitionTime":"2025-12-02T22:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.434954 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.435003 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.435017 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.435037 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.435049 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:27Z","lastTransitionTime":"2025-12-02T22:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.538386 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.538453 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.538473 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.538510 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.538538 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:27Z","lastTransitionTime":"2025-12-02T22:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.641850 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.641924 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.641942 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.641969 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.641986 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:27Z","lastTransitionTime":"2025-12-02T22:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.744720 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.744769 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.744780 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.744799 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.744812 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:27Z","lastTransitionTime":"2025-12-02T22:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.847979 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.848045 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.848062 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.848091 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.848110 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:27Z","lastTransitionTime":"2025-12-02T22:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.950934 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.950983 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.951001 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.951026 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:27 crc kubenswrapper[4903]: I1202 22:58:27.951044 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:27Z","lastTransitionTime":"2025-12-02T22:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.054396 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.054457 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.054480 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.054514 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.054536 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:28Z","lastTransitionTime":"2025-12-02T22:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.157731 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.157799 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.157821 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.157854 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.157882 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:28Z","lastTransitionTime":"2025-12-02T22:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.260929 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.261019 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.261047 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.261085 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.261113 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:28Z","lastTransitionTime":"2025-12-02T22:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.364727 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.364808 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.364834 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.364870 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.364893 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:28Z","lastTransitionTime":"2025-12-02T22:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.467947 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.468038 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.468070 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.468106 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.468126 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:28Z","lastTransitionTime":"2025-12-02T22:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.571287 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.571350 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.571368 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.571395 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.571413 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:28Z","lastTransitionTime":"2025-12-02T22:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.612371 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.612430 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.612496 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.612599 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:58:28 crc kubenswrapper[4903]: E1202 22:58:28.612612 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:58:28 crc kubenswrapper[4903]: E1202 22:58:28.612796 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:58:28 crc kubenswrapper[4903]: E1202 22:58:28.612926 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:58:28 crc kubenswrapper[4903]: E1202 22:58:28.613088 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.657789 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.669843 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.675222 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.675434 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.675593 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.675802 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.676026 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:28Z","lastTransitionTime":"2025-12-02T22:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.681944 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:28Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.700648 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a8643ea97f6edef5a93aa0369656bd7d630d555bd3a7c891b71305353faeff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:28Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.721981 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"823a3cb9-fe97-46de-8791-375103e2ec0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242c60254f2b4a3c9e849384d813e56c75cf2f60705e0b1e7ca2707663b3f029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7dac854f7481fe71dcc8976f658e52ad759ca0faf9205d50b1bec7a28575c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://716e5b24fb81d3b6e884f43a9299fbe4397d133daa4727c659c633416c14bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:28Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.743319 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:28Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.763380 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:28Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.779450 4903 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.779533 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.779556 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.779591 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.779614 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:28Z","lastTransitionTime":"2025-12-02T22:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.784453 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:28Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.797238 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:28Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.819949 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629037fd709058c15b3b4a4560aaecb34e6368ab243dcf573001b483a3033ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e
5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\
"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:28Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.844360 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd7d0b3f9b9d6e630d5d840adc10f068ec60ca8d
91cf9774f1b39ab12af64042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3df78f2baa0b295a61e6ca8dad2e257292caf09f1fdd90fc80f0647e9900aeb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:58:12Z\\\",\\\"message\\\":\\\"mers/factory.go:160\\\\nI1202 22:58:11.491947 6205 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:58:11.492037 6205 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:58:11.492109 6205 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 22:58:11.492813 6205 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 22:58:11.492870 6205 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 22:58:11.492883 6205 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 22:58:11.492907 6205 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 22:58:11.492918 6205 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:58:11.492960 6205 factory.go:656] Stopping watch factory\\\\nI1202 22:58:11.492996 6205 ovnkube.go:599] Stopped ovnkube\\\\nI1202 22:58:11.492990 6205 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 22:58:11.493028 6205 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:58:11.493032 6205 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 22:58:11.493052 6205 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd7d0b3f9b9d6e630d5d840adc10f068ec60ca8d91cf9774f1b39ab12af64042\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\" at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1202 22:58:14.558878 6343 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, 
AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.139\\\\\\\", Port:17698, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1202 22:58:14.558941 6343 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"
name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz9ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:28Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.861226 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vhm2r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ec8e3f725a9f00d1ee57d57d56f537077b5004dbd9314aae3ec85e3a09287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv57c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vhm2r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:28Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.878131 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8vx6p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7bdaec4-1392-4f87-ba0b-f53c76e47cf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8vx6p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:28Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.883339 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.883402 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.883424 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.883454 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.883475 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:28Z","lastTransitionTime":"2025-12-02T22:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.898533 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:28Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.910620 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:28Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.924892 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:28Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.942228 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f98df328fe250afd95449707372ef1aea43e355ee20edf8366f510a888b9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:28Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.955843 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6f86958-173b-4746-8493-f8fe5f70a897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eb794b37fcff5149e08f759cd8d4ad6b06be8e9a29cf0e42d04a963533a552a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5db0a9a2093fb2426b4354d9b9ad46f0e83f73472c38ff37ee159dcaea383fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xqdfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:28Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.986410 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.986462 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.986478 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.986501 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:28 crc kubenswrapper[4903]: I1202 22:58:28.986520 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:28Z","lastTransitionTime":"2025-12-02T22:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.089568 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.089719 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.089748 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.089778 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.089801 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:29Z","lastTransitionTime":"2025-12-02T22:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.192281 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.192343 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.192360 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.192387 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.192408 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:29Z","lastTransitionTime":"2025-12-02T22:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.295546 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.295605 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.295621 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.295647 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.295706 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:29Z","lastTransitionTime":"2025-12-02T22:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.398816 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.398896 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.398919 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.398950 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.398971 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:29Z","lastTransitionTime":"2025-12-02T22:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.502069 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.502156 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.502180 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.502213 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.502236 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:29Z","lastTransitionTime":"2025-12-02T22:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.605809 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.605884 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.605905 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.605933 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.605952 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:29Z","lastTransitionTime":"2025-12-02T22:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.708220 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.708304 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.708315 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.708339 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.708351 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:29Z","lastTransitionTime":"2025-12-02T22:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.810329 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.810383 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.810401 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.810427 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.810445 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:29Z","lastTransitionTime":"2025-12-02T22:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.912735 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.912768 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.912779 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.912794 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:29 crc kubenswrapper[4903]: I1202 22:58:29.912805 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:29Z","lastTransitionTime":"2025-12-02T22:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.015302 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.015338 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.015348 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.015363 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.015372 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:30Z","lastTransitionTime":"2025-12-02T22:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.117478 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.117558 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.117580 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.117605 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.117624 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:30Z","lastTransitionTime":"2025-12-02T22:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.220981 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.221027 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.221043 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.221066 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.221082 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:30Z","lastTransitionTime":"2025-12-02T22:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.324394 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.324440 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.324458 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.324480 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.324497 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:30Z","lastTransitionTime":"2025-12-02T22:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.427479 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.427525 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.427541 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.427564 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.427580 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:30Z","lastTransitionTime":"2025-12-02T22:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.530710 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.530903 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.530919 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.530944 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.530964 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:30Z","lastTransitionTime":"2025-12-02T22:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.633339 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.633386 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.633402 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.633425 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.633442 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:30Z","lastTransitionTime":"2025-12-02T22:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.736737 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.736789 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.736805 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.736830 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.736846 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:30Z","lastTransitionTime":"2025-12-02T22:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.840466 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.840516 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.840533 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.840560 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.840580 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:30Z","lastTransitionTime":"2025-12-02T22:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.919404 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7bdaec4-1392-4f87-ba0b-f53c76e47cf4-metrics-certs\") pod \"network-metrics-daemon-8vx6p\" (UID: \"e7bdaec4-1392-4f87-ba0b-f53c76e47cf4\") " pod="openshift-multus/network-metrics-daemon-8vx6p"
Dec 02 22:58:30 crc kubenswrapper[4903]: E1202 22:58:30.919637 4903 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 02 22:58:30 crc kubenswrapper[4903]: E1202 22:58:30.919761 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7bdaec4-1392-4f87-ba0b-f53c76e47cf4-metrics-certs podName:e7bdaec4-1392-4f87-ba0b-f53c76e47cf4 nodeName:}" failed. No retries permitted until 2025-12-02 22:58:46.919735099 +0000 UTC m=+65.628289412 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7bdaec4-1392-4f87-ba0b-f53c76e47cf4-metrics-certs") pod "network-metrics-daemon-8vx6p" (UID: "e7bdaec4-1392-4f87-ba0b-f53c76e47cf4") : object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.943616 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.943704 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.943727 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.943752 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:58:30 crc kubenswrapper[4903]: I1202 22:58:30.943772 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:30Z","lastTransitionTime":"2025-12-02T22:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.046998 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.047045 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.047060 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.047085 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.047102 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:31Z","lastTransitionTime":"2025-12-02T22:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.149815 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.149859 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.149875 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.149897 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.149913 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:31Z","lastTransitionTime":"2025-12-02T22:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.253176 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.253221 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.253237 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.253260 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.253277 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:31Z","lastTransitionTime":"2025-12-02T22:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.356768 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.356826 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.356846 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.356876 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.356892 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:31Z","lastTransitionTime":"2025-12-02T22:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.459902 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.459958 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.459976 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.459999 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.460016 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:31Z","lastTransitionTime":"2025-12-02T22:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.555761 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.555863 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 22:58:31 crc kubenswrapper[4903]: E1202 22:58:31.555934 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.558872 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 22:58:31 crc kubenswrapper[4903]: E1202 22:58:31.559511 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.559598 4903 scope.go:117] "RemoveContainer" containerID="bd7d0b3f9b9d6e630d5d840adc10f068ec60ca8d91cf9774f1b39ab12af64042"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.559700 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 22:58:31 crc kubenswrapper[4903]: E1202 22:58:31.559822 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 02 22:58:31 crc kubenswrapper[4903]: E1202 22:58:31.560486 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.564446 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.564538 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.564556 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.564612 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.564632 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:31Z","lastTransitionTime":"2025-12-02T22:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.587987 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:31Z is after 2025-08-24T17:21:41Z"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.610425 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629037fd709058c15b3b4a4560aaecb34e6368ab243dcf573001b483a3033ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:31Z is after 2025-08-24T17:21:41Z"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.637582 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd7d0b3f9b9d6e630d5d840adc10f068ec60ca8d91cf9774f1b39ab12af64042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd7d0b3f9b9d6e630d5d840adc10f068ec60ca8d91cf9774f1b39ab12af64042\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\" at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1202 22:58:14.558878 6343 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.139\\\\\\\", Port:17698, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1202 22:58:14.558941 6343 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-pz9ff_openshift-ovn-kubernetes(99ab90b8-4bb9-418c-8b55-19c4c10edec7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz9ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:31Z is after 2025-08-24T17:21:41Z"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.661149 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"823a3cb9-fe97-46de-8791-375103e2ec0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242c60254f2b4a3c9e849384d813e56c75cf2f60705e0b1e7ca2707663b3f029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7dac854f7481fe71dcc8976f658e52ad759ca0faf9205d50b1bec7a28575c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://716e5b24fb81d3b6e884f43a9299fbe4397d133daa4727c659c633416c14bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:31Z is after 2025-08-24T17:21:41Z"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.672947 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.673003 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.673030 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.673053 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.673075 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:31Z","lastTransitionTime":"2025-12-02T22:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.689932 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1607bcf-a910-4737-ba76-e982301faac5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075bd1184d0ced61c9610e5961f455627fc04f846239edd3b10032dd44fab539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e8053e7bef3d1577c025c515ad48920fcb7077167f8fc98b147d6499e38a23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628d292a1edf1c1db86af8057d8f7c50cbd2c45477075ef846a24d7cfd444bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c21835717fd04a4f8fbd1fce3b004c111edc3d93408e2921440877b3bd652fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21835717fd04a4f8fbd1fce3b004c111edc3d93408e2921440877b3bd652fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:31Z is after 2025-08-24T17:21:41Z"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.711709 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:31Z is after 2025-08-24T17:21:41Z"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.730318 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:31Z is after 2025-08-24T17:21:41Z"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.742219 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:31Z is after 2025-08-24T17:21:41Z"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.754627 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vhm2r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ec8e3f725a9f00d1ee57d57d56f537077b5004dbd9314aae3ec85e3a09287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv57c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vhm2r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:31Z is after 2025-08-24T17:21:41Z"
Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.771482 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8vx6p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7bdaec4-1392-4f87-ba0b-f53c76e47cf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8vx6p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:31Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.778137 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.778185 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.778203 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.778227 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.778244 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:31Z","lastTransitionTime":"2025-12-02T22:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.791145 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:31Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.808060 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:31Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.828152 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:31Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.844877 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f98df328fe250afd95449707372ef1aea43e355ee20edf8366f510a888b9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:31Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.866570 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6f86958-173b-4746-8493-f8fe5f70a897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eb794b37fcff5149e08f759cd8d4ad6b06be8e9a29cf0e42d04a963533a552a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5db0a9a2093fb2426b4354d9b9ad46f0e83f73472c38ff37ee159dcaea383fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xqdfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:31Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.881238 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.881276 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.881288 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.881304 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.881315 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:31Z","lastTransitionTime":"2025-12-02T22:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.891631 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:31Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.910738 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a8643ea97f6edef5a93aa0369656bd7d630d555bd3a7c891b71305353faeff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:31Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.927864 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:31Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.943699 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a8643ea97f6edef5a93aa0369656bd7d630d555bd3a7c891b71305353faeff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:31Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.959158 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629037fd709058c15b3b4a4560aaecb34e6368ab243dcf573001b483a3033ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:31Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.979393 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd7d0b3f9b9d6e630d5d840adc10f068ec60ca8d91cf9774f1b39ab12af64042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd7d0b3f9b9d6e630d5d840adc10f068ec60ca8d91cf9774f1b39ab12af64042\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\" at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1202 22:58:14.558878 6343 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.139\\\\\\\", Port:17698, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1202 22:58:14.558941 6343 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pz9ff_openshift-ovn-kubernetes(99ab90b8-4bb9-418c-8b55-19c4c10edec7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz9ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:31Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.983999 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.984024 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.984033 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.984046 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.984054 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:31Z","lastTransitionTime":"2025-12-02T22:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:31 crc kubenswrapper[4903]: I1202 22:58:31.992082 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"823a3cb9-fe97-46de-8791-375103e2ec0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242c60254f2b4a3c9e849384d813e56c75cf2f60705e0b1e7ca2707663b3f029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7dac854f7481fe71dcc8976f658e52ad759ca0faf9205d50b1bec7a28575c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://716e5b24fb81d3b6e884f43a9299fbe4397d133daa4727c659c633416c14bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:31Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.007489 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1607bcf-a910-4737-ba76-e982301faac5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075bd1184d0ced61c9610e5961f455627fc04f846239edd3b10032dd44fab539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e8053e7bef3d1577c025c515ad48920fcb7077167f8fc98b147d6499e38a23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628d292a1edf1c1db86af8057d8f7c50cbd2c45477075ef846a24d7cfd444bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c21835717fd04a4f8fbd1fce3b004c111edc3d93408e2921440877b3bd652fc\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21835717fd04a4f8fbd1fce3b004c111edc3d93408e2921440877b3bd652fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:32Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.027372 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:32Z is after 
2025-08-24T17:21:41Z" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.047855 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:32Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.063465 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:32Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.086468 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.086512 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.086523 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.086538 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.086547 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:32Z","lastTransitionTime":"2025-12-02T22:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.088862 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:32Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.100992 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vhm2r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ec8e3f725a9f00d1ee57d57d56f537077b5004dbd9314aae3ec85e3a09287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv57c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vhm2r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:32Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.120798 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8vx6p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7bdaec4-1392-4f87-ba0b-f53c76e47cf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8vx6p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:32Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.136267 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:32Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.151405 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:32Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.167907 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:32Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.187494 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f98df328fe250afd95449707372ef1aea43e355ee20edf8366f510a888b9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:32Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.189001 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.189058 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.189070 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.189091 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.189103 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:32Z","lastTransitionTime":"2025-12-02T22:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.210084 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6f86958-173b-4746-8493-f8fe5f70a897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eb794b37fcff5149e08f759cd8d4ad6b06be8e9a29cf0e42d04a963533a552a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5db0a9a2093fb2426b4354d9b9ad46f0e83f73472c38ff37ee159dcaea383fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xqdfr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:32Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.291287 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.291331 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.291343 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.291361 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.291372 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:32Z","lastTransitionTime":"2025-12-02T22:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.399766 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.399816 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.399842 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.399867 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.399885 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:32Z","lastTransitionTime":"2025-12-02T22:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.501991 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.502025 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.502036 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.502054 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.502066 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:32Z","lastTransitionTime":"2025-12-02T22:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.539866 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.539955 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.539991 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.540009 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.540039 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:32 crc kubenswrapper[4903]: E1202 22:58:32.540116 4903 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Dec 02 22:58:32 crc kubenswrapper[4903]: E1202 22:58:32.540152 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 22:59:04.540116721 +0000 UTC m=+83.248671044 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:58:32 crc kubenswrapper[4903]: E1202 22:58:32.540170 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 22:58:32 crc kubenswrapper[4903]: E1202 22:58:32.540195 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 22:59:04.540181453 +0000 UTC m=+83.248735766 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 22:58:32 crc kubenswrapper[4903]: E1202 22:58:32.540199 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 22:58:32 crc kubenswrapper[4903]: E1202 22:58:32.540217 4903 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:58:32 crc kubenswrapper[4903]: E1202 22:58:32.540265 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 22:59:04.540248984 +0000 UTC m=+83.248803287 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:58:32 crc kubenswrapper[4903]: E1202 22:58:32.540271 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 22:58:32 crc kubenswrapper[4903]: E1202 22:58:32.540322 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 22:58:32 crc kubenswrapper[4903]: E1202 22:58:32.540344 4903 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:58:32 crc kubenswrapper[4903]: E1202 22:58:32.540370 4903 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 22:58:32 crc kubenswrapper[4903]: E1202 22:58:32.540611 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 22:59:04.540570162 +0000 UTC m=+83.249124485 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:58:32 crc kubenswrapper[4903]: E1202 22:58:32.540643 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 22:59:04.540629813 +0000 UTC m=+83.249184136 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.565274 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz9ff_99ab90b8-4bb9-418c-8b55-19c4c10edec7/ovnkube-controller/1.log" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.568370 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" event={"ID":"99ab90b8-4bb9-418c-8b55-19c4c10edec7","Type":"ContainerStarted","Data":"1b8ab57f97826cc97060179fa188f695ed495aca141f6555880e4e902409fd14"} Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.568982 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.582971 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:32Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.595512 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:32Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.604094 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.604121 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.604129 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.604325 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.604384 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:32Z","lastTransitionTime":"2025-12-02T22:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.607391 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:32Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.618767 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f98df328fe250afd95449707372ef1aea43e355ee20edf8366f510a888b9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:32Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.629910 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6f86958-173b-4746-8493-f8fe5f70a897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eb794b37fcff5149e08f759cd8d4ad6b06be8e9a29cf0e42d04a963533a552a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5db0a9a2093fb2426b4354d9b9ad46f0e83f73472c38ff37ee159dcaea383fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d77325
7453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xqdfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:32Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.641875 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a8643ea97f6edef5a93aa0369656bd7d630d555bd3a7c891b71305353faeff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:32Z is after 
2025-08-24T17:21:41Z" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.660751 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:32Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.679073 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1607bcf-a910-4737-ba76-e982301faac5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075bd1184d0ced61c9610e5961f455627fc04f846239edd3b10032dd44fab539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e8053e7bef3d1577c025c515ad48920fcb7077167f8fc98b147d6499e38a23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628d292a1edf1c1db86af8057d8f7c50cbd2c45477075ef846a24d7cfd444bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c21835717fd04a4f8fbd1fce3b004c111edc3d93408e2921440877b3bd652fc\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21835717fd04a4f8fbd1fce3b004c111edc3d93408e2921440877b3bd652fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:32Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.694792 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:32Z is after 
2025-08-24T17:21:41Z" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.705844 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.705892 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.705934 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.705958 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.705970 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:32Z","lastTransitionTime":"2025-12-02T22:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.707537 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\
\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:32Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.719372 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:32Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.733799 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:32Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.750989 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629037fd709058c15b3b4a4560aaecb34e6368ab243dcf573001b483a3033ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e
5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\
"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:32Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.776820 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8ab57f97826cc97060179fa188f695ed495aca
141f6555880e4e902409fd14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd7d0b3f9b9d6e630d5d840adc10f068ec60ca8d91cf9774f1b39ab12af64042\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\" at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1202 22:58:14.558878 6343 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.139\\\\\\\", Port:17698, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1202 22:58:14.558941 6343 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz9ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:32Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.792788 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"823a3cb9-fe97-46de-8791-375103e2ec0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242c60254f2b4a3c9e849384d813e56c75cf2f60705e0b1e7ca2707663b3f029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7dac854f7481fe71dcc8976f658e52ad759ca0faf9205d50b1bec7a28575c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://716e5b24fb81d3b6e884f43a9299fbe4397d133daa4727c659c633416c14bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:32Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.807146 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8vx6p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7bdaec4-1392-4f87-ba0b-f53c76e47cf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8vx6p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:32Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.808178 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.808233 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.808250 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.808275 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.808294 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:32Z","lastTransitionTime":"2025-12-02T22:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.820424 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vhm2r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ec8e3f725a9f00d1ee57d57d56f537077b5004dbd9314aae3ec85e3a09287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv57c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vhm2r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:32Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.911332 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.911394 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.911410 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.911436 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:32 crc kubenswrapper[4903]: I1202 22:58:32.911456 4903 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:32Z","lastTransitionTime":"2025-12-02T22:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.014382 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.014447 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.014470 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.014500 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.014524 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:33Z","lastTransitionTime":"2025-12-02T22:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.117364 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.117410 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.117426 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.117450 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.117467 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:33Z","lastTransitionTime":"2025-12-02T22:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.220620 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.220719 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.220737 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.220763 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.220781 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:33Z","lastTransitionTime":"2025-12-02T22:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.324196 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.324287 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.324320 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.324351 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.324372 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:33Z","lastTransitionTime":"2025-12-02T22:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.427550 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.427605 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.427621 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.427643 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.427686 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:33Z","lastTransitionTime":"2025-12-02T22:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.530685 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.530759 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.530776 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.530803 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.530822 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:33Z","lastTransitionTime":"2025-12-02T22:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.575217 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz9ff_99ab90b8-4bb9-418c-8b55-19c4c10edec7/ovnkube-controller/2.log" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.576494 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz9ff_99ab90b8-4bb9-418c-8b55-19c4c10edec7/ovnkube-controller/1.log" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.581266 4903 generic.go:334] "Generic (PLEG): container finished" podID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerID="1b8ab57f97826cc97060179fa188f695ed495aca141f6555880e4e902409fd14" exitCode=1 Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.581328 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" event={"ID":"99ab90b8-4bb9-418c-8b55-19c4c10edec7","Type":"ContainerDied","Data":"1b8ab57f97826cc97060179fa188f695ed495aca141f6555880e4e902409fd14"} Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.581385 4903 scope.go:117] "RemoveContainer" containerID="bd7d0b3f9b9d6e630d5d840adc10f068ec60ca8d91cf9774f1b39ab12af64042" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.582399 4903 scope.go:117] "RemoveContainer" containerID="1b8ab57f97826cc97060179fa188f695ed495aca141f6555880e4e902409fd14" Dec 02 22:58:33 crc kubenswrapper[4903]: E1202 22:58:33.582873 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pz9ff_openshift-ovn-kubernetes(99ab90b8-4bb9-418c-8b55-19c4c10edec7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.612155 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1607bcf-a910-4737-ba76-e982301faac5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075bd1184d0ced61c9610e5961f455627fc04f846239edd3b10032dd44fab539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e8053e7bef3d1577c025c515ad48920fcb7077167f8fc98b147d6499e38a23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628d292a1edf1c1db86af8057d8f7c50cbd2c45477075ef846a24d7cfd444bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c21835717fd04a4f8fbd1fce3b004c111edc3d93408e2921440877b3bd652fc\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21835717fd04a4f8fbd1fce3b004c111edc3d93408e2921440877b3bd652fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:33Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.612726 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:58:33 crc kubenswrapper[4903]: E1202 22:58:33.612912 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.613505 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.613834 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:58:33 crc kubenswrapper[4903]: E1202 22:58:33.613985 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.614074 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:58:33 crc kubenswrapper[4903]: E1202 22:58:33.614081 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:58:33 crc kubenswrapper[4903]: E1202 22:58:33.614276 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.635089 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.635146 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.635168 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.635199 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.635220 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:33Z","lastTransitionTime":"2025-12-02T22:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.641623 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:33Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.661914 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:33Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.680622 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:33Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.697065 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:33Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.722780 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629037fd709058c15b3b4a4560aaecb34e6368ab243dcf573001b483a3033ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-12-02T22:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T22:58:33Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.738319 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.738371 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.738389 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.738414 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.738432 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:33Z","lastTransitionTime":"2025-12-02T22:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.754364 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8ab57f97826cc97060179fa188f695ed495aca
141f6555880e4e902409fd14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd7d0b3f9b9d6e630d5d840adc10f068ec60ca8d91cf9774f1b39ab12af64042\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\" at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1202 22:58:14.558878 6343 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.139\\\\\\\", Port:17698, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1202 22:58:14.558941 6343 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b8ab57f97826cc97060179fa188f695ed495aca141f6555880e4e902409fd14\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:58:32Z\\\",\\\"message\\\":\\\"sNode crc took: 1.707692ms\\\\nI1202 22:58:32.666622 6544 factory.go:1336] Added *v1.Node event handler 7\\\\nI1202 22:58:32.666669 6544 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1202 22:58:32.666724 6544 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 22:58:32.666783 6544 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:58:32.666858 6544 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1202 22:58:32.666863 6544 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 22:58:32.666933 6544 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 22:58:32.666946 6544 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:58:32.666917 6544 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1202 22:58:32.666990 6544 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 22:58:32.667040 
6544 factory.go:656] Stopping watch factory\\\\nI1202 22:58:32.667089 6544 ovnkube.go:599] Stopped ovnkube\\\\nI1202 22:58:32.667043 6544 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 22:58:32.667055 6544 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 22:58:32.667164 6544 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1202 22:58:32.667375 6544 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/service
account\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz9ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:33Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.775092 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"823a3cb9-fe97-46de-8791-375103e2ec0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242c60254f2b4a3c9e849384d813e56c75cf2f60705e0b1e7ca2707663b3f029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7dac854f7481fe71dcc8976f658e52ad759ca0faf9205d50b1bec7a28575c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://716e5b24fb81d3b6e884f43a9299fbe4397d133daa4727c659c633416c14bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:33Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.793863 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8vx6p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7bdaec4-1392-4f87-ba0b-f53c76e47cf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8vx6p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:33Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.810551 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vhm2r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ec8e3f725a9f00d1ee57d57d56f537077b5004dbd9314aae3ec85e3a09287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv57c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vhm2r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:33Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.829323 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:33Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.842258 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.842328 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.842347 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.842375 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.842394 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:33Z","lastTransitionTime":"2025-12-02T22:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.846962 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:33Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.865768 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:33Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.881116 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f98df328fe250afd95449707372ef1aea43e355ee20edf8366f510a888b9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:33Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.897958 4903 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6f86958-173b-4746-8493-f8fe5f70a897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eb794b37fcff5149e08f759cd8d4ad6b06be8e9a29cf0e42d04a963533a552a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5db0a9a2093fb2426b4354d9b9ad46f0e83f73472c38ff37ee159dcaea383fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xqdfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:33Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.916515 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a8643ea97f6edef5a93aa0369656bd7d630d555bd3a7c891b71305353faeff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:33Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.936524 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:33Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.945488 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.945543 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.945560 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.945586 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:33 crc kubenswrapper[4903]: I1202 22:58:33.945605 4903 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:33Z","lastTransitionTime":"2025-12-02T22:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.047974 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.048133 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.048157 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.048180 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.048200 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:34Z","lastTransitionTime":"2025-12-02T22:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.151123 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.151191 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.151207 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.151232 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.151251 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:34Z","lastTransitionTime":"2025-12-02T22:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.254363 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.254433 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.254450 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.254478 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.254497 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:34Z","lastTransitionTime":"2025-12-02T22:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.357438 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.357505 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.357522 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.357547 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.357566 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:34Z","lastTransitionTime":"2025-12-02T22:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.460135 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.460186 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.460202 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.460225 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.460242 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:34Z","lastTransitionTime":"2025-12-02T22:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.563285 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.563343 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.563361 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.563386 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.563402 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:34Z","lastTransitionTime":"2025-12-02T22:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.588021 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz9ff_99ab90b8-4bb9-418c-8b55-19c4c10edec7/ovnkube-controller/2.log" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.593607 4903 scope.go:117] "RemoveContainer" containerID="1b8ab57f97826cc97060179fa188f695ed495aca141f6555880e4e902409fd14" Dec 02 22:58:34 crc kubenswrapper[4903]: E1202 22:58:34.593904 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pz9ff_openshift-ovn-kubernetes(99ab90b8-4bb9-418c-8b55-19c4c10edec7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.614095 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:34Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.630330 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:34Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.650561 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:34Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.666465 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.666562 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.666581 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.666634 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.666689 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:34Z","lastTransitionTime":"2025-12-02T22:58:34Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.667882 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f98df328fe250afd95449707372ef1aea43e355ee20edf8366f510a888b9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:34Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.686725 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6f86958-173b-4746-8493-f8fe5f70a897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eb794b37fcff5149e08f759cd8d4ad6b06be8e9a29cf0e42d04a963533a552a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5db0a9a2093fb2426b4354d9b9ad46f0e83f73472c38ff37ee159dcaea383fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\
"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xqdfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:34Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.708407 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f
7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:34Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.728493 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a8643ea97f6edef5a93aa0369656bd7d630d555bd3a7c891b71305353faeff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:34Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.748425 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"823a3cb9-fe97-46de-8791-375103e2ec0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242c60254f2b4a3c9e849384d813e56c75cf2f60705e0b1e7ca2707663b3f029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7dac854f7481fe71dcc8976f658e52ad759ca0faf9205d50b1bec7a28575c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://716e5b24fb81d3b6e884f43a9299fbe4397d133daa4727c659c633416c14bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:34Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.767441 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1607bcf-a910-4737-ba76-e982301faac5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075bd1184d0ced61c9610e5961f455627fc04f846239edd3b10032dd44fab539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e8053e7bef3d1577c025c515ad48920fcb7077167f8fc98b147d6499e38a23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628d292a1edf1c1db86af8057d8f7c50cbd2c45477075ef846a24d7cfd444bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c21835717fd04a4f8fbd1fce3b004c111edc3d93408e2921440877b3bd652fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21835717fd04a4f8fbd1fce3b004c111edc3d93408e2921440877b3bd652fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:34Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.775297 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.775352 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.775370 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.775396 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 
22:58:34.775414 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:34Z","lastTransitionTime":"2025-12-02T22:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.789171 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:34Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.810295 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:34Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.830069 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:34Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.850753 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:34Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.874212 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629037fd709058c15b3b4a4560aaecb34e6368ab243dcf573001b483a3033ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-12-02T22:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T22:58:34Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.879414 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.879466 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.879484 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.879510 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.879526 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:34Z","lastTransitionTime":"2025-12-02T22:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.906841 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8ab57f97826cc97060179fa188f695ed495aca
141f6555880e4e902409fd14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b8ab57f97826cc97060179fa188f695ed495aca141f6555880e4e902409fd14\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:58:32Z\\\",\\\"message\\\":\\\"sNode crc took: 1.707692ms\\\\nI1202 22:58:32.666622 6544 factory.go:1336] Added *v1.Node event handler 7\\\\nI1202 22:58:32.666669 6544 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1202 22:58:32.666724 6544 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 22:58:32.666783 6544 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:58:32.666858 6544 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1202 22:58:32.666863 6544 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 22:58:32.666933 6544 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 22:58:32.666946 6544 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:58:32.666917 6544 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1202 22:58:32.666990 6544 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 22:58:32.667040 6544 factory.go:656] Stopping watch factory\\\\nI1202 22:58:32.667089 6544 ovnkube.go:599] Stopped ovnkube\\\\nI1202 22:58:32.667043 6544 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 22:58:32.667055 6544 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 22:58:32.667164 6544 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1202 22:58:32.667375 6544 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pz9ff_openshift-ovn-kubernetes(99ab90b8-4bb9-418c-8b55-19c4c10edec7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz9ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:34Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.924157 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vhm2r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ec8e3f725a9f00d1ee57d57d56f537077b5004dbd9314aae3ec85e3a09287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv57c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vhm2r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:34Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.942177 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8vx6p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7bdaec4-1392-4f87-ba0b-f53c76e47cf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:15Z\\\"}}\" for pod 
\"openshift-multus\"/\"network-metrics-daemon-8vx6p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:34Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.982500 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.982559 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.982576 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.982601 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:34 crc kubenswrapper[4903]: I1202 22:58:34.982625 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:34Z","lastTransitionTime":"2025-12-02T22:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.085728 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.085793 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.085810 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.085835 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.085852 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:35Z","lastTransitionTime":"2025-12-02T22:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.189050 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.189116 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.189134 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.189159 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.189175 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:35Z","lastTransitionTime":"2025-12-02T22:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.292306 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.292351 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.292371 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.292396 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.292414 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:35Z","lastTransitionTime":"2025-12-02T22:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.319834 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.319903 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.319926 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.319993 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.320017 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:35Z","lastTransitionTime":"2025-12-02T22:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:35 crc kubenswrapper[4903]: E1202 22:58:35.341420 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5eef24b0-10c3-4ee6-bf4d-784c2d2e5050\\\",\\\"systemUUID\\\":\\\"3787324b-4c61-413b-8321-e9e2f283e2ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:35Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.347402 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.347479 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.347504 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.347976 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.348191 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:35Z","lastTransitionTime":"2025-12-02T22:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:35 crc kubenswrapper[4903]: E1202 22:58:35.365462 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5eef24b0-10c3-4ee6-bf4d-784c2d2e5050\\\",\\\"systemUUID\\\":\\\"3787324b-4c61-413b-8321-e9e2f283e2ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:35Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.370308 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.370359 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.370378 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.370401 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.370417 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:35Z","lastTransitionTime":"2025-12-02T22:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:35 crc kubenswrapper[4903]: E1202 22:58:35.391411 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5eef24b0-10c3-4ee6-bf4d-784c2d2e5050\\\",\\\"systemUUID\\\":\\\"3787324b-4c61-413b-8321-e9e2f283e2ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:35Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.396102 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.396167 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.396183 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.396228 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.396244 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:35Z","lastTransitionTime":"2025-12-02T22:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:35 crc kubenswrapper[4903]: E1202 22:58:35.413176 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5eef24b0-10c3-4ee6-bf4d-784c2d2e5050\\\",\\\"systemUUID\\\":\\\"3787324b-4c61-413b-8321-e9e2f283e2ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:35Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.418445 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.418499 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
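The status patch above is rejected at admission: the node.network-node-identity.openshift.io webhook on https://127.0.0.1:9743 presents a certificate whose notAfter (2025-08-24T17:21:41Z) lies more than three months before the node clock (2025-12-02T22:58:35Z). A minimal Python sketch for confirming the validity window from the node itself; this is a hypothetical diagnostic, not part of the log, and it assumes the third-party cryptography package is available for X.509 parsing:

```python
#!/usr/bin/env python3
"""Probe the validity window of the certificate served on 127.0.0.1:9743,
the webhook endpoint named in the kubelet error above. Hypothetical
diagnostic sketch; requires the third-party `cryptography` package."""
import ssl
from datetime import datetime, timezone

from cryptography import x509

HOST, PORT = "127.0.0.1", 9743  # endpoint taken verbatim from the kubelet error

# Fetch the peer certificate without verifying it -- verification is the
# step that fails in the log, so it must be skipped here. This can still
# fail if the webhook insists on a client certificate during the handshake.
pem = ssl.get_server_certificate((HOST, PORT))
cert = x509.load_pem_x509_certificate(pem.encode("ascii"))

# not_valid_*_utc exists in cryptography >= 42; fall back for older versions.
try:
    not_before, not_after = cert.not_valid_before_utc, cert.not_valid_after_utc
except AttributeError:
    not_before = cert.not_valid_before.replace(tzinfo=timezone.utc)
    not_after = cert.not_valid_after.replace(tzinfo=timezone.utc)

now = datetime.now(timezone.utc)
print(f"subject:    {cert.subject.rfc4514_string()}")
print(f"not before: {not_before.isoformat()}")
print(f"not after:  {not_after.isoformat()}")
print(f"now:        {now.isoformat()}")
if now > not_after:
    print("=> expired, matching the x509 error in the kubelet log")
elif now < not_before:
    print("=> not yet valid")
else:
    print("=> inside the validity window")
```

On a node in the state logged here the script should end in the "expired" branch; the usual remedy is rotating the cluster's internal certificates (for CRC, typically by letting the cluster renew them or recreating the instance).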
event="NodeHasNoDiskPressure" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.418518 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.418544 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.418563 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:35Z","lastTransitionTime":"2025-12-02T22:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:35 crc kubenswrapper[4903]: E1202 22:58:35.436997 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5eef24b0-10c3-4ee6-bf4d-784c2d2e5050\\\",\\\"systemUUID\\\":\\\"3787324b-4c61-413b-8321-e9e2f283e2ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:35Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:35 crc kubenswrapper[4903]: E1202 22:58:35.437251 4903 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.439026 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
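The retried patch (same payload, same webhook failure, ending in "update node status exceeds retry count") also records the node's resource figures: capacity cpu 12 / memory 32865360Ki / ephemeral-storage 83293888Ki against allocatable cpu 11800m / memory 32404560Ki / ephemeral-storage 76396645454 (bytes). The gap between the two is the kubelet's reservation headroom; a quick worked computation using only the numbers logged above (illustrative arithmetic only, since the actual reservation flags are not visible in this log):

```python
#!/usr/bin/env python3
"""Derive the reservations implied by the capacity/allocatable pair in the
status patch above. Illustrative arithmetic only; the kubelet's
system-reserved/kube-reserved settings do not appear in this log."""
KI = 1024  # bytes per KiB

# Figures copied verbatim from the rejected patch. Note the mixed units:
# memory and capacity values are logged in Ki, but allocatable
# ephemeral-storage is logged in plain bytes.
capacity_cpu_m = 12 * 1000          # "cpu":"12"
alloc_cpu_m = 11_800                # "cpu":"11800m"
capacity_mem_ki = 32_865_360        # "memory":"32865360Ki"
alloc_mem_ki = 32_404_560           # "memory":"32404560Ki"
capacity_eph_b = 83_293_888 * KI    # "ephemeral-storage":"83293888Ki"
alloc_eph_b = 76_396_645_454        # "ephemeral-storage":"76396645454"

print(f"reserved CPU:     {capacity_cpu_m - alloc_cpu_m}m")        # 200m
mem_ki = capacity_mem_ki - alloc_mem_ki
print(f"reserved memory:  {mem_ki}Ki (= {mem_ki // KI} MiB)")      # 450 MiB
eph_b = capacity_eph_b - alloc_eph_b
print(f"reserved storage: {eph_b} B (~{eph_b / 2**30:.2f} GiB)")   # ~8.29 GiB
```

The headroom (200m CPU, 450 MiB memory, roughly 8.3 GiB of ephemeral storage) is unremarkable; the patch is failing on the webhook certificate, not on the resource accounting.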
event="NodeHasSufficientMemory" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.439091 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.439112 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.439137 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.439155 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:35Z","lastTransitionTime":"2025-12-02T22:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.543044 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.543128 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.543152 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.543189 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.543213 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:35Z","lastTransitionTime":"2025-12-02T22:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.612444 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.612500 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.612463 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.612573 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:58:35 crc kubenswrapper[4903]: E1202 22:58:35.612635 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:58:35 crc kubenswrapper[4903]: E1202 22:58:35.612856 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:58:35 crc kubenswrapper[4903]: E1202 22:58:35.612969 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:58:35 crc kubenswrapper[4903]: E1202 22:58:35.613058 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.646382 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.646632 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.646832 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.646975 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.647104 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:35Z","lastTransitionTime":"2025-12-02T22:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.750275 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.750347 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.750369 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.750399 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.750424 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:35Z","lastTransitionTime":"2025-12-02T22:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.853195 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.853253 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.853268 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.853291 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.853307 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:35Z","lastTransitionTime":"2025-12-02T22:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.955479 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.955751 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.955769 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.955793 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:35 crc kubenswrapper[4903]: I1202 22:58:35.955811 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:35Z","lastTransitionTime":"2025-12-02T22:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.059019 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.059064 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.059080 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.059103 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.059120 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:36Z","lastTransitionTime":"2025-12-02T22:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.162837 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.162965 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.162985 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.163010 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.163028 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:36Z","lastTransitionTime":"2025-12-02T22:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.266179 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.266248 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.266266 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.266292 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.266309 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:36Z","lastTransitionTime":"2025-12-02T22:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.369633 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.369739 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.369762 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.369790 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.369811 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:36Z","lastTransitionTime":"2025-12-02T22:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.473217 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.473281 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.473298 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.473324 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.473342 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:36Z","lastTransitionTime":"2025-12-02T22:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.576287 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.576341 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.576361 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.576390 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.576407 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:36Z","lastTransitionTime":"2025-12-02T22:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.680229 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.680353 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.680379 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.680409 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.680433 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:36Z","lastTransitionTime":"2025-12-02T22:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.783773 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.783834 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.783885 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.783916 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.783938 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:36Z","lastTransitionTime":"2025-12-02T22:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.887228 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.887291 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.887314 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.887345 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.887370 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:36Z","lastTransitionTime":"2025-12-02T22:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.989992 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.990035 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.990046 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.990061 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:36 crc kubenswrapper[4903]: I1202 22:58:36.990069 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:36Z","lastTransitionTime":"2025-12-02T22:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.093628 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.093762 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.093786 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.093819 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.093842 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:37Z","lastTransitionTime":"2025-12-02T22:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.197810 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.197867 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.197888 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.197913 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.197930 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:37Z","lastTransitionTime":"2025-12-02T22:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.301194 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.301279 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.301295 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.301322 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.301340 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:37Z","lastTransitionTime":"2025-12-02T22:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.404818 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.404903 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.404924 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.404959 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.404992 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:37Z","lastTransitionTime":"2025-12-02T22:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.508595 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.508692 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.508710 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.508737 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.508753 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:37Z","lastTransitionTime":"2025-12-02T22:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.611386 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:58:37 crc kubenswrapper[4903]: E1202 22:58:37.611545 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.611628 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:37 crc kubenswrapper[4903]: E1202 22:58:37.611886 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.611927 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.611961 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.611977 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.612001 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.612029 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:58:37 crc kubenswrapper[4903]: E1202 22:58:37.612124 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.612018 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:37Z","lastTransitionTime":"2025-12-02T22:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.612311 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:58:37 crc kubenswrapper[4903]: E1202 22:58:37.612436 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.715150 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.715205 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.715222 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.715247 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.715264 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:37Z","lastTransitionTime":"2025-12-02T22:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.818316 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.818397 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.818422 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.818454 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.818479 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:37Z","lastTransitionTime":"2025-12-02T22:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.921422 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.921484 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.921502 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.921527 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:37 crc kubenswrapper[4903]: I1202 22:58:37.921545 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:37Z","lastTransitionTime":"2025-12-02T22:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.025080 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.025257 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.025288 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.025322 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.025341 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:38Z","lastTransitionTime":"2025-12-02T22:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.127885 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.127956 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.127968 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.127986 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.128018 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:38Z","lastTransitionTime":"2025-12-02T22:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.231164 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.231211 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.231229 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.231253 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.231270 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:38Z","lastTransitionTime":"2025-12-02T22:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.334383 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.334460 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.334476 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.334502 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.334519 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:38Z","lastTransitionTime":"2025-12-02T22:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.438352 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.438406 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.438423 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.438447 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.438465 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:38Z","lastTransitionTime":"2025-12-02T22:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.542185 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.542329 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.542349 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.542424 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.542536 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:38Z","lastTransitionTime":"2025-12-02T22:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.645018 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.645074 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.645086 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.645105 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.645120 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:38Z","lastTransitionTime":"2025-12-02T22:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.748456 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.748501 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.748518 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.748538 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.748554 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:38Z","lastTransitionTime":"2025-12-02T22:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.852916 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.853017 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.853042 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.853077 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.853103 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:38Z","lastTransitionTime":"2025-12-02T22:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.955823 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.955939 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.955961 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.956032 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:38 crc kubenswrapper[4903]: I1202 22:58:38.956055 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:38Z","lastTransitionTime":"2025-12-02T22:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.060187 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.060351 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.060381 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.060455 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.060483 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:39Z","lastTransitionTime":"2025-12-02T22:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.163530 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.163623 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.163689 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.163771 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.163790 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:39Z","lastTransitionTime":"2025-12-02T22:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.267184 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.267250 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.267267 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.267293 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.267310 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:39Z","lastTransitionTime":"2025-12-02T22:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.370422 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.370500 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.370525 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.370560 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.370586 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:39Z","lastTransitionTime":"2025-12-02T22:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
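The five-message block above repeats on every status-update tick (roughly every 100 ms in this capture) and will keep repeating until a network plugin writes a CNI config into /etc/kubernetes/cni/net.d/: the container runtime reports NetworkReady=false, so the kubelet keeps setting the node's Ready condition to False with reason KubeletNotReady. As an illustrative sketch only (not the kubelet's or CRI-O's actual code), the readiness check boils down to "is there at least one CNI config file in the conf dir"; the file name cnicheck.go, the polling loop, and the accepted extensions are assumptions based on what common runtimes load.

// cnicheck.go: a minimal, hypothetical probe for the condition named in the log.
package main

import (
	"fmt"
	"os"
	"path/filepath"
	"time"
)

// cniConfDir is the directory the kubelet log complains about.
const cniConfDir = "/etc/kubernetes/cni/net.d"

// hasCNIConfig reports whether any CNI network config is present;
// .conf, .conflist and .json are the extensions runtimes commonly accept.
func hasCNIConfig(dir string) bool {
	for _, ext := range []string{"*.conf", "*.conflist", "*.json"} {
		if matches, err := filepath.Glob(filepath.Join(dir, ext)); err == nil && len(matches) > 0 {
			return true
		}
	}
	return false
}

func main() {
	for {
		if hasCNIConfig(cniConfDir) {
			fmt.Println("NetworkReady=true: CNI config found")
			return
		}
		fmt.Fprintln(os.Stderr, "NetworkReady=false: no CNI configuration file in", cniConfDir)
		time.Sleep(5 * time.Second)
	}
}

On OpenShift and CRC that config is normally written by the Multus and OVN-Kubernetes daemons once they come up, which is why the message asks whether the network provider has started.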
Has your network provider started?"}
Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.473173 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.473337 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.473361 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.473397 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.473420 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:39Z","lastTransitionTime":"2025-12-02T22:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.576740 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.576828 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.576852 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.576885 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.576907 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:39Z","lastTransitionTime":"2025-12-02T22:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.611447 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.611702 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 22:58:39 crc kubenswrapper[4903]: E1202 22:58:39.611867 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.611938 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.611901 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p"
Dec 02 22:58:39 crc kubenswrapper[4903]: E1202 22:58:39.612108 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 02 22:58:39 crc kubenswrapper[4903]: E1202 22:58:39.612246 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 02 22:58:39 crc kubenswrapper[4903]: E1202 22:58:39.612331 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4"
Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.681119 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.681180 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.681202 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.681227 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.681244 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:39Z","lastTransitionTime":"2025-12-02T22:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
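The util.go and pod_workers.go records above show the workload-side effect: these four pods need a pod-network sandbox, and the kubelet skips syncing them (they are retried later, as the identical 22:58:41 block below shows) for as long as the runtime network is not ready. Host-network pods are not gated this way, which is why static pods such as kube-controller-manager-crc are still running in the status dumps further down. A minimal sketch of that gating decision, with hypothetical names (PodSpec, syncAllowed) standing in for the kubelet's internals:

// netgate.go: illustrative only, not kubelet source.
package main

import (
	"errors"
	"fmt"
)

// PodSpec carries the one field this sketch needs.
type PodSpec struct {
	Name        string
	HostNetwork bool
}

var errNetworkNotReady = errors.New(
	"network is not ready: container runtime network not ready: NetworkReady=false")

// syncAllowed mirrors the decision visible in the log: skip any pod that
// needs the cluster network while the runtime reports it unready, but let
// host-network pods through.
func syncAllowed(networkReady bool, pod PodSpec) error {
	if networkReady || pod.HostNetwork {
		return nil
	}
	return errNetworkNotReady
}

func main() {
	pods := []PodSpec{
		{Name: "openshift-multus/network-metrics-daemon-8vx6p", HostNetwork: false},
		{Name: "openshift-kube-controller-manager/kube-controller-manager-crc", HostNetwork: true},
	}
	for _, p := range pods {
		if err := syncAllowed(false, p); err != nil {
			fmt.Printf("Error syncing pod, skipping err=%q pod=%q\n", err, p.Name)
			continue
		}
		fmt.Printf("syncing pod %q\n", p.Name)
	}
}

Run with networkReady=false, the sketch skips the multus metrics daemon but lets the host-network static pod through, mirroring the pattern in this log.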
Has your network provider started?"} Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.784341 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.784494 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.784514 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.784540 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.784558 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:39Z","lastTransitionTime":"2025-12-02T22:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.887558 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.887619 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.887636 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.887689 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.887740 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:39Z","lastTransitionTime":"2025-12-02T22:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.990527 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.990587 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.990604 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.990625 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:39 crc kubenswrapper[4903]: I1202 22:58:39.990641 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:39Z","lastTransitionTime":"2025-12-02T22:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.094111 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.094165 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.094181 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.094205 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.094222 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:40Z","lastTransitionTime":"2025-12-02T22:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.201632 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.201717 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.201730 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.201749 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.201767 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:40Z","lastTransitionTime":"2025-12-02T22:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.304973 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.305000 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.305008 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.305021 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.305030 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:40Z","lastTransitionTime":"2025-12-02T22:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.408060 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.408124 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.408141 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.408165 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.408182 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:40Z","lastTransitionTime":"2025-12-02T22:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.510914 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.511290 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.511477 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.511621 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.511984 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:40Z","lastTransitionTime":"2025-12-02T22:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.614725 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.614793 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.614811 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.614838 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.614856 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:40Z","lastTransitionTime":"2025-12-02T22:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.718427 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.718487 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.718507 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.718535 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.718553 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:40Z","lastTransitionTime":"2025-12-02T22:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.821477 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.821545 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.821561 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.821589 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.821607 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:40Z","lastTransitionTime":"2025-12-02T22:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.925289 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.925406 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.925480 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.925519 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:40 crc kubenswrapper[4903]: I1202 22:58:40.925592 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:40Z","lastTransitionTime":"2025-12-02T22:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.028994 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.029080 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.029097 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.029121 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.029138 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:41Z","lastTransitionTime":"2025-12-02T22:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.132803 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.132881 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.132909 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.132938 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.132956 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:41Z","lastTransitionTime":"2025-12-02T22:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.236282 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.236347 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.236363 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.236390 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.236410 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:41Z","lastTransitionTime":"2025-12-02T22:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.339412 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.339491 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.339516 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.339548 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.339571 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:41Z","lastTransitionTime":"2025-12-02T22:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.443129 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.443234 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.443252 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.443276 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.443293 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:41Z","lastTransitionTime":"2025-12-02T22:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.546520 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.546589 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.546608 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.546633 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.546722 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:41Z","lastTransitionTime":"2025-12-02T22:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.611368 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p"
Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.611398 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.611439 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 22:58:41 crc kubenswrapper[4903]: E1202 22:58:41.611543 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.611737 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 22:58:41 crc kubenswrapper[4903]: E1202 22:58:41.611864 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 02 22:58:41 crc kubenswrapper[4903]: E1202 22:58:41.612066 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 02 22:58:41 crc kubenswrapper[4903]: E1202 22:58:41.612307 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.635118 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"823a3cb9-fe97-46de-8791-375103e2ec0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242c60254f2b4a3c9e849384d813e56c75cf2f60705e0b1e7ca2707663b3f029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7dac854f7481fe71dcc8976f658e52ad759ca0faf9205d50b1bec7a28575c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-re
sources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://716e5b24fb81d3b6e884f43a9299fbe4397d133daa4727c659c633416c14bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:41Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.649732 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.649978 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.650000 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.650029 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.650049 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:41Z","lastTransitionTime":"2025-12-02T22:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.656944 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1607bcf-a910-4737-ba76-e982301faac5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075bd1184d0ced61c9610e5961f455627fc04f846239edd3b10032dd44fab539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e8053e7bef3d1577c025c515ad48920fcb7077167f8fc98b147d6499e38a23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628d292a1edf1c1db86af8057d8f7c50cbd2c45477075ef846a24d7cfd444bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c21835717fd04a4f8fbd1fce3b004c111edc3d93408e2921440877b3bd652fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21835717fd04a4f8fbd1fce3b004c111edc3d93408e2921440877b3bd652fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:41Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.678556 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:41Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.701887 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:41Z is after 
2025-08-24T17:21:41Z" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.726109 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:41Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.745759 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:41Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.757578 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.757624 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.757641 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.757689 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.757708 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:41Z","lastTransitionTime":"2025-12-02T22:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.768646 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629037fd709058c15b3b4a4560aaecb34e6368ab243dcf573001b483a3033ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSta
tuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:
58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:41Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.799693 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8ab57f97826cc97060179fa188f695ed495aca
141f6555880e4e902409fd14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b8ab57f97826cc97060179fa188f695ed495aca141f6555880e4e902409fd14\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:58:32Z\\\",\\\"message\\\":\\\"sNode crc took: 1.707692ms\\\\nI1202 22:58:32.666622 6544 factory.go:1336] Added *v1.Node event handler 7\\\\nI1202 22:58:32.666669 6544 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1202 22:58:32.666724 6544 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 22:58:32.666783 6544 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:58:32.666858 6544 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1202 22:58:32.666863 6544 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 22:58:32.666933 6544 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 22:58:32.666946 6544 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:58:32.666917 6544 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1202 22:58:32.666990 6544 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 22:58:32.667040 6544 factory.go:656] Stopping watch factory\\\\nI1202 22:58:32.667089 6544 ovnkube.go:599] Stopped ovnkube\\\\nI1202 22:58:32.667043 6544 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 22:58:32.667055 6544 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 22:58:32.667164 6544 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1202 22:58:32.667375 6544 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pz9ff_openshift-ovn-kubernetes(99ab90b8-4bb9-418c-8b55-19c4c10edec7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz9ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:41Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.817192 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vhm2r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ec8e3f725a9f00d1ee57d57d56f537077b5004dbd9314aae3ec85e3a09287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv57c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vhm2r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:41Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.833374 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8vx6p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7bdaec4-1392-4f87-ba0b-f53c76e47cf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:15Z\\\"}}\" for pod 
\"openshift-multus\"/\"network-metrics-daemon-8vx6p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:41Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.850628 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:41Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.860560 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.860605 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.860621 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.860646 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.860700 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:41Z","lastTransitionTime":"2025-12-02T22:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.869804 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:41Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.887203 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:41Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.906893 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f98df328fe250afd95449707372ef1aea43e355ee20edf8366f510a888b9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:41Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.924781 4903 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6f86958-173b-4746-8493-f8fe5f70a897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eb794b37fcff5149e08f759cd8d4ad6b06be8e9a29cf0e42d04a963533a552a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5db0a9a2093fb2426b4354d9b9ad46f0e83f73472c38ff37ee159dcaea383fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xqdfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:41Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.945584 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:41Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.978180 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a8643ea97f6edef5a93aa0369656bd7d630d555bd3a7c891b71305353faeff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:41Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.991950 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.991995 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.992006 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.992024 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:41 crc kubenswrapper[4903]: I1202 22:58:41.992036 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:41Z","lastTransitionTime":"2025-12-02T22:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.094359 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.094396 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.094406 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.094422 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.094431 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:42Z","lastTransitionTime":"2025-12-02T22:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.198290 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.198358 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.198381 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.198410 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.198429 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:42Z","lastTransitionTime":"2025-12-02T22:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.301685 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.301743 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.301761 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.301785 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.301806 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:42Z","lastTransitionTime":"2025-12-02T22:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
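[annotation] The repeated "failed calling webhook" errors above all reduce to one x509 time-window failure: the network-node-identity webhook at https://127.0.0.1:9743 is presenting a serving certificate whose NotAfter (2025-08-24T17:21:41Z) is months behind the node clock (2025-12-02T22:58:41Z), so every status patch the kubelet sends is rejected before the request body is even considered. Go's TLS client applies the same NotBefore/NotAfter check during chain verification; a minimal stdlib sketch of that test, assuming a hypothetical PEM path /tmp/webhook-serving.crt:

    package main

    import (
        "crypto/x509"
        "encoding/pem"
        "fmt"
        "log"
        "os"
        "time"
    )

    func main() {
        // Hypothetical path; point it at the certificate the webhook serves.
        raw, err := os.ReadFile("/tmp/webhook-serving.crt")
        if err != nil {
            log.Fatal(err)
        }
        block, _ := pem.Decode(raw)
        if block == nil {
            log.Fatal("no PEM block found")
        }
        cert, err := x509.ParseCertificate(block.Bytes)
        if err != nil {
            log.Fatal(err)
        }
        now := time.Now().UTC()
        switch {
        case now.Before(cert.NotBefore):
            fmt.Printf("x509: not yet valid: current time %s is before %s\n",
                now.Format(time.RFC3339), cert.NotBefore.UTC().Format(time.RFC3339))
        case now.After(cert.NotAfter):
            // This is the branch behind every webhook failure in this log.
            fmt.Printf("x509: certificate has expired: current time %s is after %s\n",
                now.Format(time.RFC3339), cert.NotAfter.UTC().Format(time.RFC3339))
        default:
            fmt.Println("certificate is within its validity window")
        }
    }
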
Has your network provider started?"} Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.405700 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.405773 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.405796 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.405822 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.405839 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:42Z","lastTransitionTime":"2025-12-02T22:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.509127 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.509462 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.509486 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.509512 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.509530 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:42Z","lastTransitionTime":"2025-12-02T22:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.613055 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.613153 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.613178 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.613208 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.613232 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:42Z","lastTransitionTime":"2025-12-02T22:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
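[annotation] The payload inside each "Failed to update status for pod" entry is a strategic-merge patch: the status manager sends only the changed conditions plus a $setElementOrder/conditions directive telling the apiserver how to order the merged list (visible in the iptables-alerter patch above). A sketch of assembling the same shape with only encoding/json, reusing values from that patch:

    package main

    import (
        "encoding/json"
        "fmt"
    )

    func main() {
        patch := map[string]any{
            "metadata": map[string]any{"uid": "d75a4c96-2883-4a0b-bab2-0fab2b6c0b49"},
            "status": map[string]any{
                // Directive consumed by the apiserver's strategic-merge-patch
                // logic: it fixes the ordering of the merged conditions list.
                "$setElementOrder/conditions": []map[string]string{
                    {"type": "PodReadyToStartContainers"},
                    {"type": "Initialized"},
                    {"type": "Ready"},
                    {"type": "ContainersReady"},
                    {"type": "PodScheduled"},
                },
                // Only the entries that actually changed ride in the patch.
                "conditions": []map[string]string{
                    {"lastTransitionTime": "2025-12-02T22:58:03Z",
                        "status": "True", "type": "Ready"},
                },
            },
        }
        body, _ := json.Marshal(patch)
        fmt.Println(string(body)) // the inner document seen escaped in the log
    }
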
Has your network provider started?"} Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.715820 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.715898 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.715924 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.715955 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.715978 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:42Z","lastTransitionTime":"2025-12-02T22:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.818904 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.818950 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.818961 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.818982 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.818994 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:42Z","lastTransitionTime":"2025-12-02T22:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.921157 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.921236 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.921260 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.921290 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:42 crc kubenswrapper[4903]: I1202 22:58:42.921313 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:42Z","lastTransitionTime":"2025-12-02T22:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.023783 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.023862 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.023884 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.023913 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.023932 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:43Z","lastTransitionTime":"2025-12-02T22:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.126937 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.126999 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.127017 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.127045 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.127063 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:43Z","lastTransitionTime":"2025-12-02T22:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.229600 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.229704 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.229722 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.229749 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.229767 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:43Z","lastTransitionTime":"2025-12-02T22:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.333152 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.333230 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.333253 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.333288 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.333310 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:43Z","lastTransitionTime":"2025-12-02T22:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.436726 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.436795 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.436819 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.436847 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.436868 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:43Z","lastTransitionTime":"2025-12-02T22:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.538737 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.538788 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.538808 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.538832 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.538849 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:43Z","lastTransitionTime":"2025-12-02T22:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
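[annotation] Every "Node became not ready" line prints the same condition object that setters.go stamps into node status: type Ready, status False, reason KubeletNotReady, plus the CNI message, with fresh heartbeat/transition times on each pass. An ad-hoc stdlib mirror of that structure (the real kubelet uses corev1.NodeCondition from k8s.io/api):

    package main

    import (
        "encoding/json"
        "fmt"
        "time"
    )

    // Ad-hoc stand-in for corev1.NodeCondition, shaped to match the log output.
    type nodeCondition struct {
        Type               string `json:"type"`
        Status             string `json:"status"`
        LastHeartbeatTime  string `json:"lastHeartbeatTime"`
        LastTransitionTime string `json:"lastTransitionTime"`
        Reason             string `json:"reason"`
        Message            string `json:"message"`
    }

    func main() {
        now := time.Now().UTC().Format(time.RFC3339)
        c := nodeCondition{
            Type:               "Ready",
            Status:             "False",
            LastHeartbeatTime:  now,
            LastTransitionTime: now,
            Reason:             "KubeletNotReady",
            Message: "container runtime network not ready: NetworkReady=false " +
                "reason:NetworkPluginNotReady message:Network plugin returns error: " +
                "no CNI configuration file in /etc/kubernetes/cni/net.d/. " +
                "Has your network provider started?",
        }
        out, _ := json.Marshal(c)
        fmt.Println(string(out))
    }
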
Has your network provider started?"} Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.611815 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.611941 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.611827 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.612079 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:58:43 crc kubenswrapper[4903]: E1202 22:58:43.612068 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:58:43 crc kubenswrapper[4903]: E1202 22:58:43.612233 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:58:43 crc kubenswrapper[4903]: E1202 22:58:43.612349 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:58:43 crc kubenswrapper[4903]: E1202 22:58:43.612584 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
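[annotation] The loop above cannot break until a network plugin writes a config into /etc/kubernetes/cni/net.d/. The runtime's libcni loader treats .conf, .conflist, and .json entries in that directory as candidate network configs (an assumption worth verifying against the libcni version in use); a stdlib approximation of the readiness probe:

    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    func main() {
        dir := "/etc/kubernetes/cni/net.d"
        entries, err := os.ReadDir(dir)
        if err != nil {
            fmt.Println("cannot read", dir, "-", err)
            return
        }
        found := 0
        for _, e := range entries {
            // Extensions libcni scans for when listing network configs.
            switch filepath.Ext(e.Name()) {
            case ".conf", ".conflist", ".json":
                fmt.Println("network config:", filepath.Join(dir, e.Name()))
                found++
            }
        }
        if found == 0 {
            // The condition the kubelet keeps reporting above.
            fmt.Println("no CNI configuration file in", dir+"/")
        }
    }
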
pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.642106 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.642215 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.642237 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.642264 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.642283 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:43Z","lastTransitionTime":"2025-12-02T22:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.745770 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.745990 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.746010 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.746036 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.746054 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:43Z","lastTransitionTime":"2025-12-02T22:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.849560 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.849626 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.849671 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.849700 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.849718 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:43Z","lastTransitionTime":"2025-12-02T22:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.952682 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.952732 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.952749 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.952772 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:43 crc kubenswrapper[4903]: I1202 22:58:43.952790 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:43Z","lastTransitionTime":"2025-12-02T22:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.056361 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.056427 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.056443 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.056469 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.056487 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:44Z","lastTransitionTime":"2025-12-02T22:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.159143 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.159202 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.159219 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.159244 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.159261 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:44Z","lastTransitionTime":"2025-12-02T22:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.262906 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.262995 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.263019 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.263142 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.263161 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:44Z","lastTransitionTime":"2025-12-02T22:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.366830 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.366891 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.366908 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.366933 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.366950 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:44Z","lastTransitionTime":"2025-12-02T22:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.469913 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.470005 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.470024 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.470052 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.470071 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:44Z","lastTransitionTime":"2025-12-02T22:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.573321 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.573394 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.573412 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.573440 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.573459 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:44Z","lastTransitionTime":"2025-12-02T22:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.676523 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.676573 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.676584 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.676602 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.676616 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:44Z","lastTransitionTime":"2025-12-02T22:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.782893 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.782953 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.782964 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.782982 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.782996 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:44Z","lastTransitionTime":"2025-12-02T22:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.885865 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.885899 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.885907 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.885920 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.885929 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:44Z","lastTransitionTime":"2025-12-02T22:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.988277 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.988320 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.988328 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.988343 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:44 crc kubenswrapper[4903]: I1202 22:58:44.988352 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:44Z","lastTransitionTime":"2025-12-02T22:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.090569 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.090599 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.090607 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.090621 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.090629 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:45Z","lastTransitionTime":"2025-12-02T22:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.193247 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.193300 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.193310 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.193326 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.193336 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:45Z","lastTransitionTime":"2025-12-02T22:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.296157 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.296202 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.296212 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.296229 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.296241 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:45Z","lastTransitionTime":"2025-12-02T22:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.398509 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.398565 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.398582 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.398606 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.398624 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:45Z","lastTransitionTime":"2025-12-02T22:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.502069 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.502115 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.502126 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.502145 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.502157 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:45Z","lastTransitionTime":"2025-12-02T22:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.605090 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.605149 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.605166 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.605188 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.605203 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:45Z","lastTransitionTime":"2025-12-02T22:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.611860 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.611918 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.611924 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:45 crc kubenswrapper[4903]: E1202 22:58:45.612041 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.612063 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:58:45 crc kubenswrapper[4903]: E1202 22:58:45.612201 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:58:45 crc kubenswrapper[4903]: E1202 22:58:45.612340 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:58:45 crc kubenswrapper[4903]: E1202 22:58:45.612476 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
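[annotation] Each "Error syncing pod, skipping" entry is not fatal: the pod worker abandons that sync pass and the pod is requeued, which is why the same four pods reappear every couple of seconds until the network becomes ready. An illustrative retry loop in the same spirit (a sketch of the pattern, not kubelet's actual pod-worker code):

    package main

    import (
        "errors"
        "fmt"
        "time"
    )

    var errNetworkNotReady = errors.New("network is not ready")

    // syncPod stands in for the real sync; it fails until CNI config appears.
    func syncPod(attempt int) error {
        if attempt < 3 {
            return errNetworkNotReady
        }
        return nil
    }

    func main() {
        backoff := 500 * time.Millisecond
        for attempt := 1; ; attempt++ {
            if err := syncPod(attempt); err != nil {
                // Mirrors "Error syncing pod, skipping": log, wait, retry.
                fmt.Printf("error syncing pod, skipping: %v (retry in %s)\n", err, backoff)
                time.Sleep(backoff)
                if backoff < 8*time.Second {
                    backoff *= 2
                }
                continue
            }
            fmt.Println("pod synced on attempt", attempt)
            return
        }
    }
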
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.627179 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.627221 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.627233 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.627252 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.627265 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:45Z","lastTransitionTime":"2025-12-02T22:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:45 crc kubenswrapper[4903]: E1202 22:58:45.645177 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
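[annotation] The node-status patch above also restates the resource picture: memory capacity 32865360Ki against 32404560Ki allocatable, and 12 CPUs against 11800m. The gap is the system/kube reservation; checking the arithmetic:

    package main

    import "fmt"

    func main() {
        const (
            memCapacityKi    = 32865360 // "capacity" in the patch above
            memAllocatableKi = 32404560 // "allocatable"
            cpuCapacityMilli = 12000    // 12 cores
            cpuAllocMilli    = 11800    // "11800m"
        )
        reservedKi := memCapacityKi - memAllocatableKi
        fmt.Printf("memory reserved: %dKi (%dMi)\n", reservedKi, reservedKi/1024)
        fmt.Printf("cpu reserved: %dm\n", cpuCapacityMilli-cpuAllocMilli)
        // Output: memory reserved: 460800Ki (450Mi) / cpu reserved: 200m
    }
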
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5eef24b0-10c3-4ee6-bf4d-784c2d2e5050\\\",\\\"systemUUID\\\":\\\"3787324b-4c61-413b-8321-e9e2f283e2ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:45Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.650978 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.651037 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.651047 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.651067 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.651353 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:45Z","lastTransitionTime":"2025-12-02T22:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:45 crc kubenswrapper[4903]: E1202 22:58:45.671586 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5eef24b0-10c3-4ee6-bf4d-784c2d2e5050\\\",\\\"systemUUID\\\":\\\"3787324b-4c61-413b-8321-e9e2f283e2ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:45Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.676303 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.676371 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.676382 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.676401 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.676433 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:45Z","lastTransitionTime":"2025-12-02T22:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:45 crc kubenswrapper[4903]: E1202 22:58:45.689925 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5eef24b0-10c3-4ee6-bf4d-784c2d2e5050\\\",\\\"systemUUID\\\":\\\"3787324b-4c61-413b-8321-e9e2f283e2ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:45Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.695735 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.695814 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.695836 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.695867 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.695897 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:45Z","lastTransitionTime":"2025-12-02T22:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:45 crc kubenswrapper[4903]: E1202 22:58:45.715203 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5eef24b0-10c3-4ee6-bf4d-784c2d2e5050\\\",\\\"systemUUID\\\":\\\"3787324b-4c61-413b-8321-e9e2f283e2ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:45Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.719592 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.719702 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.719728 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.719759 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.719780 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:45Z","lastTransitionTime":"2025-12-02T22:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:45 crc kubenswrapper[4903]: E1202 22:58:45.735126 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5eef24b0-10c3-4ee6-bf4d-784c2d2e5050\\\",\\\"systemUUID\\\":\\\"3787324b-4c61-413b-8321-e9e2f283e2ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:45Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:45 crc kubenswrapper[4903]: E1202 22:58:45.735249 4903 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.736935 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.736959 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.736969 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.736985 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.736996 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:45Z","lastTransitionTime":"2025-12-02T22:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.840268 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.840326 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.840342 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.840367 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.840385 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:45Z","lastTransitionTime":"2025-12-02T22:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.942936 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.942990 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.943007 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.943033 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:45 crc kubenswrapper[4903]: I1202 22:58:45.943050 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:45Z","lastTransitionTime":"2025-12-02T22:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.045471 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.045524 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.045535 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.045549 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.045558 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:46Z","lastTransitionTime":"2025-12-02T22:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.148202 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.148251 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.148262 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.148279 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.148291 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:46Z","lastTransitionTime":"2025-12-02T22:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.250454 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.250497 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.250506 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.250522 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.250531 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:46Z","lastTransitionTime":"2025-12-02T22:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.352580 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.352643 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.352696 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.352722 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.352739 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:46Z","lastTransitionTime":"2025-12-02T22:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.454913 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.454969 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.455030 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.455056 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.455073 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:46Z","lastTransitionTime":"2025-12-02T22:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.558018 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.558056 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.558066 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.558082 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.558095 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:46Z","lastTransitionTime":"2025-12-02T22:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.612812 4903 scope.go:117] "RemoveContainer" containerID="1b8ab57f97826cc97060179fa188f695ed495aca141f6555880e4e902409fd14" Dec 02 22:58:46 crc kubenswrapper[4903]: E1202 22:58:46.613189 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pz9ff_openshift-ovn-kubernetes(99ab90b8-4bb9-418c-8b55-19c4c10edec7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.660519 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.660593 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.660610 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.660633 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.660645 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:46Z","lastTransitionTime":"2025-12-02T22:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.762877 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.762925 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.762940 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.762959 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.762972 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:46Z","lastTransitionTime":"2025-12-02T22:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.865152 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.865207 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.865221 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.865237 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.865248 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:46Z","lastTransitionTime":"2025-12-02T22:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.967580 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.967617 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.967627 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.967643 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:46 crc kubenswrapper[4903]: I1202 22:58:46.967668 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:46Z","lastTransitionTime":"2025-12-02T22:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.013617 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7bdaec4-1392-4f87-ba0b-f53c76e47cf4-metrics-certs\") pod \"network-metrics-daemon-8vx6p\" (UID: \"e7bdaec4-1392-4f87-ba0b-f53c76e47cf4\") " pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:58:47 crc kubenswrapper[4903]: E1202 22:58:47.013798 4903 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 22:58:47 crc kubenswrapper[4903]: E1202 22:58:47.013862 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7bdaec4-1392-4f87-ba0b-f53c76e47cf4-metrics-certs podName:e7bdaec4-1392-4f87-ba0b-f53c76e47cf4 nodeName:}" failed. No retries permitted until 2025-12-02 22:59:19.013846132 +0000 UTC m=+97.722400415 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7bdaec4-1392-4f87-ba0b-f53c76e47cf4-metrics-certs") pod "network-metrics-daemon-8vx6p" (UID: "e7bdaec4-1392-4f87-ba0b-f53c76e47cf4") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.070405 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.070465 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.070483 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.070506 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.070526 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:47Z","lastTransitionTime":"2025-12-02T22:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.173484 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.173547 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.173564 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.173591 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.173611 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:47Z","lastTransitionTime":"2025-12-02T22:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.276552 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.276628 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.276645 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.276707 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.276728 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:47Z","lastTransitionTime":"2025-12-02T22:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.379747 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.379818 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.379841 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.379869 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.379889 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:47Z","lastTransitionTime":"2025-12-02T22:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.482644 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.482748 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.482771 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.482796 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.482812 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:47Z","lastTransitionTime":"2025-12-02T22:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.585631 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.585720 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.585740 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.585768 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.585791 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:47Z","lastTransitionTime":"2025-12-02T22:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.612075 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.612163 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.612246 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.612299 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:58:47 crc kubenswrapper[4903]: E1202 22:58:47.612285 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:58:47 crc kubenswrapper[4903]: E1202 22:58:47.612391 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:58:47 crc kubenswrapper[4903]: E1202 22:58:47.612492 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:58:47 crc kubenswrapper[4903]: E1202 22:58:47.612627 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.688845 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.688882 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.688893 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.688908 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.688917 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:47Z","lastTransitionTime":"2025-12-02T22:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.792015 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.792083 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.792105 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.792134 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.792155 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:47Z","lastTransitionTime":"2025-12-02T22:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.896035 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.896104 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.896129 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.896159 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.896179 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:47Z","lastTransitionTime":"2025-12-02T22:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.998596 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.998670 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.998689 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.998712 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:47 crc kubenswrapper[4903]: I1202 22:58:47.998728 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:47Z","lastTransitionTime":"2025-12-02T22:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.101900 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.101974 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.101989 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.102006 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.102018 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:48Z","lastTransitionTime":"2025-12-02T22:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.204397 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.204456 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.204474 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.204499 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.204517 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:48Z","lastTransitionTime":"2025-12-02T22:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.306917 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.306971 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.306983 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.307003 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.307016 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:48Z","lastTransitionTime":"2025-12-02T22:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.409542 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.409588 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.409599 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.409622 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.409635 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:48Z","lastTransitionTime":"2025-12-02T22:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.512577 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.512631 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.512679 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.512706 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.512723 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:48Z","lastTransitionTime":"2025-12-02T22:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.615024 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.615073 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.615085 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.615102 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.615113 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:48Z","lastTransitionTime":"2025-12-02T22:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.644225 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s4nbg_a689512c-b6fd-4ffe-af54-dbb8f45ab9e5/kube-multus/0.log" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.644283 4903 generic.go:334] "Generic (PLEG): container finished" podID="a689512c-b6fd-4ffe-af54-dbb8f45ab9e5" containerID="4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b" exitCode=1 Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.644318 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s4nbg" event={"ID":"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5","Type":"ContainerDied","Data":"4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b"} Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.644753 4903 scope.go:117] "RemoveContainer" containerID="4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.657099 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f98df328fe250afd95449707372ef1aea43e355ee20edf8366f510a888b9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\"
:\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.669234 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6f86958-173b-4746-8493-f8fe5f70a897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eb794b37fcff5149e08f759cd8d4ad6b06be8e9a29cf0e42d04a963533a552a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5db0a9a2093fb2426b4354d9b9ad46f0e83f73472c38ff37ee159dcaea383fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xqdfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.683587 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.699405 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 
1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.712811 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a8643ea97f6edef5a93aa0369656bd7d630d555bd3a7c891b71305353faeff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.716677 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.716705 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.716713 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.716729 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.716740 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:48Z","lastTransitionTime":"2025-12-02T22:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.727004 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.741300 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.756389 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.771617 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:58:47Z\\\",\\\"message\\\":\\\"2025-12-02T22:58:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_66be8414-3bdb-4cca-a6ce-7b817c24bc20\\\\n2025-12-02T22:58:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_66be8414-3bdb-4cca-a6ce-7b817c24bc20 to /host/opt/cni/bin/\\\\n2025-12-02T22:58:02Z [verbose] multus-daemon started\\\\n2025-12-02T22:58:02Z [verbose] Readiness Indicator file check\\\\n2025-12-02T22:58:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.793334 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629037fd709058c15b3b4a4560aaecb34e6368ab243dcf573001b483a3033ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d424
053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.819442 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.819486 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.819495 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.819512 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.819523 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:48Z","lastTransitionTime":"2025-12-02T22:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.819041 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8ab57f97826cc97060179fa188f695ed495aca
141f6555880e4e902409fd14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b8ab57f97826cc97060179fa188f695ed495aca141f6555880e4e902409fd14\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:58:32Z\\\",\\\"message\\\":\\\"sNode crc took: 1.707692ms\\\\nI1202 22:58:32.666622 6544 factory.go:1336] Added *v1.Node event handler 7\\\\nI1202 22:58:32.666669 6544 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1202 22:58:32.666724 6544 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 22:58:32.666783 6544 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:58:32.666858 6544 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1202 22:58:32.666863 6544 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 22:58:32.666933 6544 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 22:58:32.666946 6544 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:58:32.666917 6544 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1202 22:58:32.666990 6544 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 22:58:32.667040 6544 factory.go:656] Stopping watch factory\\\\nI1202 22:58:32.667089 6544 ovnkube.go:599] Stopped ovnkube\\\\nI1202 22:58:32.667043 6544 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 22:58:32.667055 6544 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 22:58:32.667164 6544 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1202 22:58:32.667375 6544 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pz9ff_openshift-ovn-kubernetes(99ab90b8-4bb9-418c-8b55-19c4c10edec7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz9ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.837756 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"823a3cb9-fe97-46de-8791-375103e2ec0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242c60254f2b4a3c9e849384d813e56c75cf2f60705e0b1e7ca2707663b3f029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7dac854f7481fe71dcc8976f658e52ad759ca0faf9205d50b1bec7a28575c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://716e5b24fb81d3b6e884f43a9299fbe4397d133daa4727c659c633416c14bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.853189 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1607bcf-a910-4737-ba76-e982301faac5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075bd1184d0ced61c9610e5961f455627fc04f846239edd3b10032dd44fab539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e8053e7bef3d1577c025c515ad48920fcb7077167f8fc98b147d6499e38a23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628d292a1edf1c1db86af8057d8f7c50cbd2c45477075ef846a24d7cfd444bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c21835717fd04a4f8fbd1fce3b004c111edc3d93408e2921440877b3bd652fc\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21835717fd04a4f8fbd1fce3b004c111edc3d93408e2921440877b3bd652fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.867610 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vhm2r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ec8e3f725a9f00d1ee57d57d56f537077b5004dbd9314aae3ec85e3a09287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv57c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}]
,\\\"startTime\\\":\\\"2025-12-02T22:58:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vhm2r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.880715 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8vx6p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7bdaec4-1392-4f87-ba0b-f53c76e47cf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8vx6p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.891922 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.907205 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:48Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.922455 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.922491 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.922500 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.922516 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:48 crc kubenswrapper[4903]: I1202 22:58:48.922551 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:48Z","lastTransitionTime":"2025-12-02T22:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.025308 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.025357 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.025370 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.025389 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.025401 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:49Z","lastTransitionTime":"2025-12-02T22:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.128068 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.128138 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.128151 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.128168 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.128200 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:49Z","lastTransitionTime":"2025-12-02T22:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.230806 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.230859 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.230872 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.230893 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.230905 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:49Z","lastTransitionTime":"2025-12-02T22:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.333595 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.333644 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.333687 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.333707 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.333719 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:49Z","lastTransitionTime":"2025-12-02T22:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.436252 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.436302 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.436317 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.436338 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.436355 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:49Z","lastTransitionTime":"2025-12-02T22:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.539010 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.539068 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.539077 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.539095 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.539106 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:49Z","lastTransitionTime":"2025-12-02T22:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.612182 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.612213 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.612313 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.612194 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:58:49 crc kubenswrapper[4903]: E1202 22:58:49.612330 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:58:49 crc kubenswrapper[4903]: E1202 22:58:49.612438 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:58:49 crc kubenswrapper[4903]: E1202 22:58:49.612529 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:58:49 crc kubenswrapper[4903]: E1202 22:58:49.612581 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.641818 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.641853 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.641863 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.641879 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.641890 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:49Z","lastTransitionTime":"2025-12-02T22:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.649701 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s4nbg_a689512c-b6fd-4ffe-af54-dbb8f45ab9e5/kube-multus/0.log" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.649790 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s4nbg" event={"ID":"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5","Type":"ContainerStarted","Data":"940f76af114eeec075ebe1a320bc817b99c5b9f687325fa076bd2fffa0291c50"} Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.666409 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940f76af114eeec075ebe1a320bc817b99c5b9f687325fa076bd2fffa0291c50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:58:47Z\\\",\\\"message\\\":\\\"2025-12-02T22:58:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_66be8414-3bdb-4cca-a6ce-7b817c24bc20\\\\n2025-12-02T22:58:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_66be8414-3bdb-4cca-a6ce-7b817c24bc20 to /host/opt/cni/bin/\\\\n2025-12-02T22:58:02Z [verbose] multus-daemon started\\\\n2025-12-02T22:58:02Z [verbose] Readiness Indicator file check\\\\n2025-12-02T22:58:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.682052 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629037fd709058c15b3b4a4560aaecb34e6368ab243dcf573001b483a3033ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.703354 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8ab57f97826cc97060179fa188f695ed495aca141f6555880e4e902409fd14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b8ab57f97826cc97060179fa188f695ed495aca141f6555880e4e902409fd14\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:58:32Z\\\",\\\"message\\\":\\\"sNode crc took: 1.707692ms\\\\nI1202 22:58:32.666622 6544 factory.go:1336] Added *v1.Node event handler 7\\\\nI1202 22:58:32.666669 6544 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1202 22:58:32.666724 6544 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 22:58:32.666783 6544 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:58:32.666858 6544 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1202 22:58:32.666863 6544 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 22:58:32.666933 6544 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 22:58:32.666946 6544 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:58:32.666917 6544 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1202 22:58:32.666990 6544 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 22:58:32.667040 6544 factory.go:656] Stopping watch factory\\\\nI1202 22:58:32.667089 6544 ovnkube.go:599] Stopped ovnkube\\\\nI1202 22:58:32.667043 6544 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 22:58:32.667055 6544 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 22:58:32.667164 6544 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1202 22:58:32.667375 6544 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pz9ff_openshift-ovn-kubernetes(99ab90b8-4bb9-418c-8b55-19c4c10edec7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz9ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.717225 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"823a3cb9-fe97-46de-8791-375103e2ec0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242c60254f2b4a3c9e849384d813e56c75cf2f60705e0b1e7ca2707663b3f029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7dac854f7481fe71dcc8976f658e52ad759ca0faf9205d50b1bec7a28575c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://716e5b24fb81d3b6e884f43a9299fbe4397d133daa4727c659c633416c14bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.730806 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1607bcf-a910-4737-ba76-e982301faac5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075bd1184d0ced61c9610e5961f455627fc04f846239edd3b10032dd44fab539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e8053e7bef3d1577c025c515ad48920fcb7077167f8fc98b147d6499e38a23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628d292a1edf1c1db86af8057d8f7c50cbd2c45477075ef846a24d7cfd444bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c21835717fd04a4f8fbd1fce3b004c111edc3d93408e2921440877b3bd652fc\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21835717fd04a4f8fbd1fce3b004c111edc3d93408e2921440877b3bd652fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.743739 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.743780 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.743792 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.743812 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.743824 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:49Z","lastTransitionTime":"2025-12-02T22:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.745092 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.756144 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.767859 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.779048 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vhm2r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ec8e3f725a9f00d1ee57d57d56f537077b5004dbd9314aae3ec85e3a09287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv57c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vhm2r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.792732 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8vx6p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7bdaec4-1392-4f87-ba0b-f53c76e47cf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8vx6p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.807101 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.817629 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.832539 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.845261 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f98df328fe250afd95449707372ef1aea43e355ee20edf8366f510a888b9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.847279 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.847360 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.847375 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.847392 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.847427 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:49Z","lastTransitionTime":"2025-12-02T22:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.858900 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6f86958-173b-4746-8493-f8fe5f70a897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eb794b37fcff5149e08f759cd8d4ad6b06be8e9a29cf0e42d04a963533a552a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5db0a9a2093fb2426b4354d9b9ad46f0e83f73472c38ff37ee159dcaea383fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xqdfr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.873743 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.887333 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a8643ea97f6edef5a93aa0369656bd7d630d555bd3a7c891b71305353faeff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:49Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.949924 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.949968 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.949980 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.950001 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:49 crc kubenswrapper[4903]: I1202 22:58:49.950013 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:49Z","lastTransitionTime":"2025-12-02T22:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:50 crc kubenswrapper[4903]: I1202 22:58:50.052204 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:50 crc kubenswrapper[4903]: I1202 22:58:50.052236 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:50 crc kubenswrapper[4903]: I1202 22:58:50.052245 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:50 crc kubenswrapper[4903]: I1202 22:58:50.052257 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:50 crc kubenswrapper[4903]: I1202 22:58:50.052266 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:50Z","lastTransitionTime":"2025-12-02T22:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:50 crc kubenswrapper[4903]: I1202 22:58:50.153970 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:50 crc kubenswrapper[4903]: I1202 22:58:50.154002 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:50 crc kubenswrapper[4903]: I1202 22:58:50.154010 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:50 crc kubenswrapper[4903]: I1202 22:58:50.154023 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:50 crc kubenswrapper[4903]: I1202 22:58:50.154033 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:50Z","lastTransitionTime":"2025-12-02T22:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:50 crc kubenswrapper[4903]: I1202 22:58:50.257325 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:50 crc kubenswrapper[4903]: I1202 22:58:50.257405 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:50 crc kubenswrapper[4903]: I1202 22:58:50.257431 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:50 crc kubenswrapper[4903]: I1202 22:58:50.257463 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:50 crc kubenswrapper[4903]: I1202 22:58:50.257486 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:50Z","lastTransitionTime":"2025-12-02T22:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 02 22:58:50 crc kubenswrapper[4903]: I1202 22:58:50.360844 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:58:50 crc kubenswrapper[4903]: I1202 22:58:50.360902 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:58:50 crc kubenswrapper[4903]: I1202 22:58:50.360918 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:58:50 crc kubenswrapper[4903]: I1202 22:58:50.360942 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:58:50 crc kubenswrapper[4903]: I1202 22:58:50.360959 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:50Z","lastTransitionTime":"2025-12-02T22:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
[the preceding five-entry node-status cycle repeats 12 more times at ~100 ms intervals, from I1202 22:58:50.464195 through I1202 22:58:51.597276, identical except for advancing timestamps]
Dec 02 22:58:51 crc kubenswrapper[4903]: I1202 22:58:51.611434 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 22:58:51 crc kubenswrapper[4903]: I1202 22:58:51.611475 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 22:58:51 crc kubenswrapper[4903]: I1202 22:58:51.611818 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p"
Dec 02 22:58:51 crc kubenswrapper[4903]: I1202 22:58:51.611883 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 22:58:51 crc kubenswrapper[4903]: E1202 22:58:51.611978 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4"
Dec 02 22:58:51 crc kubenswrapper[4903]: E1202 22:58:51.612254 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 02 22:58:51 crc kubenswrapper[4903]: E1202 22:58:51.612390 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:58:51 crc kubenswrapper[4903]: E1202 22:58:51.612566 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:58:51 crc kubenswrapper[4903]: I1202 22:58:51.623894 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 02 22:58:51 crc kubenswrapper[4903]: I1202 22:58:51.626044 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0e
a23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:51Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:51 crc kubenswrapper[4903]: I1202 22:58:51.637152 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a8643ea97f6edef5a93aa0369656bd7d630d555bd3a7c891b71305353faeff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:51Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:51 crc kubenswrapper[4903]: I1202 22:58:51.652426 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:51Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:51 crc kubenswrapper[4903]: I1202 22:58:51.665684 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:51Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:51 crc kubenswrapper[4903]: I1202 22:58:51.680434 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940f76af114eeec075ebe1a320bc817b99c5b9f687325fa076bd2fffa0291c50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:58:47Z\\\",\\\"message\\\":\\\"2025-12-02T22:58:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_66be8414-3bdb-4cca-a6ce-7b817c24bc20\\\\n2025-12-02T22:58:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_66be8414-3bdb-4cca-a6ce-7b817c24bc20 to /host/opt/cni/bin/\\\\n2025-12-02T22:58:02Z [verbose] multus-daemon started\\\\n2025-12-02T22:58:02Z [verbose] Readiness Indicator file check\\\\n2025-12-02T22:58:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:51Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:51 crc kubenswrapper[4903]: I1202 22:58:51.698873 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629037fd709058c15b3b4a4560aaecb34e6368ab243dcf573001b483a3033ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:51Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:51 crc kubenswrapper[4903]: I1202 22:58:51.699312 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:51 crc kubenswrapper[4903]: I1202 22:58:51.699337 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:51 crc 
kubenswrapper[4903]: I1202 22:58:51.699345 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:51 crc kubenswrapper[4903]: I1202 22:58:51.699358 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:51 crc kubenswrapper[4903]: I1202 22:58:51.699368 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:51Z","lastTransitionTime":"2025-12-02T22:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:51 crc kubenswrapper[4903]: I1202 22:58:51.720850 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8ab57f97826cc97060179fa188f695ed495aca
141f6555880e4e902409fd14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b8ab57f97826cc97060179fa188f695ed495aca141f6555880e4e902409fd14\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:58:32Z\\\",\\\"message\\\":\\\"sNode crc took: 1.707692ms\\\\nI1202 22:58:32.666622 6544 factory.go:1336] Added *v1.Node event handler 7\\\\nI1202 22:58:32.666669 6544 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1202 22:58:32.666724 6544 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 22:58:32.666783 6544 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:58:32.666858 6544 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1202 22:58:32.666863 6544 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 22:58:32.666933 6544 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 22:58:32.666946 6544 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:58:32.666917 6544 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1202 22:58:32.666990 6544 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 22:58:32.667040 6544 factory.go:656] Stopping watch factory\\\\nI1202 22:58:32.667089 6544 ovnkube.go:599] Stopped ovnkube\\\\nI1202 22:58:32.667043 6544 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 22:58:32.667055 6544 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 22:58:32.667164 6544 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1202 22:58:32.667375 6544 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pz9ff_openshift-ovn-kubernetes(99ab90b8-4bb9-418c-8b55-19c4c10edec7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz9ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:51Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:51 crc kubenswrapper[4903]: I1202 22:58:51.735142 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"823a3cb9-fe97-46de-8791-375103e2ec0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242c60254f2b4a3c9e849384d813e56c75cf2f60705e0b1e7ca2707663b3f029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7dac854f7481fe71dcc8976f658e52ad759ca0faf9205d50b1bec7a28575c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://716e5b24fb81d3b6e884f43a9299fbe4397d133daa4727c659c633416c14bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:51Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:51 crc kubenswrapper[4903]: I1202 22:58:51.750327 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1607bcf-a910-4737-ba76-e982301faac5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075bd1184d0ced61c9610e5961f455627fc04f846239edd3b10032dd44fab539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e8053e7bef3d1577c025c515ad48920fcb7077167f8fc98b147d6499e38a23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628d292a1edf1c1db86af8057d8f7c50cbd2c45477075ef846a24d7cfd444bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c21835717fd04a4f8fbd1fce3b004c111edc3d93408e2921440877b3bd652fc\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21835717fd04a4f8fbd1fce3b004c111edc3d93408e2921440877b3bd652fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:51Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:51 crc kubenswrapper[4903]: I1202 22:58:51.768296 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:51Z is after 
2025-08-24T17:21:41Z" Dec 02 22:58:51 crc kubenswrapper[4903]: I1202 22:58:51.783248 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vhm2r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ec8e3f725a9f00d1ee57d57d56f537077b5004dbd9314aae3ec85e3a09287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv57c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vhm2r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:51Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:51 crc kubenswrapper[4903]: I1202 22:58:51.797566 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8vx6p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7bdaec4-1392-4f87-ba0b-f53c76e47cf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8vx6p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:51Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:51 crc kubenswrapper[4903]: I1202 22:58:51.801156 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:51 crc kubenswrapper[4903]: I1202 22:58:51.801179 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:51 crc kubenswrapper[4903]: I1202 22:58:51.801188 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 02 22:58:51 crc kubenswrapper[4903]: I1202 22:58:51.801204 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:51 crc kubenswrapper[4903]: I1202 22:58:51.801218 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:51Z","lastTransitionTime":"2025-12-02T22:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:51 crc kubenswrapper[4903]: I1202 22:58:51.814066 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:51Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:51 crc kubenswrapper[4903]: I1202 22:58:51.827627 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:51Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:51 crc kubenswrapper[4903]: I1202 22:58:51.841305 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6f86958-173b-4746-8493-f8fe5f70a897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eb794b37fcff5149e08f759cd8d4ad6b06be8e9a29cf0e42d04a963533a552a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5db0a9a2093fb2426b4354d9b9ad46f0e83f73472c38ff37ee159dcaea383fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:
13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xqdfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:51Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:51 crc kubenswrapper[4903]: I1202 22:58:51.855985 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:51Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:51 crc kubenswrapper[4903]: I1202 22:58:51.870150 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f98df328fe250afd95449707372ef1aea43e355ee20edf8366f510a888b9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:51Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:51 crc kubenswrapper[4903]: I1202 22:58:51.903742 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:51 crc kubenswrapper[4903]: I1202 22:58:51.903766 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:51 crc kubenswrapper[4903]: I1202 22:58:51.903777 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:51 crc kubenswrapper[4903]: I1202 22:58:51.903796 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:51 crc kubenswrapper[4903]: I1202 22:58:51.903808 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:51Z","lastTransitionTime":"2025-12-02T22:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.005823 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.005862 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.005874 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.005889 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.005901 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:52Z","lastTransitionTime":"2025-12-02T22:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.108286 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.108332 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.108342 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.108360 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.108371 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:52Z","lastTransitionTime":"2025-12-02T22:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.210711 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.210776 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.210793 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.210822 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.210842 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:52Z","lastTransitionTime":"2025-12-02T22:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.313542 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.313584 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.313593 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.313610 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.313620 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:52Z","lastTransitionTime":"2025-12-02T22:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.415696 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.415748 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.415765 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.415790 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.415806 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:52Z","lastTransitionTime":"2025-12-02T22:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.518271 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.518317 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.518329 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.518352 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.518363 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:52Z","lastTransitionTime":"2025-12-02T22:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.621263 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.621300 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.621309 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.621324 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.621333 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:52Z","lastTransitionTime":"2025-12-02T22:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.723869 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.723948 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.723957 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.723978 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.723989 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:52Z","lastTransitionTime":"2025-12-02T22:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.826995 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.827062 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.827074 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.827094 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.827105 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:52Z","lastTransitionTime":"2025-12-02T22:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.930303 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.930383 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.930403 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.930433 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:52 crc kubenswrapper[4903]: I1202 22:58:52.930456 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:52Z","lastTransitionTime":"2025-12-02T22:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.037476 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.037529 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.037538 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.037551 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.037562 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:53Z","lastTransitionTime":"2025-12-02T22:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.140337 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.140391 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.140404 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.140424 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.140436 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:53Z","lastTransitionTime":"2025-12-02T22:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.242981 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.243079 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.243092 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.243115 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.243130 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:53Z","lastTransitionTime":"2025-12-02T22:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.345542 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.345600 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.345616 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.345637 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.345683 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:53Z","lastTransitionTime":"2025-12-02T22:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.449370 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.449451 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.449473 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.449503 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.449524 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:53Z","lastTransitionTime":"2025-12-02T22:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.552625 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.552704 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.552720 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.552745 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.552759 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:53Z","lastTransitionTime":"2025-12-02T22:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.612300 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.612333 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.612383 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.612427 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:58:53 crc kubenswrapper[4903]: E1202 22:58:53.612521 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:58:53 crc kubenswrapper[4903]: E1202 22:58:53.612590 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:58:53 crc kubenswrapper[4903]: E1202 22:58:53.612796 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:58:53 crc kubenswrapper[4903]: E1202 22:58:53.612876 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.655404 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.655465 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.655484 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.655507 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.655524 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:53Z","lastTransitionTime":"2025-12-02T22:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.757997 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.758041 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.758052 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.758068 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.758078 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:53Z","lastTransitionTime":"2025-12-02T22:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.860889 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.860954 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.860972 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.860999 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.861019 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:53Z","lastTransitionTime":"2025-12-02T22:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.963060 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.963120 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.963143 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.963175 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:53 crc kubenswrapper[4903]: I1202 22:58:53.963197 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:53Z","lastTransitionTime":"2025-12-02T22:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.066445 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.066502 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.066519 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.066545 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.066562 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:54Z","lastTransitionTime":"2025-12-02T22:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.169113 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.169158 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.169169 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.169186 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.169198 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:54Z","lastTransitionTime":"2025-12-02T22:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.271113 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.271153 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.271163 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.271179 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.271191 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:54Z","lastTransitionTime":"2025-12-02T22:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.372984 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.373024 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.373032 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.373046 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.373057 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:54Z","lastTransitionTime":"2025-12-02T22:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.475031 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.475065 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.475074 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.475087 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.475097 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:54Z","lastTransitionTime":"2025-12-02T22:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.577352 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.577426 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.577443 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.577467 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.577484 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:54Z","lastTransitionTime":"2025-12-02T22:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.680481 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.680534 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.680545 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.680563 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.680576 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:54Z","lastTransitionTime":"2025-12-02T22:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.783196 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.783258 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.783282 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.783310 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.783330 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:54Z","lastTransitionTime":"2025-12-02T22:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.885387 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.885474 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.885487 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.885504 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.885517 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:54Z","lastTransitionTime":"2025-12-02T22:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.987841 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.987908 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.987925 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.987950 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:54 crc kubenswrapper[4903]: I1202 22:58:54.987970 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:54Z","lastTransitionTime":"2025-12-02T22:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.091174 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.091234 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.091250 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.091276 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.091294 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:55Z","lastTransitionTime":"2025-12-02T22:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.193761 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.193797 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.193805 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.193820 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.193831 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:55Z","lastTransitionTime":"2025-12-02T22:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.297553 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.297590 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.297598 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.297614 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.297622 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:55Z","lastTransitionTime":"2025-12-02T22:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.400023 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.400114 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.400127 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.400150 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.400163 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:55Z","lastTransitionTime":"2025-12-02T22:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.503382 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.503452 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.503474 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.503503 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.503523 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:55Z","lastTransitionTime":"2025-12-02T22:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.607031 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.607089 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.607106 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.607133 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.607151 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:55Z","lastTransitionTime":"2025-12-02T22:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.612423 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.612458 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.612502 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.612456 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p"
Dec 02 22:58:55 crc kubenswrapper[4903]: E1202 22:58:55.612571 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 02 22:58:55 crc kubenswrapper[4903]: E1202 22:58:55.612685 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 02 22:58:55 crc kubenswrapper[4903]: E1202 22:58:55.612778 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 02 22:58:55 crc kubenswrapper[4903]: E1202 22:58:55.612921 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4"
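
Two seconds after the first attempt, the same four pods are retried and skipped again. The gating logic is straightforward: a pod that needs a new sandbox cannot be synced while NetworkReady=false, unless it runs on the host network. A hedged sketch of that control flow (the dataclass, function name, and fields are illustrative, not kubelet source; the printed strings mirror the util.go and pod_workers.go lines above):

from dataclasses import dataclass

@dataclass
class Pod:
    namespace: str
    name: str
    uid: str
    host_network: bool = False  # host-network pods bypass the CNI check

def sync_pod(pod: Pod, network_ready: bool, has_sandbox: bool) -> None:
    if not has_sandbox:
        print(f'No sandbox for pod can be found. Need to start a new one pod="{pod.namespace}/{pod.name}"')
    if not network_ready and not pod.host_network:
        # mirrors the err= string logged by pod_workers.go above
        raise RuntimeError("network is not ready: container runtime network "
                           "not ready: NetworkReady=false reason:NetworkPluginNotReady")

pod = Pod("openshift-multus", "network-metrics-daemon-8vx6p",
          "e7bdaec4-1392-4f87-ba0b-f53c76e47cf4")
try:
    sync_pod(pod, network_ready=False, has_sandbox=False)
except RuntimeError as err:
    print(f'Error syncing pod, skipping err="{err}" pod="{pod.namespace}/{pod.name}" podUID="{pod.uid}"')

Host-network pods (for example the static control-plane pods) are exempt from this gate, which is presumably why only these four cluster-network pods appear in the skip list.
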
Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.709636 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.709715 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.709734 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.709757 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.709773 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:55Z","lastTransitionTime":"2025-12-02T22:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.812787 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.812855 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.812870 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.812897 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.812914 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:55Z","lastTransitionTime":"2025-12-02T22:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.915944 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.915984 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.915993 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.916013 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.916025 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:55Z","lastTransitionTime":"2025-12-02T22:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.948338 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.948385 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.948397 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.948417 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.948430 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:55Z","lastTransitionTime":"2025-12-02T22:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:55 crc kubenswrapper[4903]: E1202 22:58:55.964593 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5eef24b0-10c3-4ee6-bf4d-784c2d2e5050\\\",\\\"systemUUID\\\":\\\"3787324b-4c61-413b-8321-e9e2f283e2ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:55Z is after 2025-08-24T17:21:41Z"
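
Here the log moves past the CNI wait to a second, independent failure: the kubelet's status PATCH for node "crc" (the escaped payload above carries the four conditions, allocatable/capacity, the image list, and nodeInfo) is rejected because the API server cannot call the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743: its serving certificate expired on 2025-08-24T17:21:41Z, while the current time is 2025-12-02T22:58:55Z. This is the typical signature of a CRC VM resumed long after its internal certificates were minted. A hedged sketch reproducing the failing x509 validity check against that endpoint (host, port, and dates are from the log; the third-party cryptography package, version 42+ for not_valid_after_utc, is an assumption of this sketch):

import socket, ssl
from datetime import datetime, timezone
from cryptography import x509  # assumption: 'cryptography' is installed

HOST, PORT = "127.0.0.1", 9743  # webhook endpoint from the log

def fetch_leaf_cert(host: str, port: int) -> x509.Certificate:
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE  # fetch the cert even though it is expired
    with socket.create_connection((host, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            der = tls.getpeercert(binary_form=True)
    return x509.load_der_x509_certificate(der)

cert = fetch_leaf_cert(HOST, PORT)
now = datetime.now(timezone.utc)
if now > cert.not_valid_after_utc:
    print(f"x509: certificate has expired or is not yet valid: current time "
          f"{now:%Y-%m-%dT%H:%M:%SZ} is after {cert.not_valid_after_utc:%Y-%m-%dT%H:%M:%SZ}")

The retries that follow are expected: upstream kubelet attempts the status update a fixed number of times per sync loop (nodeStatusUpdateRetry, 5 in current sources), so "Error updating node status, will retry" recurs milliseconds apart with an identical payload until the webhook's certificate is rotated.
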
Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.969348 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.969440 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.969547 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.969576 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.969629 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:55Z","lastTransitionTime":"2025-12-02T22:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:58:56 crc kubenswrapper[4903]: E1202 22:58:56.007620 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5eef24b0-10c3-4ee6-bf4d-784c2d2e5050\\\",\\\"systemUUID\\\":\\\"3787324b-4c61-413b-8321-e9e2f283e2ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:55Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.989591 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.989643 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.989673 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.989696 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:55 crc kubenswrapper[4903]: I1202 22:58:55.989709 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:55Z","lastTransitionTime":"2025-12-02T22:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:56 crc kubenswrapper[4903]: E1202 22:58:56.007620 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5eef24b0-10c3-4ee6-bf4d-784c2d2e5050\\\",\\\"systemUUID\\\":\\\"3787324b-4c61-413b-8321-e9e2f283e2ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:56Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.012479 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.012521 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
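The status patches the kubelet keeps retrying above all carry the same four conditions (MemoryPressure, DiskPressure, PIDPressure, Ready) under $setElementOrder/conditions. For cross-checking what the API server last accepted for those conditions, a short read-side sketch follows; it is illustrative only and assumes a reachable API server, a kubeconfig, and the third-party kubernetes Python client, none of which this log demonstrates.

# Assumption: `pip install kubernetes` and a kubeconfig that can read nodes.
from kubernetes import client, config

config.load_kube_config()   # or config.load_incluster_config() inside a pod
v1 = client.CoreV1Api()

node = v1.read_node("crc")  # node name taken from the log lines above
for cond in node.status.conditions:
    # These are the same entries the failing patch tries to update.
    print(f"{cond.type}: status={cond.status} reason={cond.reason}")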
event="NodeHasNoDiskPressure" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.012534 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.012552 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.012564 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:56Z","lastTransitionTime":"2025-12-02T22:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:56 crc kubenswrapper[4903]: E1202 22:58:56.030674 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5eef24b0-10c3-4ee6-bf4d-784c2d2e5050\\\",\\\"systemUUID\\\":\\\"3787324b-4c61-413b-8321-e9e2f283e2ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:56Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.034895 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.034939 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.034952 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.034970 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.034982 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:56Z","lastTransitionTime":"2025-12-02T22:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:56 crc kubenswrapper[4903]: E1202 22:58:56.054371 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:58:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5eef24b0-10c3-4ee6-bf4d-784c2d2e5050\\\",\\\"systemUUID\\\":\\\"3787324b-4c61-413b-8321-e9e2f283e2ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:58:56Z is after 2025-08-24T17:21:41Z" Dec 02 22:58:56 crc kubenswrapper[4903]: E1202 22:58:56.054544 4903 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.056296 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
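The retry loop gives up here ("update node status exceeds retry count") because every patch attempt dies at the node.network-node-identity.openshift.io webhook: its serving certificate at https://127.0.0.1:9743 expired on 2025-08-24T17:21:41Z, while the node clock reads 2025-12-02T22:58:56Z. A minimal sketch for confirming the expiry directly from the node follows; it is illustrative only and assumes Python 3 plus the third-party cryptography package on the host, neither of which is shown in this log.

import socket
import ssl

from cryptography import x509  # assumption: `pip install cryptography`

HOST, PORT = "127.0.0.1", 9743  # webhook endpoint named in the errors above

# Verification must be disabled: the point is to fetch a certificate that
# no longer verifies precisely because it is expired.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
        der = tls.getpeercert(binary_form=True)  # DER bytes even when unverified

cert = x509.load_der_x509_certificate(der)
print("subject:   ", cert.subject.rfc4514_string())
print("not before:", cert.not_valid_before)  # newer cryptography: not_valid_before_utc
print("not after: ", cert.not_valid_after)   # expect 2025-08-24T17:21:41Z here

If the printed notAfter matches 2025-08-24T17:21:41Z, the kubelet's x509 failure is fully explained by the clock, not by a broken trust chain.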
event="NodeHasSufficientMemory" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.056333 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.056346 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.056362 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.056376 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:56Z","lastTransitionTime":"2025-12-02T22:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.159149 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.159192 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.159204 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.159220 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.159232 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:56Z","lastTransitionTime":"2025-12-02T22:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.262235 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.262297 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.262318 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.262347 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.262371 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:56Z","lastTransitionTime":"2025-12-02T22:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.365730 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.365885 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.365908 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.365933 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.365955 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:56Z","lastTransitionTime":"2025-12-02T22:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.469187 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.469254 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.469272 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.469298 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.469316 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:56Z","lastTransitionTime":"2025-12-02T22:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.572531 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.572581 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.572593 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.572612 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.572628 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:56Z","lastTransitionTime":"2025-12-02T22:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.674932 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.674981 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.674992 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.675012 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.675025 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:56Z","lastTransitionTime":"2025-12-02T22:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.776949 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.777015 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.777032 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.777057 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.777073 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:56Z","lastTransitionTime":"2025-12-02T22:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.879194 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.879251 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.879267 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.879291 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.879308 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:56Z","lastTransitionTime":"2025-12-02T22:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.982736 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.982779 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.982795 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.982844 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:56 crc kubenswrapper[4903]: I1202 22:58:56.982863 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:56Z","lastTransitionTime":"2025-12-02T22:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.085076 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.085145 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.085157 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.085174 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.085186 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:57Z","lastTransitionTime":"2025-12-02T22:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.187608 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.187677 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.187686 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.187700 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.187711 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:57Z","lastTransitionTime":"2025-12-02T22:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.290465 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.290518 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.290535 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.290552 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.290564 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:57Z","lastTransitionTime":"2025-12-02T22:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.393703 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.393763 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.393783 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.393806 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.393823 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:57Z","lastTransitionTime":"2025-12-02T22:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.496835 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.496891 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.496907 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.496931 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.496948 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:57Z","lastTransitionTime":"2025-12-02T22:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.603462 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.603506 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.603518 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.603535 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.603562 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:57Z","lastTransitionTime":"2025-12-02T22:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.611475 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:58:57 crc kubenswrapper[4903]: E1202 22:58:57.611602 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.611616 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.611677 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.611719 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:58:57 crc kubenswrapper[4903]: E1202 22:58:57.611863 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:58:57 crc kubenswrapper[4903]: E1202 22:58:57.611935 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:58:57 crc kubenswrapper[4903]: E1202 22:58:57.612034 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.705807 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.705876 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.705901 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.705935 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.705958 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:57Z","lastTransitionTime":"2025-12-02T22:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.808864 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.808933 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.808948 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.808973 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.808992 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:57Z","lastTransitionTime":"2025-12-02T22:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.911925 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.911988 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.912009 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.912039 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:57 crc kubenswrapper[4903]: I1202 22:58:57.912061 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:57Z","lastTransitionTime":"2025-12-02T22:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.015041 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.015096 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.015108 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.015126 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.015141 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:58Z","lastTransitionTime":"2025-12-02T22:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.117798 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.117871 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.117891 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.117921 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.117944 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:58Z","lastTransitionTime":"2025-12-02T22:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.220776 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.220851 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.220873 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.220906 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.220927 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:58Z","lastTransitionTime":"2025-12-02T22:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.323578 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.323626 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.323642 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.323709 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.323726 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:58Z","lastTransitionTime":"2025-12-02T22:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.425976 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.426013 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.426030 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.426046 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.426056 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:58Z","lastTransitionTime":"2025-12-02T22:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.528301 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.528372 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.528396 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.528428 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.528451 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:58Z","lastTransitionTime":"2025-12-02T22:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.631071 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.631111 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.631121 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.631136 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.631146 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:58Z","lastTransitionTime":"2025-12-02T22:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.734533 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.734580 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.734592 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.734608 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.734619 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:58Z","lastTransitionTime":"2025-12-02T22:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.837361 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.837422 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.837438 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.837465 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.837482 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:58Z","lastTransitionTime":"2025-12-02T22:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.940939 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.940999 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.941016 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.941038 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:58 crc kubenswrapper[4903]: I1202 22:58:58.941056 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:58Z","lastTransitionTime":"2025-12-02T22:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.043833 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.043891 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.043979 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.044010 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.044027 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:59Z","lastTransitionTime":"2025-12-02T22:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.147029 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.147076 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.147093 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.147117 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.147135 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:59Z","lastTransitionTime":"2025-12-02T22:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.249093 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.249126 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.249134 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.249149 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.249157 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:59Z","lastTransitionTime":"2025-12-02T22:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.352467 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.352524 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.352543 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.352569 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.352585 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:59Z","lastTransitionTime":"2025-12-02T22:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.455245 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.455312 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.455330 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.455355 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.455374 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:59Z","lastTransitionTime":"2025-12-02T22:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.559199 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.559265 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.559276 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.559298 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.559312 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:59Z","lastTransitionTime":"2025-12-02T22:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.611684 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:58:59 crc kubenswrapper[4903]: E1202 22:58:59.611905 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.611914 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.612001 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.612047 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:58:59 crc kubenswrapper[4903]: E1202 22:58:59.612548 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:58:59 crc kubenswrapper[4903]: E1202 22:58:59.612721 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:58:59 crc kubenswrapper[4903]: E1202 22:58:59.612916 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.612979 4903 scope.go:117] "RemoveContainer" containerID="1b8ab57f97826cc97060179fa188f695ed495aca141f6555880e4e902409fd14" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.663354 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.663406 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.663421 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.663445 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.663462 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:59Z","lastTransitionTime":"2025-12-02T22:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.766625 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.766764 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.766813 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.766898 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.766923 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:59Z","lastTransitionTime":"2025-12-02T22:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.869060 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.869111 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.869124 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.869142 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.869156 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:59Z","lastTransitionTime":"2025-12-02T22:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.972674 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.972717 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.972729 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.972744 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:58:59 crc kubenswrapper[4903]: I1202 22:58:59.972756 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:58:59Z","lastTransitionTime":"2025-12-02T22:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.076305 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.076367 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.076381 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.076402 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.076417 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:00Z","lastTransitionTime":"2025-12-02T22:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.180036 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.180415 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.180561 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.180754 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.180909 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:00Z","lastTransitionTime":"2025-12-02T22:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.283898 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.284248 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.284389 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.284532 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.284684 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:00Z","lastTransitionTime":"2025-12-02T22:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.387213 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.387269 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.387285 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.387313 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.387329 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:00Z","lastTransitionTime":"2025-12-02T22:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.490055 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.490108 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.490125 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.490148 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.490167 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:00Z","lastTransitionTime":"2025-12-02T22:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.593348 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.593395 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.593405 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.593426 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.593630 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:00Z","lastTransitionTime":"2025-12-02T22:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.696710 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.696779 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.696797 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.696825 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.696846 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:00Z","lastTransitionTime":"2025-12-02T22:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.799785 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.799839 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.799856 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.799883 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.799899 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:00Z","lastTransitionTime":"2025-12-02T22:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.902770 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.902852 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.902874 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.902903 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:00 crc kubenswrapper[4903]: I1202 22:59:00.902923 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:00Z","lastTransitionTime":"2025-12-02T22:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.007989 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.008041 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.008054 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.008075 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.008086 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:01Z","lastTransitionTime":"2025-12-02T22:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.110596 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.110691 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.110709 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.110728 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.110746 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:01Z","lastTransitionTime":"2025-12-02T22:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.213813 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.213880 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.213892 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.213916 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.213929 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:01Z","lastTransitionTime":"2025-12-02T22:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.317935 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.317984 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.317996 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.318015 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.318026 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:01Z","lastTransitionTime":"2025-12-02T22:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.422492 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.422588 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.422644 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.422729 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.422758 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:01Z","lastTransitionTime":"2025-12-02T22:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.526531 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.526586 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.526626 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.526662 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.526700 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:01Z","lastTransitionTime":"2025-12-02T22:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.611440 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.611446 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.611457 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.611608 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:59:01 crc kubenswrapper[4903]: E1202 22:59:01.611733 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:59:01 crc kubenswrapper[4903]: E1202 22:59:01.611929 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:59:01 crc kubenswrapper[4903]: E1202 22:59:01.612054 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:59:01 crc kubenswrapper[4903]: E1202 22:59:01.612147 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.630069 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.630136 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.630151 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.630170 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.630183 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:01Z","lastTransitionTime":"2025-12-02T22:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.636131 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.653459 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f98df328fe250afd95449707372ef1aea43e355ee20edf8366f510a888b9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.674306 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6f86958-173b-4746-8493-f8fe5f70a897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eb794b37fcff5149e08f759cd8d4ad6b06be8e9a29cf0e42d04a963533a552a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5db0a9a2093fb2426b4354d9b9ad46f0e83f73472c38ff37ee159dcaea383fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d77325
7453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xqdfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.697358 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz9ff_99ab90b8-4bb9-418c-8b55-19c4c10edec7/ovnkube-controller/2.log" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.701163 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" event={"ID":"99ab90b8-4bb9-418c-8b55-19c4c10edec7","Type":"ContainerStarted","Data":"4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858"} Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.701725 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.712108 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.732312 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.732357 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.732368 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.732384 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.732395 4903 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:01Z","lastTransitionTime":"2025-12-02T22:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.735837 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a8643ea97f6edef5a93aa0369656bd7d630d555bd3a7c891b71305353faeff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.750314 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"823a3cb9-fe97-46de-8791-375103e2ec0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242c60254f2b4a3c9e849384d813e56c75cf2f60705e0b1e7ca2707663b3f029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7dac854f7481fe71dcc8976f658e52ad759ca0faf9205d50b1bec7a28575c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://716e5b24fb81d3b6e884f43a9299fbe4397d133daa4727c659c633416c14bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.761343 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1607bcf-a910-4737-ba76-e982301faac5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075bd1184d0ced61c9610e5961f455627fc04f846239edd3b10032dd44fab539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e8053e7bef3d1577c025c515ad48920fcb7077167f8fc98b147d6499e38a23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628d292a1edf1c1db86af8057d8f7c50cbd2c45477075ef846a24d7cfd444bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c21835717fd04a4f8fbd1fce3b004c111edc3d93408e2921440877b3bd652fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21835717fd04a4f8fbd1fce3b004c111edc3d93408e2921440877b3bd652fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.779601 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.793456 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.807168 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.820282 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940f76af114eeec075ebe1a320bc817b99c5b9f687325fa076bd2fffa0291c50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:58:47Z\\\",\\\"message\\\":\\\"2025-12-02T22:58:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_66be8414-3bdb-4cca-a6ce-7b817c24bc20\\\\n2025-12-02T22:58:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_66be8414-3bdb-4cca-a6ce-7b817c24bc20 to /host/opt/cni/bin/\\\\n2025-12-02T22:58:02Z [verbose] multus-daemon started\\\\n2025-12-02T22:58:02Z [verbose] Readiness Indicator file check\\\\n2025-12-02T22:58:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.835203 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.835242 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.835252 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.835269 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.835297 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:01Z","lastTransitionTime":"2025-12-02T22:59:01Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.835959 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629037fd709058c15b3b4a4560aaecb34e6368ab243dcf573001b483a3033ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:05Z\\\",\\\"reason\\\":\
\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-02T22:59:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.854875 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b8ab57f97826cc97060179fa188f695ed495aca141f6555880e4e902409fd14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b8ab57f97826cc97060179fa188f695ed495aca141f6555880e4e902409fd14\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:58:32Z\\\",\\\"message\\\":\\\"sNode crc took: 1.707692ms\\\\nI1202 22:58:32.666622 6544 factory.go:1336] Added *v1.Node event handler 7\\\\nI1202 22:58:32.666669 6544 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1202 22:58:32.666724 6544 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 22:58:32.666783 6544 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:58:32.666858 6544 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1202 22:58:32.666863 6544 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 22:58:32.666933 6544 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 22:58:32.666946 6544 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:58:32.666917 6544 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1202 22:58:32.666990 6544 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 22:58:32.667040 6544 factory.go:656] Stopping watch factory\\\\nI1202 22:58:32.667089 6544 ovnkube.go:599] Stopped ovnkube\\\\nI1202 22:58:32.667043 6544 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 22:58:32.667055 6544 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 22:58:32.667164 6544 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1202 22:58:32.667375 6544 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pz9ff_openshift-ovn-kubernetes(99ab90b8-4bb9-418c-8b55-19c4c10edec7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz9ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.865695 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vhm2r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ec8e3f725a9f00d1ee57d57d56f537077b5004dbd9314aae3ec85e3a09287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv57c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vhm2r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.875834 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8vx6p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7bdaec4-1392-4f87-ba0b-f53c76e47cf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:15Z\\\"}}\" for pod 
\"openshift-multus\"/\"network-metrics-daemon-8vx6p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.887743 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95d5b727-e078-49d6-8652-555f87754b11\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b127904027e78efb8b5d36a8e045431aa85698b906f13c0445a7897ec658247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76a804238f87925935809a984824635f5b4e6dba3af1297e3a80f9b0cbf5039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76a804238f87925935809a984824635f5b4e6dba3af1297e3a80f9b0cbf5039\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.905268 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.918429 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.936951 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.938740 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.938786 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.938800 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.938821 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.938835 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:01Z","lastTransitionTime":"2025-12-02T22:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.950692 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95d5b727-e078-49d6-8652-555f87754b11\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b127904027e78efb8b5d36a8e045431aa85698b906f13c0445a7897ec658247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76a804238f87925935809a984824635f5b4e6dba3af1297e3a80f9b0cbf5039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76a804238f87925935809a984824635f5b4e6dba3af1297e3a80f9b0cbf5039\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:01 
crc kubenswrapper[4903]: I1202 22:59:01.961357 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.973601 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f98df328fe250afd95449707372ef1aea43e355ee20edf8366f510a888b9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:01 crc kubenswrapper[4903]: I1202 22:59:01.984463 4903 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6f86958-173b-4746-8493-f8fe5f70a897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eb794b37fcff5149e08f759cd8d4ad6b06be8e9a29cf0e42d04a963533a552a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5db0a9a2093fb2426b4354d9b9ad46f0e83f73472c38ff37ee159dcaea383fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xqdfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.003078 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:01Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.026835 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.038932 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a8643ea97f6edef5a93aa0369656bd7d630d555bd3a7c891b71305353faeff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.040813 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.040858 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.040871 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.040889 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.040901 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:02Z","lastTransitionTime":"2025-12-02T22:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.053917 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.070018 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.081825 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.095194 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940f76af114eeec075ebe1a320bc817b99c5b9f687325fa076bd2fffa0291c50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:58:47Z\\\",\\\"message\\\":\\\"2025-12-02T22:58:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_66be8414-3bdb-4cca-a6ce-7b817c24bc20\\\\n2025-12-02T22:58:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_66be8414-3bdb-4cca-a6ce-7b817c24bc20 to /host/opt/cni/bin/\\\\n2025-12-02T22:58:02Z [verbose] multus-daemon started\\\\n2025-12-02T22:58:02Z [verbose] Readiness Indicator file check\\\\n2025-12-02T22:58:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.114039 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629037fd709058c15b3b4a4560aaecb34e6368ab243dcf573001b483a3033ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.143532 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.143569 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:02 crc 
kubenswrapper[4903]: I1202 22:59:02.143578 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.143592 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.143601 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:02Z","lastTransitionTime":"2025-12-02T22:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.147978 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a714b8675c95dcc7244a032c26beb2ff6a53022
b19005aee6b56bf770989858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b8ab57f97826cc97060179fa188f695ed495aca141f6555880e4e902409fd14\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:58:32Z\\\",\\\"message\\\":\\\"sNode crc took: 1.707692ms\\\\nI1202 22:58:32.666622 6544 factory.go:1336] Added *v1.Node event handler 7\\\\nI1202 22:58:32.666669 6544 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1202 22:58:32.666724 6544 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 22:58:32.666783 6544 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:58:32.666858 6544 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1202 22:58:32.666863 6544 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 22:58:32.666933 6544 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 22:58:32.666946 6544 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:58:32.666917 6544 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1202 22:58:32.666990 6544 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 22:58:32.667040 6544 factory.go:656] Stopping watch factory\\\\nI1202 22:58:32.667089 6544 ovnkube.go:599] Stopped ovnkube\\\\nI1202 22:58:32.667043 6544 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 22:58:32.667055 6544 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 22:58:32.667164 6544 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1202 22:58:32.667375 6544 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz9ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.166007 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"823a3cb9-fe97-46de-8791-375103e2ec0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242c60254f2b4a3c9e849384d813e56c75cf2f60705e0b1e7ca2707663b3f029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7dac854f7481fe71dcc8976f658e52ad759ca0faf9205d50b1bec7a28575c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://716e5b24fb81d3b6e884f43a9299fbe4397d133daa4727c659c633416c14bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.177940 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1607bcf-a910-4737-ba76-e982301faac5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075bd1184d0ced61c9610e5961f455627fc04f846239edd3b10032dd44fab539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e8053e7bef3d1577c025c515ad48920fcb7077167f8fc98b147d6499e38a23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628d292a1edf1c1db86af8057d8f7c50cbd2c45477075ef846a24d7cfd444bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c21835717fd04a4f8fbd1fce3b004c111edc3d93408e2921440877b3bd652fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21835717fd04a4f8fbd1fce3b004c111edc3d93408e2921440877b3bd652fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.187053 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vhm2r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ec8e3f725a9f00d1ee57d57d56f537077b5004dbd9314aae3ec85e3a09287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv57c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vhm2r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.197725 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8vx6p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7bdaec4-1392-4f87-ba0b-f53c76e47cf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8vx6p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.246274 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.246323 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.246331 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.246348 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.246358 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:02Z","lastTransitionTime":"2025-12-02T22:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.349818 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.349869 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.349881 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.349903 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.349920 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:02Z","lastTransitionTime":"2025-12-02T22:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.452353 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.452409 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.452423 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.452445 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.452461 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:02Z","lastTransitionTime":"2025-12-02T22:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.555110 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.555155 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.555168 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.555185 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.555196 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:02Z","lastTransitionTime":"2025-12-02T22:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.658005 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.658477 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.658490 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.658514 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.658529 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:02Z","lastTransitionTime":"2025-12-02T22:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.708995 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz9ff_99ab90b8-4bb9-418c-8b55-19c4c10edec7/ovnkube-controller/3.log" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.714306 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz9ff_99ab90b8-4bb9-418c-8b55-19c4c10edec7/ovnkube-controller/2.log" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.721470 4903 generic.go:334] "Generic (PLEG): container finished" podID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerID="4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858" exitCode=1 Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.721537 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" event={"ID":"99ab90b8-4bb9-418c-8b55-19c4c10edec7","Type":"ContainerDied","Data":"4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858"} Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.721634 4903 scope.go:117] "RemoveContainer" containerID="1b8ab57f97826cc97060179fa188f695ed495aca141f6555880e4e902409fd14" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.722181 4903 scope.go:117] "RemoveContainer" containerID="4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858" Dec 02 22:59:02 crc kubenswrapper[4903]: E1202 22:59:02.722367 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pz9ff_openshift-ovn-kubernetes(99ab90b8-4bb9-418c-8b55-19c4c10edec7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.740685 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95d5b727-e078-49d6-8652-555f87754b11\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b127904027e78efb8b5d36a8e045431aa85698b906f13c0445a7897ec658247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76a804238f87925935809a984824635f5b4e6dba3af1297e3a80f9b0cbf5039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76a804238f87925935809a984824635f5b4e6dba3af1297e3a80f9b0cbf5039\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.759646 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.761392 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.761442 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.761458 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.761482 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.761501 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:02Z","lastTransitionTime":"2025-12-02T22:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.776538 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.796579 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6f86958-173b-4746-8493-f8fe5f70a897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eb794b37fcff5149e08f759cd8d4ad6b06be8e9a29cf0e42d04a963533a552a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5db0a9a2093fb2426b4354d9b9ad46f0e83f73472c38ff37ee159dcaea383fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xqdfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:02Z is after 2025-08-24T17:21:41Z" Dec 02 
22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.817989 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.832343 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f98df328fe250afd95449707372ef1aea43e355ee20edf8366f510a888b9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.853778 4903 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72b
f6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.864406 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.864461 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.864474 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.864494 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.864508 4903 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:02Z","lastTransitionTime":"2025-12-02T22:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.869265 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a8643ea97f6edef5a93aa0369656bd7d630d555bd3a7c891b71305353faeff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.894132 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.913906 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.933765 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940f76af114eeec075ebe1a320bc817b99c5b9f687325fa076bd2fffa0291c50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:58:47Z\\\",\\\"message\\\":\\\"2025-12-02T22:58:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_66be8414-3bdb-4cca-a6ce-7b817c24bc20\\\\n2025-12-02T22:58:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_66be8414-3bdb-4cca-a6ce-7b817c24bc20 to /host/opt/cni/bin/\\\\n2025-12-02T22:58:02Z [verbose] multus-daemon started\\\\n2025-12-02T22:58:02Z [verbose] Readiness Indicator file check\\\\n2025-12-02T22:58:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.952733 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629037fd709058c15b3b4a4560aaecb34e6368ab243dcf573001b483a3033ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.968045 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.968103 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:02 crc 
kubenswrapper[4903]: I1202 22:59:02.968119 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.968145 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.968162 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:02Z","lastTransitionTime":"2025-12-02T22:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.984363 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a714b8675c95dcc7244a032c26beb2ff6a53022
b19005aee6b56bf770989858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b8ab57f97826cc97060179fa188f695ed495aca141f6555880e4e902409fd14\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:58:32Z\\\",\\\"message\\\":\\\"sNode crc took: 1.707692ms\\\\nI1202 22:58:32.666622 6544 factory.go:1336] Added *v1.Node event handler 7\\\\nI1202 22:58:32.666669 6544 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1202 22:58:32.666724 6544 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 22:58:32.666783 6544 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 22:58:32.666858 6544 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1202 22:58:32.666863 6544 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 22:58:32.666933 6544 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 22:58:32.666946 6544 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 22:58:32.666917 6544 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1202 22:58:32.666990 6544 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 22:58:32.667040 6544 factory.go:656] Stopping watch factory\\\\nI1202 22:58:32.667089 6544 ovnkube.go:599] Stopped ovnkube\\\\nI1202 22:58:32.667043 6544 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 22:58:32.667055 6544 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 22:58:32.667164 6544 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1202 22:58:32.667375 6544 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:59:02Z\\\",\\\"message\\\":\\\"ernal_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-daemon]} name:Service_openshift-machine-config-operator/machine-config-daemon_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.43:8798: 10.217.4.43:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {a36f6289-d09f-43f8-8a8a-c9d2cc11eb0d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 22:59:02.134707 6885 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-daemon]} name:Service_openshift-machine-config-operator/machine-config-daemon_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.43:8798: 
10.217.4.43:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {a36f6289-d09f-43f8-8a8a-c9d2cc11eb0d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1202 22:59:02.134793 6885 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz9ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:02 crc kubenswrapper[4903]: I1202 22:59:02.999510 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"823a3cb9-fe97-46de-8791-375103e2ec0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242c60254f2b4a3c9e849384d813e56c75cf2f60705e0b1e7ca2707663b3f029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7dac854f7481fe71dcc8976f658e52ad759ca0faf9205d50b1bec7a28575c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://716e5b24fb81d3b6e884f43a9299fbe4397d133daa4727c659c633416c14bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:02Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.015548 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1607bcf-a910-4737-ba76-e982301faac5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075bd1184d0ced61c9610e5961f455627fc04f846239edd3b10032dd44fab539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e8053e7bef3d1577c025c515ad48920fcb7077167f8fc98b147d6499e38a23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628d292a1edf1c1db86af8057d8f7c50cbd2c45477075ef846a24d7cfd444bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c21835717fd04a4f8fbd1fce3b004c111edc3d93408e2921440877b3bd652fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21835717fd04a4f8fbd1fce3b004c111edc3d93408e2921440877b3bd652fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.031394 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.044199 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vhm2r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ec8e3f725a9f00d1ee57d57d56f537077b5004dbd9314aae3ec85e3a09287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv57c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vhm2r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.058120 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8vx6p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7bdaec4-1392-4f87-ba0b-f53c76e47cf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8vx6p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.071649 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.071739 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.071762 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.071799 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.071818 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:03Z","lastTransitionTime":"2025-12-02T22:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.175827 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.175899 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.175917 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.175944 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.175962 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:03Z","lastTransitionTime":"2025-12-02T22:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.279704 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.279749 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.279758 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.279778 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.279789 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:03Z","lastTransitionTime":"2025-12-02T22:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.382574 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.382685 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.382723 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.382745 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.382759 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:03Z","lastTransitionTime":"2025-12-02T22:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.485822 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.485875 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.485889 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.485910 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.485921 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:03Z","lastTransitionTime":"2025-12-02T22:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.588760 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.588824 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.588836 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.588854 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.588867 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:03Z","lastTransitionTime":"2025-12-02T22:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.611420 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.611506 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.611525 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.611470 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:59:03 crc kubenswrapper[4903]: E1202 22:59:03.611744 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:59:03 crc kubenswrapper[4903]: E1202 22:59:03.611904 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:59:03 crc kubenswrapper[4903]: E1202 22:59:03.611990 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:59:03 crc kubenswrapper[4903]: E1202 22:59:03.612056 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.691684 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.691735 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.691746 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.691763 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.691772 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:03Z","lastTransitionTime":"2025-12-02T22:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.726985 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz9ff_99ab90b8-4bb9-418c-8b55-19c4c10edec7/ovnkube-controller/3.log" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.730686 4903 scope.go:117] "RemoveContainer" containerID="4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858" Dec 02 22:59:03 crc kubenswrapper[4903]: E1202 22:59:03.730898 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pz9ff_openshift-ovn-kubernetes(99ab90b8-4bb9-418c-8b55-19c4c10edec7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.744958 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95d5b727-e078-49d6-8652-555f87754b11\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b127904027e78efb8b5d36a8e045431aa85698b906f13c0445a7897ec658247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76a804238f87925935809a984824635f5b4e6dba3af1297e3a80f9b0cbf5039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76a804238f87925935809a984824635f5b4e6dba3af1297e3a80f9b0cbf5039\\\",\\\"exitCode\\\":0,\
\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.757916 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.770700 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.783367 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.794594 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.794633 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.794645 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.794677 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.794690 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:03Z","lastTransitionTime":"2025-12-02T22:59:03Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.795509 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f98df328fe250afd95449707372ef1aea43e355ee20edf8366f510a888b9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.805410 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6f86958-173b-4746-8493-f8fe5f70a897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eb794b37fcff5149e08f759cd8d4ad6b06be8e9a29cf0e42d04a963533a552a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5db0a9a2093fb2426b4354d9b9ad46f0e83f73472c38ff37ee159dcaea383fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\
"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xqdfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.820957 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f
7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.839962 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a8643ea97f6edef5a93aa0369656bd7d630d555bd3a7c891b71305353faeff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.860767 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.891105 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940f76af114eeec075ebe1a320bc817b99c5b9f687325fa076bd2fffa0291c50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:58:47Z\\\",\\\"message\\\":\\\"2025-12-02T22:58:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_66be8414-3bdb-4cca-a6ce-7b817c24bc20\\\\n2025-12-02T22:58:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_66be8414-3bdb-4cca-a6ce-7b817c24bc20 to /host/opt/cni/bin/\\\\n2025-12-02T22:58:02Z [verbose] multus-daemon started\\\\n2025-12-02T22:58:02Z [verbose] Readiness Indicator file check\\\\n2025-12-02T22:58:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.897625 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.897690 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.897703 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.897722 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.897735 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:03Z","lastTransitionTime":"2025-12-02T22:59:03Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.916389 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629037fd709058c15b3b4a4560aaecb34e6368ab243dcf573001b483a3033ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:05Z\\\",\\\"reason\\\":\
\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-02T22:59:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.951378 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:59:02Z\\\",\\\"message\\\":\\\"ernal_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-daemon]} name:Service_openshift-machine-config-operator/machine-config-daemon_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.43:8798: 10.217.4.43:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {a36f6289-d09f-43f8-8a8a-c9d2cc11eb0d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 22:59:02.134707 6885 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-daemon]} name:Service_openshift-machine-config-operator/machine-config-daemon_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.43:8798: 10.217.4.43:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {a36f6289-d09f-43f8-8a8a-c9d2cc11eb0d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1202 22:59:02.134793 6885 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:59:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pz9ff_openshift-ovn-kubernetes(99ab90b8-4bb9-418c-8b55-19c4c10edec7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz9ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.968805 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"823a3cb9-fe97-46de-8791-375103e2ec0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242c60254f2b4a3c9e849384d813e56c75cf2f60705e0b1e7ca2707663b3f029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7dac854f7481fe71dcc8976f658e52ad759ca0faf9205d50b1bec7a28575c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://716e5b24fb81d3b6e884f43a9299fbe4397d133daa4727c659c633416c14bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:03 crc kubenswrapper[4903]: I1202 22:59:03.985361 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1607bcf-a910-4737-ba76-e982301faac5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075bd1184d0ced61c9610e5961f455627fc04f846239edd3b10032dd44fab539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e8053e7bef3d1577c025c515ad48920fcb7077167f8fc98b147d6499e38a23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628d292a1edf1c1db86af8057d8f7c50cbd2c45477075ef846a24d7cfd444bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c21835717fd04a4f8fbd1fce3b004c111edc3d93408e2921440877b3bd652fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21835717fd04a4f8fbd1fce3b004c111edc3d93408e2921440877b3bd652fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:03Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.000323 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.000431 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.000452 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.000475 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 
22:59:04.000487 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:04Z","lastTransitionTime":"2025-12-02T22:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.004983 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:04Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.031040 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:04Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.043144 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vhm2r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ec8e3f725a9f00d1ee57d57d56f537077b5004dbd9314aae3ec85e3a09287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv57c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vhm2r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:04Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.057058 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8vx6p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7bdaec4-1392-4f87-ba0b-f53c76e47cf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8vx6p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:04Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.103451 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.103527 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.103560 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.103582 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.103595 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:04Z","lastTransitionTime":"2025-12-02T22:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.207204 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.207294 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.207319 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.207355 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.207379 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:04Z","lastTransitionTime":"2025-12-02T22:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.309630 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.309746 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.309809 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.309843 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.309866 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:04Z","lastTransitionTime":"2025-12-02T22:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.413359 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.413457 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.413484 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.413520 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.413546 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:04Z","lastTransitionTime":"2025-12-02T22:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.516944 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.517088 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.517110 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.517141 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.517162 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:04Z","lastTransitionTime":"2025-12-02T22:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.611633 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 22:59:04 crc kubenswrapper[4903]: E1202 22:59:04.611877 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:08.611845716 +0000 UTC m=+147.320400029 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.612021 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.612091 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.612179 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.612221 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:59:04 crc kubenswrapper[4903]: E1202 22:59:04.612341 4903 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 22:59:04 crc kubenswrapper[4903]: E1202 22:59:04.612387 4903 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 22:59:04 crc kubenswrapper[4903]: E1202 22:59:04.612417 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 22:59:04 crc kubenswrapper[4903]: E1202 22:59:04.612445 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 22:59:04 crc kubenswrapper[4903]: E1202 22:59:04.612465 4903 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 
22:59:04 crc kubenswrapper[4903]: E1202 22:59:04.612468 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 23:00:08.612436041 +0000 UTC m=+147.320990364 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 22:59:04 crc kubenswrapper[4903]: E1202 22:59:04.612559 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 23:00:08.612533863 +0000 UTC m=+147.321088176 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 22:59:04 crc kubenswrapper[4903]: E1202 22:59:04.612594 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 23:00:08.612582855 +0000 UTC m=+147.321137168 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:59:04 crc kubenswrapper[4903]: E1202 22:59:04.612922 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 22:59:04 crc kubenswrapper[4903]: E1202 22:59:04.612969 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 22:59:04 crc kubenswrapper[4903]: E1202 22:59:04.612995 4903 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:59:04 crc kubenswrapper[4903]: E1202 22:59:04.613091 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 23:00:08.613063916 +0000 UTC m=+147.321618239 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.620101 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.620178 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.620204 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.620237 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.620258 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:04Z","lastTransitionTime":"2025-12-02T22:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.723036 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.723090 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.723101 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.723120 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.723132 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:04Z","lastTransitionTime":"2025-12-02T22:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.826147 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.826203 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.826215 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.826236 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.826251 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:04Z","lastTransitionTime":"2025-12-02T22:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.929991 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.930051 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.930065 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.930087 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:04 crc kubenswrapper[4903]: I1202 22:59:04.930101 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:04Z","lastTransitionTime":"2025-12-02T22:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.033001 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.033058 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.033072 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.033091 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.033103 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:05Z","lastTransitionTime":"2025-12-02T22:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.135993 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.136048 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.136062 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.136082 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.136094 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:05Z","lastTransitionTime":"2025-12-02T22:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.238908 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.238972 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.238984 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.239004 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.239017 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:05Z","lastTransitionTime":"2025-12-02T22:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.343138 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.343206 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.343226 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.343253 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.343272 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:05Z","lastTransitionTime":"2025-12-02T22:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.447171 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.447222 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.447239 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.447264 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.447276 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:05Z","lastTransitionTime":"2025-12-02T22:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.550949 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.551040 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.551059 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.551090 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.551111 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:05Z","lastTransitionTime":"2025-12-02T22:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.612516 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:59:05 crc kubenswrapper[4903]: E1202 22:59:05.612761 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.612780 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.612815 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:59:05 crc kubenswrapper[4903]: E1202 22:59:05.612948 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:59:05 crc kubenswrapper[4903]: E1202 22:59:05.613055 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.613768 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:59:05 crc kubenswrapper[4903]: E1202 22:59:05.613872 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.654945 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.655020 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.655041 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.655070 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.655094 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:05Z","lastTransitionTime":"2025-12-02T22:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.758526 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.758573 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.758587 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.758604 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.758615 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:05Z","lastTransitionTime":"2025-12-02T22:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.861445 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.861512 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.861529 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.861561 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.861580 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:05Z","lastTransitionTime":"2025-12-02T22:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.963990 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.964031 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.964040 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.964055 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:05 crc kubenswrapper[4903]: I1202 22:59:05.964065 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:05Z","lastTransitionTime":"2025-12-02T22:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.067334 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.067418 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.067437 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.067477 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.067497 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:06Z","lastTransitionTime":"2025-12-02T22:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.170835 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.170924 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.170948 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.170979 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.171001 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:06Z","lastTransitionTime":"2025-12-02T22:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.273909 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.273956 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.273971 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.273991 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.274005 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:06Z","lastTransitionTime":"2025-12-02T22:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.347465 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.347519 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.347536 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.347561 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.347578 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:06Z","lastTransitionTime":"2025-12-02T22:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:06 crc kubenswrapper[4903]: E1202 22:59:06.369927 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5eef24b0-10c3-4ee6-bf4d-784c2d2e5050\\\",\\\"systemUUID\\\":\\\"3787324b-4c61-413b-8321-e9e2f283e2ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:06Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.375455 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.375499 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.375509 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.375526 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.375536 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:06Z","lastTransitionTime":"2025-12-02T22:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:06 crc kubenswrapper[4903]: E1202 22:59:06.393034 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5eef24b0-10c3-4ee6-bf4d-784c2d2e5050\\\",\\\"systemUUID\\\":\\\"3787324b-4c61-413b-8321-e9e2f283e2ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:06Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.397896 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.397959 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.397982 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.398014 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.398040 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:06Z","lastTransitionTime":"2025-12-02T22:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:06 crc kubenswrapper[4903]: E1202 22:59:06.417624 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5eef24b0-10c3-4ee6-bf4d-784c2d2e5050\\\",\\\"systemUUID\\\":\\\"3787324b-4c61-413b-8321-e9e2f283e2ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:06Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.422600 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.422691 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.422709 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.422735 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.422753 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:06Z","lastTransitionTime":"2025-12-02T22:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:06 crc kubenswrapper[4903]: E1202 22:59:06.441864 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5eef24b0-10c3-4ee6-bf4d-784c2d2e5050\\\",\\\"systemUUID\\\":\\\"3787324b-4c61-413b-8321-e9e2f283e2ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:06Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.446466 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.446523 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.446546 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.446607 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.446633 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:06Z","lastTransitionTime":"2025-12-02T22:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:06 crc kubenswrapper[4903]: E1202 22:59:06.461673 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T22:59:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5eef24b0-10c3-4ee6-bf4d-784c2d2e5050\\\",\\\"systemUUID\\\":\\\"3787324b-4c61-413b-8321-e9e2f283e2ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:06Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:06 crc kubenswrapper[4903]: E1202 22:59:06.461819 4903 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.463484 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.463516 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.463526 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.463541 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.463550 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:06Z","lastTransitionTime":"2025-12-02T22:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.566446 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.566518 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.566539 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.566564 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.566582 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:06Z","lastTransitionTime":"2025-12-02T22:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.668983 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.669052 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.669077 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.669106 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.669131 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:06Z","lastTransitionTime":"2025-12-02T22:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.771717 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.771788 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.771812 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.771847 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.771873 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:06Z","lastTransitionTime":"2025-12-02T22:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.874539 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.874589 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.874607 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.874627 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.874641 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:06Z","lastTransitionTime":"2025-12-02T22:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.977582 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.977629 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.977638 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.977683 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:06 crc kubenswrapper[4903]: I1202 22:59:06.977693 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:06Z","lastTransitionTime":"2025-12-02T22:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.079908 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.079993 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.080016 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.080049 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.080072 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:07Z","lastTransitionTime":"2025-12-02T22:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.183000 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.183044 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.183052 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.183070 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.183080 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:07Z","lastTransitionTime":"2025-12-02T22:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.284943 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.284989 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.284999 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.285018 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.285031 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:07Z","lastTransitionTime":"2025-12-02T22:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.388080 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.388150 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.388166 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.388194 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.388215 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:07Z","lastTransitionTime":"2025-12-02T22:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.491896 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.491971 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.491990 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.492020 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.492040 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:07Z","lastTransitionTime":"2025-12-02T22:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.595498 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.595560 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.595581 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.595613 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.595632 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:07Z","lastTransitionTime":"2025-12-02T22:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.612075 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.612075 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.612158 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.612205 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:59:07 crc kubenswrapper[4903]: E1202 22:59:07.612378 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:59:07 crc kubenswrapper[4903]: E1202 22:59:07.612577 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:59:07 crc kubenswrapper[4903]: E1202 22:59:07.612738 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:59:07 crc kubenswrapper[4903]: E1202 22:59:07.612878 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.699332 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.699406 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.699425 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.699455 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.699473 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:07Z","lastTransitionTime":"2025-12-02T22:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.801977 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.802030 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.802041 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.802058 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.802070 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:07Z","lastTransitionTime":"2025-12-02T22:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.904263 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.904634 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.907740 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.907790 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:07 crc kubenswrapper[4903]: I1202 22:59:07.907803 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:07Z","lastTransitionTime":"2025-12-02T22:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.010133 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.010230 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.010251 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.010316 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.010334 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:08Z","lastTransitionTime":"2025-12-02T22:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.113620 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.113792 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.113824 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.113860 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.113884 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:08Z","lastTransitionTime":"2025-12-02T22:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.217551 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.217621 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.217646 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.217719 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.217744 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:08Z","lastTransitionTime":"2025-12-02T22:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.321240 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.321303 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.321320 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.321345 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.321361 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:08Z","lastTransitionTime":"2025-12-02T22:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.424865 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.424933 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.424956 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.424986 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.425010 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:08Z","lastTransitionTime":"2025-12-02T22:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.528038 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.528093 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.528109 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.528133 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.528153 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:08Z","lastTransitionTime":"2025-12-02T22:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.631634 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.631742 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.631766 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.631791 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.631809 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:08Z","lastTransitionTime":"2025-12-02T22:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.735316 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.735378 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.735397 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.735421 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.735441 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:08Z","lastTransitionTime":"2025-12-02T22:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.838684 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.838756 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.838774 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.838803 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.839162 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:08Z","lastTransitionTime":"2025-12-02T22:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.942690 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.942769 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.942791 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.942822 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:08 crc kubenswrapper[4903]: I1202 22:59:08.942845 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:08Z","lastTransitionTime":"2025-12-02T22:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.045770 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.045814 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.045827 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.045844 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.045857 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:09Z","lastTransitionTime":"2025-12-02T22:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.148271 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.148322 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.148336 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.148358 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.148373 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:09Z","lastTransitionTime":"2025-12-02T22:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.252084 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.252145 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.252166 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.252202 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.252224 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:09Z","lastTransitionTime":"2025-12-02T22:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.354894 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.354952 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.354969 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.354996 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.355014 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:09Z","lastTransitionTime":"2025-12-02T22:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.458824 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.458899 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.458918 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.458947 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.458966 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:09Z","lastTransitionTime":"2025-12-02T22:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.561726 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.561797 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.561819 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.561852 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.561873 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:09Z","lastTransitionTime":"2025-12-02T22:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.611565 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.612006 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.612081 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.612113 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:59:09 crc kubenswrapper[4903]: E1202 22:59:09.612114 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:59:09 crc kubenswrapper[4903]: E1202 22:59:09.612287 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:59:09 crc kubenswrapper[4903]: E1202 22:59:09.612293 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:59:09 crc kubenswrapper[4903]: E1202 22:59:09.612487 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.630043 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.664883 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.664917 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.664926 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.664939 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.664948 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:09Z","lastTransitionTime":"2025-12-02T22:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.767277 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.767336 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.767352 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.767380 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.767397 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:09Z","lastTransitionTime":"2025-12-02T22:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.870760 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.870826 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.870843 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.870870 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.870893 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:09Z","lastTransitionTime":"2025-12-02T22:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.974388 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.974457 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.974478 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.974503 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:09 crc kubenswrapper[4903]: I1202 22:59:09.974520 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:09Z","lastTransitionTime":"2025-12-02T22:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.077905 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.077975 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.077997 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.078026 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.078048 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:10Z","lastTransitionTime":"2025-12-02T22:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.180568 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.180614 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.180629 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.180680 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.180704 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:10Z","lastTransitionTime":"2025-12-02T22:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.283188 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.283265 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.283288 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.283316 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.283354 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:10Z","lastTransitionTime":"2025-12-02T22:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.385730 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.385774 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.385784 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.385804 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.385815 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:10Z","lastTransitionTime":"2025-12-02T22:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.488569 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.488601 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.488618 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.488636 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.488646 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:10Z","lastTransitionTime":"2025-12-02T22:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.591074 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.591168 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.591213 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.591243 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.591263 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:10Z","lastTransitionTime":"2025-12-02T22:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.694771 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.694865 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.694891 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.694924 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.694962 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:10Z","lastTransitionTime":"2025-12-02T22:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.798437 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.798509 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.798532 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.798563 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.798585 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:10Z","lastTransitionTime":"2025-12-02T22:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.901933 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.901989 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.902005 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.902029 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:10 crc kubenswrapper[4903]: I1202 22:59:10.902046 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:10Z","lastTransitionTime":"2025-12-02T22:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.005458 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.005635 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.005682 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.005704 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.005717 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:11Z","lastTransitionTime":"2025-12-02T22:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.109051 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.109109 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.109127 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.109151 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.109167 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:11Z","lastTransitionTime":"2025-12-02T22:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.211614 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.211684 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.211696 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.211714 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.211727 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:11Z","lastTransitionTime":"2025-12-02T22:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.315090 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.315158 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.315181 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.315212 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.315235 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:11Z","lastTransitionTime":"2025-12-02T22:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.417471 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.417519 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.417538 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.417563 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.417579 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:11Z","lastTransitionTime":"2025-12-02T22:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.521017 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.521093 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.521111 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.521138 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.521156 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:11Z","lastTransitionTime":"2025-12-02T22:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
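[Editor's note] The five-entry cycle above repeats roughly every 100 ms: four "Recording event message" entries followed by a setters.go:603 "Node became not ready" entry whose condition payload names the root cause (reason KubeletNotReady, no CNI configuration file in /etc/kubernetes/cni/net.d/). When triaging a log like this, it can help to pull that condition JSON out programmatically. Below is a minimal, stdlib-only Python sketch; the helper name is an illustrative assumption, and the sample line is copied from the entries above:

```python
import json
import re

# Matches the condition payload of a setters.go:603 "Node became not ready" entry.
CONDITION_RE = re.compile(
    r'"Node became not ready" node="(?P<node>[^"]+)" condition=(?P<cond>\{.*\})'
)

def parse_not_ready(line: str):
    """Return (node, condition dict) for a 'Node became not ready' journal line, else None."""
    m = CONDITION_RE.search(line)
    if not m:
        return None
    # The condition is plain JSON, so it can be decoded directly.
    return m.group("node"), json.loads(m.group("cond"))

sample = (
    'Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.417579 4903 setters.go:603] '
    '"Node became not ready" node="crc" condition={"type":"Ready","status":"False",'
    '"lastHeartbeatTime":"2025-12-02T22:59:11Z","lastTransitionTime":"2025-12-02T22:59:11Z",'
    '"reason":"KubeletNotReady","message":"container runtime network not ready: '
    'NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: '
    'no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}'
)

node, cond = parse_not_ready(sample)
print(node, cond["type"], cond["status"], cond["reason"])  # crc Ready False KubeletNotReady
```

The same missing-CNI-config condition also explains the "Error syncing pod" entries that follow: pod sandboxes cannot be created until a CNI configuration appears.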
Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.521017 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.521093 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.521111 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.521138 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.521156 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:11Z","lastTransitionTime":"2025-12-02T22:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.612047 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.612192 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.612302 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 22:59:11 crc kubenswrapper[4903]: E1202 22:59:11.612282 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 02 22:59:11 crc kubenswrapper[4903]: E1202 22:59:11.612486 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 02 22:59:11 crc kubenswrapper[4903]: E1202 22:59:11.612782 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.612823 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p"
Dec 02 22:59:11 crc kubenswrapper[4903]: E1202 22:59:11.612954 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.623252 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.623484 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.623634 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.623860 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.624007 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:11Z","lastTransitionTime":"2025-12-02T22:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.624427 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"823a3cb9-fe97-46de-8791-375103e2ec0e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242c60254f2b4a3c9e849384d813e56c75cf2f60705e0b1e7ca2707663b3f029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7dac854f7481fe71dcc8976f658e52ad759ca0faf9205d50b1bec7a28575c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://716e5b24fb81d3b6e884f43a9299fbe4397d133daa4727c659c633416c14bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:11Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.637329 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1607bcf-a910-4737-ba76-e982301faac5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075bd1184d0ced61c9610e5961f455627fc04f846239edd3b10032dd44fab539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e8053e7bef3d1577c025c515ad48920fcb7077167f8fc98b147d6499e38a23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628d292a1edf1c1db86af8057d8f7c50cbd2c45477075ef846a24d7cfd444bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c21835717fd04a4f8fbd1fce3b004c111edc3d93408e2921440877b3bd652fc\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21835717fd04a4f8fbd1fce3b004c111edc3d93408e2921440877b3bd652fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:11Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.655405 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcd203e123cee3c188516771a97ae3251c12b3c358dcc7585f82d4922596c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:11Z is after 
2025-08-24T17:21:41Z" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.672813 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a61fe87c04428349b01fa0cdd845575f6c85ac337d9b924f72578ebeb415b1af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bd1e700c2378f152c8b9d0557c41073b26155c0b3f8debf23ba626122323957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:11Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.689016 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:11Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.704473 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s4nbg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://940f76af114eeec075ebe1a320bc817b99c5b9f687325fa076bd2fffa0291c50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:58:47Z\\\",\\\"message\\\":\\\"2025-12-02T22:58:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_66be8414-3bdb-4cca-a6ce-7b817c24bc20\\\\n2025-12-02T22:58:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_66be8414-3bdb-4cca-a6ce-7b817c24bc20 to /host/opt/cni/bin/\\\\n2025-12-02T22:58:02Z [verbose] multus-daemon started\\\\n2025-12-02T22:58:02Z [verbose] Readiness Indicator file check\\\\n2025-12-02T22:58:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cz2mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s4nbg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:11Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.719047 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b5d599c-d246-4f24-93ea-ace730325f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629037fd709058c15b3b4a4560aaecb34e6368ab243dcf573001b483a3033ce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587832795d80c8abab1c45d0a392e5809fc3d1d4be0f2975279a9e771a4df5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19aa0ef49161541e6a98a3c4503f4a723b60a8c8d680159a80919f1376684894\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5eb30780f1acf725d10b8926af069d1ed313ea0df2127866ace2c44bffd5e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d424053656391d08a7430e7a4a7c3b504bc95d7c15b5d57f33fc9fb4d3b92d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fef34739c8e908ce913c8b41b53123db6abad8260cc53a9d202d62faeffd254e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17f197e2c4eecfd853f489fc2fc1f1db64965ce640beb623c2b8cb4dd16b2fdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8jjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tjcvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:11Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.726996 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.727029 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:11 crc 
kubenswrapper[4903]: I1202 22:59:11.727038 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.727053 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.727065 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:11Z","lastTransitionTime":"2025-12-02T22:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.751861 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ab90b8-4bb9-418c-8b55-19c4c10edec7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a714b8675c95dcc7244a032c26beb2ff6a53022
b19005aee6b56bf770989858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T22:59:02Z\\\",\\\"message\\\":\\\"ernal_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-daemon]} name:Service_openshift-machine-config-operator/machine-config-daemon_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.43:8798: 10.217.4.43:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {a36f6289-d09f-43f8-8a8a-c9d2cc11eb0d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 22:59:02.134707 6885 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-daemon]} name:Service_openshift-machine-config-operator/machine-config-daemon_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.43:8798: 10.217.4.43:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {a36f6289-d09f-43f8-8a8a-c9d2cc11eb0d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1202 22:59:02.134793 6885 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:59:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pz9ff_openshift-ovn-kubernetes(99ab90b8-4bb9-418c-8b55-19c4c10edec7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l8lp5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pz9ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:11Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.767824 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vhm2r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9f99205-0a32-4a74-ad9e-c0a79aa66d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ec8e3f725a9f00d1ee57d57d56f537077b5004dbd9314aae3ec85e3a09287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv57c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vhm2r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:11Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.783294 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8vx6p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7bdaec4-1392-4f87-ba0b-f53c76e47cf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zx478\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:15Z\\\"}}\" for pod 
\"openshift-multus\"/\"network-metrics-daemon-8vx6p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:11Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.799952 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95d5b727-e078-49d6-8652-555f87754b11\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b127904027e78efb8b5d36a8e045431aa85698b906f13c0445a7897ec658247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76a804238f87925935809a984824635f5b4e6dba3af1297e3a80f9b0cbf5039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76a804238f87925935809a984824635f5b4e6dba3af1297e3a80f9b0cbf5039\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:11Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.829125 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:11Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.835253 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.835308 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.835369 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.835393 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.835462 4903 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:11Z","lastTransitionTime":"2025-12-02T22:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.864008 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf29d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f517677-195d-4d43-ba46-e0f0aede7011\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e29b8190dbd82f4e2f0dd2fd54e14fdd5304fd8ae1c1025d7a985e6a1b95f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dt6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf29d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:11Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.898498 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dbb1c8d-0a1a-4f7e-b853-f326cbde77e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf7df453a9e604c1c1b6288f69033a838f9072c366f10b88faae5d32d1d8dcec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e815e640c6ff8db33804d3de5d4825fde92c1200a31db299b7ab968b6a33edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767b5b251b8c5f68bf74aa9865b664293850c5f2047cba9c04c3b19f45c93265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7946bb5bf584893edf9a166c06fdc937e203f3
00b4d638b30ffcfa39b98b58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ec34d2590c36fdbe9a7a39e9149dd5f87250ac93b758d2fb0815828a44f949\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27f98db1683931c1d65960a08ef2b7fe0d6d97d362f5435dbc20346c9ee79abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27f98db1683931c1d65960a08ef2b7fe0d6d97d362f5435dbc20346c9ee79abb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://029a3ad2cdb54925355602dbd246a088a9d5bb0b20bc6da2945cfc6843711803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://029a3ad2cdb54925355602dbd246a088a9d5bb0b20bc6da2945cfc6843711803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://07af7bc135d35fbb1e3f2e2be87aa112ceaadb2cb846b8af1583d8aabca3d83f\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07af7bc135d35fbb1e3f2e2be87aa112ceaadb2cb846b8af1583d8aabca3d83f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:11Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.914143 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:11Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.926999 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ef11e3b-7757-4286-9684-6d4cd3bf924f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97f98df328fe250afd95449707372ef1aea43e355ee20edf8366f510a888b9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rf98z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-snl4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:11Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.938173 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.938221 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.938231 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.938253 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.938265 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:11Z","lastTransitionTime":"2025-12-02T22:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.940823 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6f86958-173b-4746-8493-f8fe5f70a897\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eb794b37fcff5149e08f759cd8d4ad6b06be8e9a29cf0e42d04a963533a552a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5db0a9a2093fb2426b4354d9b9ad46f0e83f73472c38ff37ee159dcaea383fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5s5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:58:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xqdfr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:11Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.957266 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2983aeb0-e38e-4be7-88d4-2a2fc720f014\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T22:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T22:57:59Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 22:57:54.046508 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 22:57:54.048559 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-21787677/tls.crt::/tmp/serving-cert-21787677/tls.key\\\\\\\"\\\\nI1202 22:57:59.878445 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 22:57:59.882352 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 22:57:59.882377 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 22:57:59.882397 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 22:57:59.882404 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 22:57:59.889228 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 22:57:59.889260 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1202 22:57:59.889256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1202 22:57:59.889268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 22:57:59.889291 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 22:57:59.889295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 22:57:59.889299 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 22:57:59.889302 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1202 22:57:59.892242 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:57:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T22:57:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T22:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T22:57:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:11Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:11 crc kubenswrapper[4903]: I1202 22:59:11.973498 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T22:58:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71a8643ea97f6edef5a93aa0369656bd7d630d555bd3a7c891b71305353faeff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T22:58:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T22:59:11Z is after 2025-08-24T17:21:41Z" Dec 02 22:59:12 crc kubenswrapper[4903]: I1202 22:59:12.041443 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:12 crc kubenswrapper[4903]: I1202 22:59:12.041527 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:12 crc kubenswrapper[4903]: I1202 22:59:12.041546 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:12 crc kubenswrapper[4903]: I1202 22:59:12.041572 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:12 crc kubenswrapper[4903]: I1202 22:59:12.041594 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:12Z","lastTransitionTime":"2025-12-02T22:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 02 22:59:12 crc kubenswrapper[4903]: I1202 22:59:12.144034 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 22:59:12 crc kubenswrapper[4903]: I1202 22:59:12.144082 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 22:59:12 crc kubenswrapper[4903]: I1202 22:59:12.144098 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 22:59:12 crc kubenswrapper[4903]: I1202 22:59:12.144116 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 22:59:12 crc kubenswrapper[4903]: I1202 22:59:12.144128 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:12Z","lastTransitionTime":"2025-12-02T22:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
[the preceding five-record block (four "Recording event message for node" events and the setters.go:603 "Node became not ready" condition) recurs 14 more times at roughly 100 ms intervals, from I1202 22:59:12.248765 through I1202 22:59:13.592740; only the timestamps differ]
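Every "Node became not ready" record above republishes the same Ready condition with reason KubeletNotReady. A minimal client-go sketch (hypothetical, not from the log; it assumes a kubeconfig at the default path, and the node name "crc" is taken from the records) that fetches the node and prints the condition the kubelet keeps writing:

package main

import (
	"context"
	"fmt"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Load the default kubeconfig (~/.kube/config); adjust as needed.
	config, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	clientset, err := kubernetes.NewForConfig(config)
	if err != nil {
		panic(err)
	}

	node, err := clientset.CoreV1().Nodes().Get(context.TODO(), "crc", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	for _, c := range node.Status.Conditions {
		if c.Type == corev1.NodeReady {
			// Mirrors the condition serialized in the setters.go:603 records.
			fmt.Printf("Ready=%s reason=%s message=%q\n", c.Status, c.Reason, c.Message)
		}
	}
}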
Dec 02 22:59:13 crc kubenswrapper[4903]: I1202 22:59:13.611993 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 22:59:13 crc kubenswrapper[4903]: I1202 22:59:13.611989 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 22:59:13 crc kubenswrapper[4903]: I1202 22:59:13.612191 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p"
Dec 02 22:59:13 crc kubenswrapper[4903]: I1202 22:59:13.612191 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 22:59:13 crc kubenswrapper[4903]: E1202 22:59:13.612361 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 02 22:59:13 crc kubenswrapper[4903]: E1202 22:59:13.612530 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 02 22:59:13 crc kubenswrapper[4903]: E1202 22:59:13.612744 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4"
Dec 02 22:59:13 crc kubenswrapper[4903]: E1202 22:59:13.613049 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
[the NodeHasSufficientMemory/NodeHasNoDiskPressure/NodeHasSufficientPID/NodeNotReady event block and "Node became not ready" condition recur 19 more times at roughly 100 ms intervals, from I1202 22:59:13.695742 through I1202 22:59:15.592825; only the timestamps differ]
Dec 02 22:59:15 crc kubenswrapper[4903]: I1202 22:59:15.611637 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 22:59:15 crc kubenswrapper[4903]: I1202 22:59:15.611782 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 22:59:15 crc kubenswrapper[4903]: I1202 22:59:15.611897 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 22:59:15 crc kubenswrapper[4903]: E1202 22:59:15.612006 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
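All of the pod-sync failures above cite one root cause: no CNI configuration file in /etc/kubernetes/cni/net.d/. A small Go sketch (assumes it is run on the node itself; the path is copied from the message) that checks whether the directory actually contains any network configuration:

package main

import (
	"fmt"
	"os"
)

func main() {
	// Directory named in the NetworkPluginNotReady message above.
	const cniDir = "/etc/kubernetes/cni/net.d/"
	entries, err := os.ReadDir(cniDir)
	if err != nil {
		fmt.Println("cannot read", cniDir, "->", err)
		return
	}
	if len(entries) == 0 {
		// An empty directory is consistent with the kubelet staying NotReady.
		fmt.Println(cniDir, "exists but contains no CNI configuration files")
		return
	}
	for _, e := range entries {
		fmt.Println(e.Name())
	}
}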
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:59:15 crc kubenswrapper[4903]: I1202 22:59:15.612047 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:59:15 crc kubenswrapper[4903]: E1202 22:59:15.612284 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:59:15 crc kubenswrapper[4903]: E1202 22:59:15.612362 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:59:15 crc kubenswrapper[4903]: E1202 22:59:15.612293 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:59:15 crc kubenswrapper[4903]: I1202 22:59:15.697407 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:15 crc kubenswrapper[4903]: I1202 22:59:15.697477 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:15 crc kubenswrapper[4903]: I1202 22:59:15.697505 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:15 crc kubenswrapper[4903]: I1202 22:59:15.697537 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:15 crc kubenswrapper[4903]: I1202 22:59:15.697556 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:15Z","lastTransitionTime":"2025-12-02T22:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:15 crc kubenswrapper[4903]: I1202 22:59:15.799339 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:15 crc kubenswrapper[4903]: I1202 22:59:15.799379 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:15 crc kubenswrapper[4903]: I1202 22:59:15.799390 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:15 crc kubenswrapper[4903]: I1202 22:59:15.799404 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:15 crc kubenswrapper[4903]: I1202 22:59:15.799413 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:15Z","lastTransitionTime":"2025-12-02T22:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:15 crc kubenswrapper[4903]: I1202 22:59:15.902225 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:15 crc kubenswrapper[4903]: I1202 22:59:15.902258 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:15 crc kubenswrapper[4903]: I1202 22:59:15.902266 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:15 crc kubenswrapper[4903]: I1202 22:59:15.902280 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:15 crc kubenswrapper[4903]: I1202 22:59:15.902289 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:15Z","lastTransitionTime":"2025-12-02T22:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.005380 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.005482 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.005509 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.005547 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.005572 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:16Z","lastTransitionTime":"2025-12-02T22:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.109632 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.109766 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.109785 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.109808 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.109830 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:16Z","lastTransitionTime":"2025-12-02T22:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.212829 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.212900 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.212924 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.212956 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.212982 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:16Z","lastTransitionTime":"2025-12-02T22:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.317017 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.317087 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.317105 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.317132 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.317154 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:16Z","lastTransitionTime":"2025-12-02T22:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.420516 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.420612 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.420630 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.420684 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.420704 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:16Z","lastTransitionTime":"2025-12-02T22:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.524229 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.524315 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.524334 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.524369 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.524390 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:16Z","lastTransitionTime":"2025-12-02T22:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.627359 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.627410 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.627425 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.627444 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.627458 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:16Z","lastTransitionTime":"2025-12-02T22:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.628722 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.628753 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.628766 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.628781 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.628803 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T22:59:16Z","lastTransitionTime":"2025-12-02T22:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.685384 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2x9f"] Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.686099 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2x9f" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.689146 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.689779 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.690084 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.690506 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.755139 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/51fd5d47-d525-4e81-9d73-2e5df25cd59c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-f2x9f\" (UID: \"51fd5d47-d525-4e81-9d73-2e5df25cd59c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2x9f" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.755211 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/51fd5d47-d525-4e81-9d73-2e5df25cd59c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-f2x9f\" (UID: \"51fd5d47-d525-4e81-9d73-2e5df25cd59c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2x9f" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.755291 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51fd5d47-d525-4e81-9d73-2e5df25cd59c-serving-cert\") pod 
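The reconciler records above and below follow a fixed per-volume sequence: "operationExecutor.VerifyControllerAttachedVolume started", then "operationExecutor.MountVolume started", then "MountVolume.SetUp succeeded". A throwaway Go sketch (a hypothetical helper, not an OpenShift tool; it reads a saved journal excerpt on stdin) that pairs started/succeeded records by volume name, so a volume stuck at "started" stands out:

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

func main() {
	// Patterns match the klog quoting style of the kubelet records above,
	// where the volume name appears as \"name\" inside the message.
	started := regexp.MustCompile(`operationExecutor\.MountVolume started for volume \\?"([^"\\]+)`)
	succeeded := regexp.MustCompile(`MountVolume\.SetUp succeeded for volume \\?"([^"\\]+)`)

	state := map[string]string{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be very long
	for sc.Scan() {
		line := sc.Text()
		if m := started.FindStringSubmatch(line); m != nil {
			if _, seen := state[m[1]]; !seen {
				state[m[1]] = "started"
			}
		}
		if m := succeeded.FindStringSubmatch(line); m != nil {
			state[m[1]] = "succeeded"
		}
	}
	for vol, st := range state {
		fmt.Printf("%-30s %s\n", vol, st)
	}
}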
\"cluster-version-operator-5c965bbfc6-f2x9f\" (UID: \"51fd5d47-d525-4e81-9d73-2e5df25cd59c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2x9f" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.755361 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/51fd5d47-d525-4e81-9d73-2e5df25cd59c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-f2x9f\" (UID: \"51fd5d47-d525-4e81-9d73-2e5df25cd59c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2x9f" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.755391 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/51fd5d47-d525-4e81-9d73-2e5df25cd59c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-f2x9f\" (UID: \"51fd5d47-d525-4e81-9d73-2e5df25cd59c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2x9f" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.757975 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=73.757955941 podStartE2EDuration="1m13.757955941s" podCreationTimestamp="2025-12-02 22:58:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:59:16.756873854 +0000 UTC m=+95.465428177" watchObservedRunningTime="2025-12-02 22:59:16.757955941 +0000 UTC m=+95.466510264" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.778217 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=48.778194468 podStartE2EDuration="48.778194468s" podCreationTimestamp="2025-12-02 22:58:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:59:16.775747248 +0000 UTC m=+95.484301601" watchObservedRunningTime="2025-12-02 22:59:16.778194468 +0000 UTC m=+95.486748761" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.856514 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/51fd5d47-d525-4e81-9d73-2e5df25cd59c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-f2x9f\" (UID: \"51fd5d47-d525-4e81-9d73-2e5df25cd59c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2x9f" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.856622 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51fd5d47-d525-4e81-9d73-2e5df25cd59c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-f2x9f\" (UID: \"51fd5d47-d525-4e81-9d73-2e5df25cd59c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2x9f" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.856734 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/51fd5d47-d525-4e81-9d73-2e5df25cd59c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-f2x9f\" (UID: \"51fd5d47-d525-4e81-9d73-2e5df25cd59c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2x9f" Dec 02 22:59:16 crc 
kubenswrapper[4903]: I1202 22:59:16.856789 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/51fd5d47-d525-4e81-9d73-2e5df25cd59c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-f2x9f\" (UID: \"51fd5d47-d525-4e81-9d73-2e5df25cd59c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2x9f" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.856836 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/51fd5d47-d525-4e81-9d73-2e5df25cd59c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-f2x9f\" (UID: \"51fd5d47-d525-4e81-9d73-2e5df25cd59c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2x9f" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.856964 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/51fd5d47-d525-4e81-9d73-2e5df25cd59c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-f2x9f\" (UID: \"51fd5d47-d525-4e81-9d73-2e5df25cd59c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2x9f" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.857597 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/51fd5d47-d525-4e81-9d73-2e5df25cd59c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-f2x9f\" (UID: \"51fd5d47-d525-4e81-9d73-2e5df25cd59c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2x9f" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.858232 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/51fd5d47-d525-4e81-9d73-2e5df25cd59c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-f2x9f\" (UID: \"51fd5d47-d525-4e81-9d73-2e5df25cd59c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2x9f" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.891230 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51fd5d47-d525-4e81-9d73-2e5df25cd59c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-f2x9f\" (UID: \"51fd5d47-d525-4e81-9d73-2e5df25cd59c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2x9f" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.897439 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/51fd5d47-d525-4e81-9d73-2e5df25cd59c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-f2x9f\" (UID: \"51fd5d47-d525-4e81-9d73-2e5df25cd59c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2x9f" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.909389 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-s4nbg" podStartSLOduration=75.909363108 podStartE2EDuration="1m15.909363108s" podCreationTimestamp="2025-12-02 22:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:59:16.871803066 +0000 UTC m=+95.580357399" watchObservedRunningTime="2025-12-02 22:59:16.909363108 +0000 UTC m=+95.617917421" Dec 02 22:59:16 crc 
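
Editor's note: the reconciler_common.go / operation_generator.go entries trace kubelet's two-phase volume handling for the cluster-version-operator pod: VerifyControllerAttachedVolume first, then MountVolume, then MountVolume.SetUp succeeded per volume (the host paths finish immediately; the secret and projected volumes complete a few dozen milliseconds later). A toy sketch of that progression, with invented state names rather than kubelet's actual types:

```go
package main

import "fmt"

// volState is an invented stand-in for the reconciler's per-volume progress.
type volState int

const (
	attachVerified volState = iota // VerifyControllerAttachedVolume done
	mountStarted                   // MountVolume started
	setUpSucceeded                 // MountVolume.SetUp succeeded
)

func main() {
	volumes := []string{
		"etc-cvo-updatepayloads", "service-ca", "serving-cert",
		"kube-api-access", "etc-ssl-certs",
	}
	state := make(map[string]volState)
	// Phase 1: confirm each volume is attached before any mount work starts.
	for _, v := range volumes {
		state[v] = attachVerified
		fmt.Printf("VerifyControllerAttachedVolume started for volume %q\n", v)
	}
	// Phase 2: mount; SetUp is what materializes secrets, configmaps and
	// host paths into the pod's volumes directory.
	for _, v := range volumes {
		state[v] = mountStarted
		state[v] = setUpSucceeded
		fmt.Printf("MountVolume.SetUp succeeded for volume %q\n", v)
	}
}
```
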
kubenswrapper[4903]: I1202 22:59:16.925542 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-tjcvg" podStartSLOduration=75.925520033 podStartE2EDuration="1m15.925520033s" podCreationTimestamp="2025-12-02 22:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:59:16.911403219 +0000 UTC m=+95.619957542" watchObservedRunningTime="2025-12-02 22:59:16.925520033 +0000 UTC m=+95.634074326" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.925772 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-vhm2r" podStartSLOduration=76.925764359 podStartE2EDuration="1m16.925764359s" podCreationTimestamp="2025-12-02 22:58:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:59:16.925237526 +0000 UTC m=+95.633791839" watchObservedRunningTime="2025-12-02 22:59:16.925764359 +0000 UTC m=+95.634318662" Dec 02 22:59:16 crc kubenswrapper[4903]: I1202 22:59:16.993184 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=25.993155929 podStartE2EDuration="25.993155929s" podCreationTimestamp="2025-12-02 22:58:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:59:16.977484536 +0000 UTC m=+95.686038869" watchObservedRunningTime="2025-12-02 22:59:16.993155929 +0000 UTC m=+95.701710252" Dec 02 22:59:17 crc kubenswrapper[4903]: I1202 22:59:17.008265 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2x9f" Dec 02 22:59:17 crc kubenswrapper[4903]: I1202 22:59:17.027779 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=8.027757347 podStartE2EDuration="8.027757347s" podCreationTimestamp="2025-12-02 22:59:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:59:17.025861009 +0000 UTC m=+95.734415382" watchObservedRunningTime="2025-12-02 22:59:17.027757347 +0000 UTC m=+95.736311640" Dec 02 22:59:17 crc kubenswrapper[4903]: I1202 22:59:17.028936 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-zf29d" podStartSLOduration=78.028927226 podStartE2EDuration="1m18.028927226s" podCreationTimestamp="2025-12-02 22:57:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:59:17.004983165 +0000 UTC m=+95.713537458" watchObservedRunningTime="2025-12-02 22:59:17.028927226 +0000 UTC m=+95.737481529" Dec 02 22:59:17 crc kubenswrapper[4903]: I1202 22:59:17.056143 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podStartSLOduration=76.056119938 podStartE2EDuration="1m16.056119938s" podCreationTimestamp="2025-12-02 22:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:59:17.056108908 +0000 UTC m=+95.764663201" watchObservedRunningTime="2025-12-02 22:59:17.056119938 +0000 UTC m=+95.764674211" Dec 02 22:59:17 crc kubenswrapper[4903]: I1202 22:59:17.067932 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xqdfr" podStartSLOduration=76.067914663 podStartE2EDuration="1m16.067914663s" podCreationTimestamp="2025-12-02 22:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:59:17.06735774 +0000 UTC m=+95.775912023" watchObservedRunningTime="2025-12-02 22:59:17.067914663 +0000 UTC m=+95.776468946" Dec 02 22:59:17 crc kubenswrapper[4903]: I1202 22:59:17.082011 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=77.081986366 podStartE2EDuration="1m17.081986366s" podCreationTimestamp="2025-12-02 22:58:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:59:17.081975116 +0000 UTC m=+95.790529399" watchObservedRunningTime="2025-12-02 22:59:17.081986366 +0000 UTC m=+95.790540649" Dec 02 22:59:17 crc kubenswrapper[4903]: I1202 22:59:17.611979 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:59:17 crc kubenswrapper[4903]: I1202 22:59:17.612055 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:59:17 crc kubenswrapper[4903]: I1202 22:59:17.612090 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:59:17 crc kubenswrapper[4903]: E1202 22:59:17.612150 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:59:17 crc kubenswrapper[4903]: E1202 22:59:17.612332 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:59:17 crc kubenswrapper[4903]: I1202 22:59:17.612367 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:59:17 crc kubenswrapper[4903]: E1202 22:59:17.612789 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:59:17 crc kubenswrapper[4903]: E1202 22:59:17.612873 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:59:17 crc kubenswrapper[4903]: I1202 22:59:17.613344 4903 scope.go:117] "RemoveContainer" containerID="4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858" Dec 02 22:59:17 crc kubenswrapper[4903]: E1202 22:59:17.613539 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pz9ff_openshift-ovn-kubernetes(99ab90b8-4bb9-418c-8b55-19c4c10edec7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" Dec 02 22:59:17 crc kubenswrapper[4903]: I1202 22:59:17.783190 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2x9f" event={"ID":"51fd5d47-d525-4e81-9d73-2e5df25cd59c","Type":"ContainerStarted","Data":"c1fb5c7e296e6cdda489f7b4e7806f32e98371e3f97bd89721911edcdd0a341a"} Dec 02 22:59:17 crc kubenswrapper[4903]: I1202 22:59:17.783265 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2x9f" event={"ID":"51fd5d47-d525-4e81-9d73-2e5df25cd59c","Type":"ContainerStarted","Data":"0cc42cc5e5138e84015226f8d22a73fc4f07c4c93528e0c7c9ae71745b35bc5c"} Dec 02 22:59:17 crc kubenswrapper[4903]: I1202 22:59:17.806492 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2x9f" podStartSLOduration=77.806460772 podStartE2EDuration="1m17.806460772s" podCreationTimestamp="2025-12-02 22:58:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:59:17.803813776 +0000 UTC m=+96.512368069" watchObservedRunningTime="2025-12-02 22:59:17.806460772 +0000 UTC m=+96.515015095" Dec 02 22:59:19 crc kubenswrapper[4903]: I1202 22:59:19.081273 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7bdaec4-1392-4f87-ba0b-f53c76e47cf4-metrics-certs\") pod \"network-metrics-daemon-8vx6p\" (UID: \"e7bdaec4-1392-4f87-ba0b-f53c76e47cf4\") " pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:59:19 crc kubenswrapper[4903]: E1202 22:59:19.081425 4903 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 22:59:19 crc kubenswrapper[4903]: E1202 22:59:19.081575 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7bdaec4-1392-4f87-ba0b-f53c76e47cf4-metrics-certs podName:e7bdaec4-1392-4f87-ba0b-f53c76e47cf4 nodeName:}" failed. No retries permitted until 2025-12-02 23:00:23.081538595 +0000 UTC m=+161.790092918 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7bdaec4-1392-4f87-ba0b-f53c76e47cf4-metrics-certs") pod "network-metrics-daemon-8vx6p" (UID: "e7bdaec4-1392-4f87-ba0b-f53c76e47cf4") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 22:59:19 crc kubenswrapper[4903]: I1202 22:59:19.612298 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:59:19 crc kubenswrapper[4903]: I1202 22:59:19.612335 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:59:19 crc kubenswrapper[4903]: E1202 22:59:19.613029 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:59:19 crc kubenswrapper[4903]: I1202 22:59:19.612212 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:59:19 crc kubenswrapper[4903]: E1202 22:59:19.613257 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:59:19 crc kubenswrapper[4903]: I1202 22:59:19.612335 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:59:19 crc kubenswrapper[4903]: E1202 22:59:19.613461 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:59:19 crc kubenswrapper[4903]: E1202 22:59:19.613603 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:59:21 crc kubenswrapper[4903]: I1202 22:59:21.611492 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:59:21 crc kubenswrapper[4903]: I1202 22:59:21.611527 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:59:21 crc kubenswrapper[4903]: I1202 22:59:21.611495 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:59:21 crc kubenswrapper[4903]: I1202 22:59:21.611579 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:59:21 crc kubenswrapper[4903]: E1202 22:59:21.613818 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:59:21 crc kubenswrapper[4903]: E1202 22:59:21.614062 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:59:21 crc kubenswrapper[4903]: E1202 22:59:21.614204 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:59:21 crc kubenswrapper[4903]: E1202 22:59:21.614338 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:59:23 crc kubenswrapper[4903]: I1202 22:59:23.612383 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:59:23 crc kubenswrapper[4903]: I1202 22:59:23.612451 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:59:23 crc kubenswrapper[4903]: I1202 22:59:23.612590 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:59:23 crc kubenswrapper[4903]: E1202 22:59:23.612734 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:59:23 crc kubenswrapper[4903]: I1202 22:59:23.612801 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:59:23 crc kubenswrapper[4903]: E1202 22:59:23.613009 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:59:23 crc kubenswrapper[4903]: E1202 22:59:23.613057 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:59:23 crc kubenswrapper[4903]: E1202 22:59:23.613326 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:59:25 crc kubenswrapper[4903]: I1202 22:59:25.612206 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:59:25 crc kubenswrapper[4903]: I1202 22:59:25.612350 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:59:25 crc kubenswrapper[4903]: E1202 22:59:25.613208 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:59:25 crc kubenswrapper[4903]: I1202 22:59:25.613614 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:59:25 crc kubenswrapper[4903]: E1202 22:59:25.613637 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:59:25 crc kubenswrapper[4903]: I1202 22:59:25.613767 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:59:25 crc kubenswrapper[4903]: E1202 22:59:25.614197 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:59:25 crc kubenswrapper[4903]: E1202 22:59:25.614600 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:59:27 crc kubenswrapper[4903]: I1202 22:59:27.611643 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:59:27 crc kubenswrapper[4903]: I1202 22:59:27.611772 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:59:27 crc kubenswrapper[4903]: I1202 22:59:27.611884 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:59:27 crc kubenswrapper[4903]: E1202 22:59:27.611868 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:59:27 crc kubenswrapper[4903]: E1202 22:59:27.612031 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:59:27 crc kubenswrapper[4903]: E1202 22:59:27.612156 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:59:27 crc kubenswrapper[4903]: I1202 22:59:27.612817 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:59:27 crc kubenswrapper[4903]: E1202 22:59:27.613007 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:59:28 crc kubenswrapper[4903]: I1202 22:59:28.612983 4903 scope.go:117] "RemoveContainer" containerID="4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858" Dec 02 22:59:28 crc kubenswrapper[4903]: E1202 22:59:28.613206 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pz9ff_openshift-ovn-kubernetes(99ab90b8-4bb9-418c-8b55-19c4c10edec7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" Dec 02 22:59:29 crc kubenswrapper[4903]: I1202 22:59:29.611768 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:59:29 crc kubenswrapper[4903]: I1202 22:59:29.611844 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:59:29 crc kubenswrapper[4903]: I1202 22:59:29.611810 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:59:29 crc kubenswrapper[4903]: E1202 22:59:29.612000 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:59:29 crc kubenswrapper[4903]: I1202 22:59:29.612034 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:59:29 crc kubenswrapper[4903]: E1202 22:59:29.612243 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:59:29 crc kubenswrapper[4903]: E1202 22:59:29.612339 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:59:29 crc kubenswrapper[4903]: E1202 22:59:29.612470 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:59:31 crc kubenswrapper[4903]: I1202 22:59:31.612087 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:59:31 crc kubenswrapper[4903]: I1202 22:59:31.612279 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:59:31 crc kubenswrapper[4903]: I1202 22:59:31.612128 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:59:31 crc kubenswrapper[4903]: E1202 22:59:31.614781 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:59:31 crc kubenswrapper[4903]: I1202 22:59:31.614865 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:59:31 crc kubenswrapper[4903]: E1202 22:59:31.614950 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:59:31 crc kubenswrapper[4903]: E1202 22:59:31.615045 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:59:31 crc kubenswrapper[4903]: E1202 22:59:31.615141 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:59:33 crc kubenswrapper[4903]: I1202 22:59:33.725535 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:59:33 crc kubenswrapper[4903]: I1202 22:59:33.725766 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:59:33 crc kubenswrapper[4903]: E1202 22:59:33.725998 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:59:33 crc kubenswrapper[4903]: I1202 22:59:33.726142 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:59:33 crc kubenswrapper[4903]: I1202 22:59:33.726436 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:59:33 crc kubenswrapper[4903]: E1202 22:59:33.726342 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:59:33 crc kubenswrapper[4903]: E1202 22:59:33.726555 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:59:33 crc kubenswrapper[4903]: E1202 22:59:33.726868 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:59:34 crc kubenswrapper[4903]: I1202 22:59:34.846549 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s4nbg_a689512c-b6fd-4ffe-af54-dbb8f45ab9e5/kube-multus/1.log" Dec 02 22:59:34 crc kubenswrapper[4903]: I1202 22:59:34.847388 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s4nbg_a689512c-b6fd-4ffe-af54-dbb8f45ab9e5/kube-multus/0.log" Dec 02 22:59:34 crc kubenswrapper[4903]: I1202 22:59:34.847453 4903 generic.go:334] "Generic (PLEG): container finished" podID="a689512c-b6fd-4ffe-af54-dbb8f45ab9e5" containerID="940f76af114eeec075ebe1a320bc817b99c5b9f687325fa076bd2fffa0291c50" exitCode=1 Dec 02 22:59:34 crc kubenswrapper[4903]: I1202 22:59:34.847495 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s4nbg" event={"ID":"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5","Type":"ContainerDied","Data":"940f76af114eeec075ebe1a320bc817b99c5b9f687325fa076bd2fffa0291c50"} Dec 02 22:59:34 crc kubenswrapper[4903]: I1202 22:59:34.847540 4903 scope.go:117] "RemoveContainer" containerID="4f856b0456cb9be1ba231cec47e70122f254e8e7aa51be21e63439356e5d609b" Dec 02 22:59:34 crc kubenswrapper[4903]: I1202 22:59:34.848279 4903 scope.go:117] "RemoveContainer" containerID="940f76af114eeec075ebe1a320bc817b99c5b9f687325fa076bd2fffa0291c50" Dec 02 22:59:34 crc kubenswrapper[4903]: E1202 22:59:34.849086 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-s4nbg_openshift-multus(a689512c-b6fd-4ffe-af54-dbb8f45ab9e5)\"" pod="openshift-multus/multus-s4nbg" podUID="a689512c-b6fd-4ffe-af54-dbb8f45ab9e5" Dec 02 22:59:35 crc kubenswrapper[4903]: I1202 22:59:35.612260 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:59:35 crc kubenswrapper[4903]: E1202 22:59:35.612424 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:59:35 crc kubenswrapper[4903]: I1202 22:59:35.612289 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:59:35 crc kubenswrapper[4903]: I1202 22:59:35.612494 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:59:35 crc kubenswrapper[4903]: I1202 22:59:35.612513 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:59:35 crc kubenswrapper[4903]: E1202 22:59:35.612676 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:59:35 crc kubenswrapper[4903]: E1202 22:59:35.612714 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:59:35 crc kubenswrapper[4903]: E1202 22:59:35.612779 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:59:35 crc kubenswrapper[4903]: I1202 22:59:35.851804 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s4nbg_a689512c-b6fd-4ffe-af54-dbb8f45ab9e5/kube-multus/1.log" Dec 02 22:59:37 crc kubenswrapper[4903]: I1202 22:59:37.613032 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:59:37 crc kubenswrapper[4903]: I1202 22:59:37.613137 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:59:37 crc kubenswrapper[4903]: I1202 22:59:37.613078 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:59:37 crc kubenswrapper[4903]: E1202 22:59:37.613262 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:59:37 crc kubenswrapper[4903]: E1202 22:59:37.613409 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:59:37 crc kubenswrapper[4903]: E1202 22:59:37.613524 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:59:37 crc kubenswrapper[4903]: I1202 22:59:37.613626 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:59:37 crc kubenswrapper[4903]: E1202 22:59:37.613745 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:59:39 crc kubenswrapper[4903]: I1202 22:59:39.612305 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:59:39 crc kubenswrapper[4903]: I1202 22:59:39.612346 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:59:39 crc kubenswrapper[4903]: I1202 22:59:39.612559 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:59:39 crc kubenswrapper[4903]: E1202 22:59:39.612551 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:59:39 crc kubenswrapper[4903]: I1202 22:59:39.612628 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:59:39 crc kubenswrapper[4903]: E1202 22:59:39.612826 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:59:39 crc kubenswrapper[4903]: E1202 22:59:39.612920 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:59:39 crc kubenswrapper[4903]: E1202 22:59:39.613056 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:59:39 crc kubenswrapper[4903]: I1202 22:59:39.614479 4903 scope.go:117] "RemoveContainer" containerID="4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858" Dec 02 22:59:39 crc kubenswrapper[4903]: E1202 22:59:39.614731 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pz9ff_openshift-ovn-kubernetes(99ab90b8-4bb9-418c-8b55-19c4c10edec7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" Dec 02 22:59:41 crc kubenswrapper[4903]: E1202 22:59:41.583044 4903 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 02 22:59:41 crc kubenswrapper[4903]: I1202 22:59:41.612000 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:59:41 crc kubenswrapper[4903]: I1202 22:59:41.612062 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:59:41 crc kubenswrapper[4903]: I1202 22:59:41.612130 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:59:41 crc kubenswrapper[4903]: I1202 22:59:41.612072 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:59:41 crc kubenswrapper[4903]: E1202 22:59:41.614329 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:59:41 crc kubenswrapper[4903]: E1202 22:59:41.614524 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:59:41 crc kubenswrapper[4903]: E1202 22:59:41.614747 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:59:41 crc kubenswrapper[4903]: E1202 22:59:41.614854 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:59:41 crc kubenswrapper[4903]: E1202 22:59:41.707901 4903 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 22:59:43 crc kubenswrapper[4903]: I1202 22:59:43.611936 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:59:43 crc kubenswrapper[4903]: I1202 22:59:43.611982 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:59:43 crc kubenswrapper[4903]: I1202 22:59:43.612011 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:59:43 crc kubenswrapper[4903]: I1202 22:59:43.611986 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:59:43 crc kubenswrapper[4903]: E1202 22:59:43.612189 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:59:43 crc kubenswrapper[4903]: E1202 22:59:43.612319 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:59:43 crc kubenswrapper[4903]: E1202 22:59:43.612744 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:59:43 crc kubenswrapper[4903]: E1202 22:59:43.612503 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:59:45 crc kubenswrapper[4903]: I1202 22:59:45.612144 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:59:45 crc kubenswrapper[4903]: I1202 22:59:45.612269 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:59:45 crc kubenswrapper[4903]: E1202 22:59:45.612328 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:59:45 crc kubenswrapper[4903]: I1202 22:59:45.612342 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:59:45 crc kubenswrapper[4903]: E1202 22:59:45.612401 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:59:45 crc kubenswrapper[4903]: E1202 22:59:45.612578 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:59:45 crc kubenswrapper[4903]: I1202 22:59:45.612838 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:59:45 crc kubenswrapper[4903]: E1202 22:59:45.612971 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:59:46 crc kubenswrapper[4903]: E1202 22:59:46.709741 4903 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 22:59:47 crc kubenswrapper[4903]: I1202 22:59:47.611883 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:59:47 crc kubenswrapper[4903]: I1202 22:59:47.611943 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:59:47 crc kubenswrapper[4903]: E1202 22:59:47.612129 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:59:47 crc kubenswrapper[4903]: I1202 22:59:47.612208 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:59:47 crc kubenswrapper[4903]: E1202 22:59:47.612417 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:59:47 crc kubenswrapper[4903]: E1202 22:59:47.612605 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:59:47 crc kubenswrapper[4903]: I1202 22:59:47.612824 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:59:47 crc kubenswrapper[4903]: E1202 22:59:47.613012 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:59:48 crc kubenswrapper[4903]: I1202 22:59:48.612176 4903 scope.go:117] "RemoveContainer" containerID="940f76af114eeec075ebe1a320bc817b99c5b9f687325fa076bd2fffa0291c50" Dec 02 22:59:48 crc kubenswrapper[4903]: I1202 22:59:48.899252 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s4nbg_a689512c-b6fd-4ffe-af54-dbb8f45ab9e5/kube-multus/1.log" Dec 02 22:59:48 crc kubenswrapper[4903]: I1202 22:59:48.899326 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s4nbg" event={"ID":"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5","Type":"ContainerStarted","Data":"96b14dfaa362061cf0dca6828c4c6b34d8f09e071f419d628af8ccd6eb23c267"} Dec 02 22:59:49 crc kubenswrapper[4903]: I1202 22:59:49.612347 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:59:49 crc kubenswrapper[4903]: I1202 22:59:49.612519 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:59:49 crc kubenswrapper[4903]: E1202 22:59:49.612530 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:59:49 crc kubenswrapper[4903]: I1202 22:59:49.612740 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:59:49 crc kubenswrapper[4903]: E1202 22:59:49.612804 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:59:49 crc kubenswrapper[4903]: E1202 22:59:49.612934 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:59:49 crc kubenswrapper[4903]: I1202 22:59:49.613347 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:59:49 crc kubenswrapper[4903]: E1202 22:59:49.613606 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:59:50 crc kubenswrapper[4903]: I1202 22:59:50.613215 4903 scope.go:117] "RemoveContainer" containerID="4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858" Dec 02 22:59:50 crc kubenswrapper[4903]: I1202 22:59:50.908126 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz9ff_99ab90b8-4bb9-418c-8b55-19c4c10edec7/ovnkube-controller/3.log" Dec 02 22:59:50 crc kubenswrapper[4903]: I1202 22:59:50.911101 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" event={"ID":"99ab90b8-4bb9-418c-8b55-19c4c10edec7","Type":"ContainerStarted","Data":"54d31014c7c8946fbf989d0cd38305c7fec4b3660d701d89c831eb6602fb4318"} Dec 02 22:59:50 crc kubenswrapper[4903]: I1202 22:59:50.911624 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 22:59:51 crc kubenswrapper[4903]: I1202 22:59:51.506418 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" podStartSLOduration=110.506388187 podStartE2EDuration="1m50.506388187s" podCreationTimestamp="2025-12-02 22:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 22:59:50.984957291 +0000 UTC m=+129.693511574" watchObservedRunningTime="2025-12-02 22:59:51.506388187 +0000 UTC m=+130.214942460" Dec 02 22:59:51 crc kubenswrapper[4903]: I1202 22:59:51.508109 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8vx6p"] Dec 02 22:59:51 crc kubenswrapper[4903]: I1202 22:59:51.508318 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:59:51 crc kubenswrapper[4903]: E1202 22:59:51.508429 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:59:51 crc kubenswrapper[4903]: I1202 22:59:51.612056 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:59:51 crc kubenswrapper[4903]: I1202 22:59:51.612110 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:59:51 crc kubenswrapper[4903]: I1202 22:59:51.612164 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:59:51 crc kubenswrapper[4903]: E1202 22:59:51.613335 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:59:51 crc kubenswrapper[4903]: E1202 22:59:51.613683 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:59:51 crc kubenswrapper[4903]: E1202 22:59:51.613990 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:59:51 crc kubenswrapper[4903]: E1202 22:59:51.710389 4903 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 22:59:53 crc kubenswrapper[4903]: I1202 22:59:53.611947 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:59:53 crc kubenswrapper[4903]: I1202 22:59:53.612001 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:59:53 crc kubenswrapper[4903]: I1202 22:59:53.612046 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:59:53 crc kubenswrapper[4903]: I1202 22:59:53.611971 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:59:53 crc kubenswrapper[4903]: E1202 22:59:53.612215 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:59:53 crc kubenswrapper[4903]: E1202 22:59:53.612370 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:59:53 crc kubenswrapper[4903]: E1202 22:59:53.612507 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:59:53 crc kubenswrapper[4903]: E1202 22:59:53.612599 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:59:55 crc kubenswrapper[4903]: I1202 22:59:55.612304 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:59:55 crc kubenswrapper[4903]: I1202 22:59:55.612359 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:59:55 crc kubenswrapper[4903]: E1202 22:59:55.612890 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 22:59:55 crc kubenswrapper[4903]: I1202 22:59:55.612550 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:59:55 crc kubenswrapper[4903]: I1202 22:59:55.612516 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:59:55 crc kubenswrapper[4903]: E1202 22:59:55.613065 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 22:59:55 crc kubenswrapper[4903]: E1202 22:59:55.613305 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 22:59:55 crc kubenswrapper[4903]: E1202 22:59:55.613368 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8vx6p" podUID="e7bdaec4-1392-4f87-ba0b-f53c76e47cf4" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.587440 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.612277 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.612310 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.612340 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.612307 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.614304 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.614312 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.617095 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.617212 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.617277 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.619009 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.643164 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9mlb7"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.643975 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9mlb7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.647462 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.647784 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.648065 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.648229 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.650908 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ffb4r"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.651736 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.653465 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.661033 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lsq8g"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.661785 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsq8g" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.663622 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.665988 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lvnkb"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.666473 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-lvnkb" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.667089 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.667243 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.667456 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.668061 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.668369 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.669900 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.670133 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.675171 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.675532 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.675994 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.676475 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.676758 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.676901 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.677432 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.677482 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.677516 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.677447 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.677853 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.677971 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.678064 4903 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.680899 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pnzr7"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.681475 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwh9f"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.681791 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-m4vjn"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.681976 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pnzr7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.682268 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m4vjn" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.682390 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwh9f" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.682633 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.689377 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.690625 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.692323 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.701746 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.715691 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.716062 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.717353 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.717739 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-s6vbc"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.717963 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.718381 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.719015 4903 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.719202 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.719362 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.720582 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.720781 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.721013 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.721013 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.721154 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kjsgz"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.721245 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.721308 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.721366 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.721247 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.721425 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.721564 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.721682 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-s6vbc" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.722570 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-jr8l4"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.723125 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-zzfgm"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.723584 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-kvzf7"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.726458 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvzlq"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.726719 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.726982 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvzlq" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.727317 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.725317 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kjsgz" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.727709 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-kvzf7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.728116 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.725390 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-zzfgm" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.725357 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jr8l4" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.733291 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p7r9\" (UniqueName: \"kubernetes.io/projected/8309b769-d214-4620-a891-ae8ae36630cd-kube-api-access-9p7r9\") pod \"apiserver-7bbb656c7d-lsq8g\" (UID: \"8309b769-d214-4620-a891-ae8ae36630cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsq8g" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.733338 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e2656003-d0f9-4d65-8744-1e394226a359-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-wvzlq\" (UID: \"e2656003-d0f9-4d65-8744-1e394226a359\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvzlq" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.733364 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g5q5\" (UniqueName: \"kubernetes.io/projected/929c87de-7dc4-40bf-9fe9-85229a13dca1-kube-api-access-8g5q5\") pod \"console-operator-58897d9998-kvzf7\" (UID: \"929c87de-7dc4-40bf-9fe9-85229a13dca1\") " pod="openshift-console-operator/console-operator-58897d9998-kvzf7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.733387 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da60db71-3425-41be-8c51-99e7326f559f-config\") pod \"machine-approver-56656f9798-m4vjn\" (UID: \"da60db71-3425-41be-8c51-99e7326f559f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m4vjn" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.733410 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c5439fc-734f-4efa-838f-68900d9453ec-config\") pod \"controller-manager-879f6c89f-9mlb7\" (UID: \"2c5439fc-734f-4efa-838f-68900d9453ec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9mlb7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.733427 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2656003-d0f9-4d65-8744-1e394226a359-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-wvzlq\" (UID: \"e2656003-d0f9-4d65-8744-1e394226a359\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvzlq" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.733448 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8309b769-d214-4620-a891-ae8ae36630cd-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lsq8g\" (UID: \"8309b769-d214-4620-a891-ae8ae36630cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsq8g" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.733464 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8309b769-d214-4620-a891-ae8ae36630cd-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lsq8g\" (UID: 
\"8309b769-d214-4620-a891-ae8ae36630cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsq8g" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.733482 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tkpl\" (UniqueName: \"kubernetes.io/projected/2c5439fc-734f-4efa-838f-68900d9453ec-kube-api-access-7tkpl\") pod \"controller-manager-879f6c89f-9mlb7\" (UID: \"2c5439fc-734f-4efa-838f-68900d9453ec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9mlb7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.733525 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/da60db71-3425-41be-8c51-99e7326f559f-auth-proxy-config\") pod \"machine-approver-56656f9798-m4vjn\" (UID: \"da60db71-3425-41be-8c51-99e7326f559f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m4vjn" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.733548 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8309b769-d214-4620-a891-ae8ae36630cd-encryption-config\") pod \"apiserver-7bbb656c7d-lsq8g\" (UID: \"8309b769-d214-4620-a891-ae8ae36630cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsq8g" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.733574 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec790a4a-c562-4035-ba10-9ac0c8baf6c6-config\") pod \"machine-api-operator-5694c8668f-lvnkb\" (UID: \"ec790a4a-c562-4035-ba10-9ac0c8baf6c6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lvnkb" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.733592 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j672h\" (UniqueName: \"kubernetes.io/projected/ec790a4a-c562-4035-ba10-9ac0c8baf6c6-kube-api-access-j672h\") pod \"machine-api-operator-5694c8668f-lvnkb\" (UID: \"ec790a4a-c562-4035-ba10-9ac0c8baf6c6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lvnkb" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.733626 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8309b769-d214-4620-a891-ae8ae36630cd-audit-dir\") pod \"apiserver-7bbb656c7d-lsq8g\" (UID: \"8309b769-d214-4620-a891-ae8ae36630cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsq8g" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.733667 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/929c87de-7dc4-40bf-9fe9-85229a13dca1-trusted-ca\") pod \"console-operator-58897d9998-kvzf7\" (UID: \"929c87de-7dc4-40bf-9fe9-85229a13dca1\") " pod="openshift-console-operator/console-operator-58897d9998-kvzf7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.733701 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2656003-d0f9-4d65-8744-1e394226a359-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-wvzlq\" (UID: \"e2656003-d0f9-4d65-8744-1e394226a359\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvzlq" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.733725 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2c5439fc-734f-4efa-838f-68900d9453ec-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9mlb7\" (UID: \"2c5439fc-734f-4efa-838f-68900d9453ec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9mlb7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.733741 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wds8q\" (UniqueName: \"kubernetes.io/projected/7c7544e0-8c46-4f13-b8e4-a8aa2071a9f5-kube-api-access-wds8q\") pod \"downloads-7954f5f757-zzfgm\" (UID: \"7c7544e0-8c46-4f13-b8e4-a8aa2071a9f5\") " pod="openshift-console/downloads-7954f5f757-zzfgm" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.733761 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c5439fc-734f-4efa-838f-68900d9453ec-serving-cert\") pod \"controller-manager-879f6c89f-9mlb7\" (UID: \"2c5439fc-734f-4efa-838f-68900d9453ec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9mlb7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.733778 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/da60db71-3425-41be-8c51-99e7326f559f-machine-approver-tls\") pod \"machine-approver-56656f9798-m4vjn\" (UID: \"da60db71-3425-41be-8c51-99e7326f559f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m4vjn" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.733793 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/896cc150-3871-46e2-b1f5-c31c25c54014-client-ca\") pod \"route-controller-manager-6576b87f9c-pnzr7\" (UID: \"896cc150-3871-46e2-b1f5-c31c25c54014\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pnzr7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.733811 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ec790a4a-c562-4035-ba10-9ac0c8baf6c6-images\") pod \"machine-api-operator-5694c8668f-lvnkb\" (UID: \"ec790a4a-c562-4035-ba10-9ac0c8baf6c6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lvnkb" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.733826 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/929c87de-7dc4-40bf-9fe9-85229a13dca1-serving-cert\") pod \"console-operator-58897d9998-kvzf7\" (UID: \"929c87de-7dc4-40bf-9fe9-85229a13dca1\") " pod="openshift-console-operator/console-operator-58897d9998-kvzf7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.733854 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skx5p\" (UniqueName: \"kubernetes.io/projected/896cc150-3871-46e2-b1f5-c31c25c54014-kube-api-access-skx5p\") pod \"route-controller-manager-6576b87f9c-pnzr7\" (UID: \"896cc150-3871-46e2-b1f5-c31c25c54014\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pnzr7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.733873 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/929c87de-7dc4-40bf-9fe9-85229a13dca1-config\") pod \"console-operator-58897d9998-kvzf7\" (UID: \"929c87de-7dc4-40bf-9fe9-85229a13dca1\") " pod="openshift-console-operator/console-operator-58897d9998-kvzf7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.733898 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8309b769-d214-4620-a891-ae8ae36630cd-serving-cert\") pod \"apiserver-7bbb656c7d-lsq8g\" (UID: \"8309b769-d214-4620-a891-ae8ae36630cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsq8g" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.733921 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c5439fc-734f-4efa-838f-68900d9453ec-client-ca\") pod \"controller-manager-879f6c89f-9mlb7\" (UID: \"2c5439fc-734f-4efa-838f-68900d9453ec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9mlb7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.733940 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec790a4a-c562-4035-ba10-9ac0c8baf6c6-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lvnkb\" (UID: \"ec790a4a-c562-4035-ba10-9ac0c8baf6c6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lvnkb" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.733961 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8309b769-d214-4620-a891-ae8ae36630cd-etcd-client\") pod \"apiserver-7bbb656c7d-lsq8g\" (UID: \"8309b769-d214-4620-a891-ae8ae36630cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsq8g" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.733978 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/896cc150-3871-46e2-b1f5-c31c25c54014-config\") pod \"route-controller-manager-6576b87f9c-pnzr7\" (UID: \"896cc150-3871-46e2-b1f5-c31c25c54014\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pnzr7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.733998 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8309b769-d214-4620-a891-ae8ae36630cd-audit-policies\") pod \"apiserver-7bbb656c7d-lsq8g\" (UID: \"8309b769-d214-4620-a891-ae8ae36630cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsq8g" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.734019 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/896cc150-3871-46e2-b1f5-c31c25c54014-serving-cert\") pod \"route-controller-manager-6576b87f9c-pnzr7\" (UID: \"896cc150-3871-46e2-b1f5-c31c25c54014\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pnzr7" Dec 02 22:59:57 crc 
kubenswrapper[4903]: I1202 22:59:57.734044 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9mt4\" (UniqueName: \"kubernetes.io/projected/da60db71-3425-41be-8c51-99e7326f559f-kube-api-access-d9mt4\") pod \"machine-approver-56656f9798-m4vjn\" (UID: \"da60db71-3425-41be-8c51-99e7326f559f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m4vjn" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.737366 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.738312 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.738604 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.740564 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.742501 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.742855 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.746022 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.753044 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.754775 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.754831 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.754872 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.754937 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.755073 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.755160 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-tjt6x"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.755802 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-tjt6x" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.756201 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lsq8g"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.757553 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.757640 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.757700 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.757723 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.757773 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.757845 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.757956 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.758261 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.758377 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.760596 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.768296 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.768352 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.769888 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.770937 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.771104 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.771216 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.783448 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.789839 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7lfrz"] Dec 02 
22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.798294 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwh9f"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.798339 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ffb4r"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.798432 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.800434 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-s6vbc"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.800766 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-pkxw5"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.801182 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-pkxw5" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.801594 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-wdkzh"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.802061 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.802329 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.802907 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-wdkzh" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.804211 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-blqhv"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.804392 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.804582 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.804764 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-blqhv" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.805167 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tbhzh"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.808394 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.808590 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.808706 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.808861 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.809024 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.809157 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.809258 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.809322 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.809478 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.809265 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.809756 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.809889 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.817099 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lgdbg"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.817791 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-flrnk"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.818245 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-lgdbg" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.818408 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzc84"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.818626 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tbhzh" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.818675 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.818709 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-flrnk" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.818976 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzc84" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.818899 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xkmcq"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.819786 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-9spjq"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.820226 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-9spjq" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.820436 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-xkmcq" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.821460 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zh8xk"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.822270 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zh8xk" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.824563 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lvnkb"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.824599 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411925-h29nj"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.825078 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.825357 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411925-h29nj" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.826265 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw8gk"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.826987 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw8gk" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.831552 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l7hl6"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.831984 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l7hl6" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.833597 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lq9hn"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.834384 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f9gdf"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.834556 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c5439fc-734f-4efa-838f-68900d9453ec-config\") pod \"controller-manager-879f6c89f-9mlb7\" (UID: \"2c5439fc-734f-4efa-838f-68900d9453ec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9mlb7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.834601 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da60db71-3425-41be-8c51-99e7326f559f-config\") pod \"machine-approver-56656f9798-m4vjn\" (UID: \"da60db71-3425-41be-8c51-99e7326f559f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m4vjn" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.834631 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1c4b26c-f83a-45c7-92e9-da974b729b51-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-hwh9f\" (UID: \"e1c4b26c-f83a-45c7-92e9-da974b729b51\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwh9f" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.834810 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lq9hn" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.834927 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f9gdf" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.836029 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da60db71-3425-41be-8c51-99e7326f559f-config\") pod \"machine-approver-56656f9798-m4vjn\" (UID: \"da60db71-3425-41be-8c51-99e7326f559f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m4vjn" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.836287 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c5439fc-734f-4efa-838f-68900d9453ec-config\") pod \"controller-manager-879f6c89f-9mlb7\" (UID: \"2c5439fc-734f-4efa-838f-68900d9453ec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9mlb7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.836317 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.836709 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2656003-d0f9-4d65-8744-1e394226a359-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-wvzlq\" (UID: \"e2656003-d0f9-4d65-8744-1e394226a359\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvzlq" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.836755 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/245513eb-a40a-4eff-80ed-c7070eb94f8a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-s6vbc\" (UID: \"245513eb-a40a-4eff-80ed-c7070eb94f8a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s6vbc" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.836774 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/46bb090b-9216-49d5-91d7-43cf3ee3bf4a-encryption-config\") pod \"apiserver-76f77b778f-ffb4r\" (UID: \"46bb090b-9216-49d5-91d7-43cf3ee3bf4a\") " pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.836790 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdk7t\" (UniqueName: \"kubernetes.io/projected/245513eb-a40a-4eff-80ed-c7070eb94f8a-kube-api-access-zdk7t\") pod \"authentication-operator-69f744f599-s6vbc\" (UID: \"245513eb-a40a-4eff-80ed-c7070eb94f8a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s6vbc" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.836808 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/46bb090b-9216-49d5-91d7-43cf3ee3bf4a-etcd-serving-ca\") pod \"apiserver-76f77b778f-ffb4r\" (UID: \"46bb090b-9216-49d5-91d7-43cf3ee3bf4a\") " pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.836827 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tkpl\" (UniqueName: 
\"kubernetes.io/projected/2c5439fc-734f-4efa-838f-68900d9453ec-kube-api-access-7tkpl\") pod \"controller-manager-879f6c89f-9mlb7\" (UID: \"2c5439fc-734f-4efa-838f-68900d9453ec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9mlb7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.836843 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8309b769-d214-4620-a891-ae8ae36630cd-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lsq8g\" (UID: \"8309b769-d214-4620-a891-ae8ae36630cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsq8g" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.836860 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8309b769-d214-4620-a891-ae8ae36630cd-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lsq8g\" (UID: \"8309b769-d214-4620-a891-ae8ae36630cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsq8g" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.836877 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbh9c\" (UniqueName: \"kubernetes.io/projected/46bb090b-9216-49d5-91d7-43cf3ee3bf4a-kube-api-access-hbh9c\") pod \"apiserver-76f77b778f-ffb4r\" (UID: \"46bb090b-9216-49d5-91d7-43cf3ee3bf4a\") " pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.836897 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/da60db71-3425-41be-8c51-99e7326f559f-auth-proxy-config\") pod \"machine-approver-56656f9798-m4vjn\" (UID: \"da60db71-3425-41be-8c51-99e7326f559f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m4vjn" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.836925 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46bb090b-9216-49d5-91d7-43cf3ee3bf4a-config\") pod \"apiserver-76f77b778f-ffb4r\" (UID: \"46bb090b-9216-49d5-91d7-43cf3ee3bf4a\") " pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.836940 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/245513eb-a40a-4eff-80ed-c7070eb94f8a-config\") pod \"authentication-operator-69f744f599-s6vbc\" (UID: \"245513eb-a40a-4eff-80ed-c7070eb94f8a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s6vbc" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.836959 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8309b769-d214-4620-a891-ae8ae36630cd-encryption-config\") pod \"apiserver-7bbb656c7d-lsq8g\" (UID: \"8309b769-d214-4620-a891-ae8ae36630cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsq8g" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.836976 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltsts\" (UniqueName: \"kubernetes.io/projected/a8d2e09c-a22c-4581-830c-ad25ff946f4a-kube-api-access-ltsts\") pod \"cluster-samples-operator-665b6dd947-kjsgz\" (UID: \"a8d2e09c-a22c-4581-830c-ad25ff946f4a\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kjsgz" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.836996 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k8bv\" (UniqueName: \"kubernetes.io/projected/e1c4b26c-f83a-45c7-92e9-da974b729b51-kube-api-access-7k8bv\") pod \"openshift-apiserver-operator-796bbdcf4f-hwh9f\" (UID: \"e1c4b26c-f83a-45c7-92e9-da974b729b51\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwh9f" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.837015 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f2fec0cf-8c59-4ec1-ae18-91b0081c60bb-available-featuregates\") pod \"openshift-config-operator-7777fb866f-jr8l4\" (UID: \"f2fec0cf-8c59-4ec1-ae18-91b0081c60bb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jr8l4" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.837033 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46bb090b-9216-49d5-91d7-43cf3ee3bf4a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ffb4r\" (UID: \"46bb090b-9216-49d5-91d7-43cf3ee3bf4a\") " pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.837055 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j672h\" (UniqueName: \"kubernetes.io/projected/ec790a4a-c562-4035-ba10-9ac0c8baf6c6-kube-api-access-j672h\") pod \"machine-api-operator-5694c8668f-lvnkb\" (UID: \"ec790a4a-c562-4035-ba10-9ac0c8baf6c6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lvnkb" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.837080 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec790a4a-c562-4035-ba10-9ac0c8baf6c6-config\") pod \"machine-api-operator-5694c8668f-lvnkb\" (UID: \"ec790a4a-c562-4035-ba10-9ac0c8baf6c6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lvnkb" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.837105 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46bb090b-9216-49d5-91d7-43cf3ee3bf4a-serving-cert\") pod \"apiserver-76f77b778f-ffb4r\" (UID: \"46bb090b-9216-49d5-91d7-43cf3ee3bf4a\") " pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.837123 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/46bb090b-9216-49d5-91d7-43cf3ee3bf4a-audit-dir\") pod \"apiserver-76f77b778f-ffb4r\" (UID: \"46bb090b-9216-49d5-91d7-43cf3ee3bf4a\") " pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.837140 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8309b769-d214-4620-a891-ae8ae36630cd-audit-dir\") pod \"apiserver-7bbb656c7d-lsq8g\" (UID: \"8309b769-d214-4620-a891-ae8ae36630cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsq8g" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.837163 4903 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/929c87de-7dc4-40bf-9fe9-85229a13dca1-trusted-ca\") pod \"console-operator-58897d9998-kvzf7\" (UID: \"929c87de-7dc4-40bf-9fe9-85229a13dca1\") " pod="openshift-console-operator/console-operator-58897d9998-kvzf7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.837184 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tbxg\" (UniqueName: \"kubernetes.io/projected/e2656003-d0f9-4d65-8744-1e394226a359-kube-api-access-5tbxg\") pod \"cluster-image-registry-operator-dc59b4c8b-wvzlq\" (UID: \"e2656003-d0f9-4d65-8744-1e394226a359\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvzlq" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.837203 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2656003-d0f9-4d65-8744-1e394226a359-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-wvzlq\" (UID: \"e2656003-d0f9-4d65-8744-1e394226a359\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvzlq" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.837219 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/46bb090b-9216-49d5-91d7-43cf3ee3bf4a-image-import-ca\") pod \"apiserver-76f77b778f-ffb4r\" (UID: \"46bb090b-9216-49d5-91d7-43cf3ee3bf4a\") " pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.837236 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2c5439fc-734f-4efa-838f-68900d9453ec-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9mlb7\" (UID: \"2c5439fc-734f-4efa-838f-68900d9453ec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9mlb7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.837258 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wds8q\" (UniqueName: \"kubernetes.io/projected/7c7544e0-8c46-4f13-b8e4-a8aa2071a9f5-kube-api-access-wds8q\") pod \"downloads-7954f5f757-zzfgm\" (UID: \"7c7544e0-8c46-4f13-b8e4-a8aa2071a9f5\") " pod="openshift-console/downloads-7954f5f757-zzfgm" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.837284 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/46bb090b-9216-49d5-91d7-43cf3ee3bf4a-etcd-client\") pod \"apiserver-76f77b778f-ffb4r\" (UID: \"46bb090b-9216-49d5-91d7-43cf3ee3bf4a\") " pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.837332 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/896cc150-3871-46e2-b1f5-c31c25c54014-client-ca\") pod \"route-controller-manager-6576b87f9c-pnzr7\" (UID: \"896cc150-3871-46e2-b1f5-c31c25c54014\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pnzr7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.837358 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2c5439fc-734f-4efa-838f-68900d9453ec-serving-cert\") pod \"controller-manager-879f6c89f-9mlb7\" (UID: \"2c5439fc-734f-4efa-838f-68900d9453ec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9mlb7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.837380 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/da60db71-3425-41be-8c51-99e7326f559f-machine-approver-tls\") pod \"machine-approver-56656f9798-m4vjn\" (UID: \"da60db71-3425-41be-8c51-99e7326f559f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m4vjn" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.837405 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2ljq\" (UniqueName: \"kubernetes.io/projected/f2fec0cf-8c59-4ec1-ae18-91b0081c60bb-kube-api-access-t2ljq\") pod \"openshift-config-operator-7777fb866f-jr8l4\" (UID: \"f2fec0cf-8c59-4ec1-ae18-91b0081c60bb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jr8l4" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.837428 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ec790a4a-c562-4035-ba10-9ac0c8baf6c6-images\") pod \"machine-api-operator-5694c8668f-lvnkb\" (UID: \"ec790a4a-c562-4035-ba10-9ac0c8baf6c6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lvnkb" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.837450 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/929c87de-7dc4-40bf-9fe9-85229a13dca1-serving-cert\") pod \"console-operator-58897d9998-kvzf7\" (UID: \"929c87de-7dc4-40bf-9fe9-85229a13dca1\") " pod="openshift-console-operator/console-operator-58897d9998-kvzf7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.837471 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2fec0cf-8c59-4ec1-ae18-91b0081c60bb-serving-cert\") pod \"openshift-config-operator-7777fb866f-jr8l4\" (UID: \"f2fec0cf-8c59-4ec1-ae18-91b0081c60bb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jr8l4" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.837493 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skx5p\" (UniqueName: \"kubernetes.io/projected/896cc150-3871-46e2-b1f5-c31c25c54014-kube-api-access-skx5p\") pod \"route-controller-manager-6576b87f9c-pnzr7\" (UID: \"896cc150-3871-46e2-b1f5-c31c25c54014\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pnzr7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.837520 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/929c87de-7dc4-40bf-9fe9-85229a13dca1-config\") pod \"console-operator-58897d9998-kvzf7\" (UID: \"929c87de-7dc4-40bf-9fe9-85229a13dca1\") " pod="openshift-console-operator/console-operator-58897d9998-kvzf7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.837572 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8309b769-d214-4620-a891-ae8ae36630cd-serving-cert\") pod \"apiserver-7bbb656c7d-lsq8g\" (UID: 
\"8309b769-d214-4620-a891-ae8ae36630cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsq8g" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.837594 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec790a4a-c562-4035-ba10-9ac0c8baf6c6-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lvnkb\" (UID: \"ec790a4a-c562-4035-ba10-9ac0c8baf6c6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lvnkb" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.837615 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/245513eb-a40a-4eff-80ed-c7070eb94f8a-serving-cert\") pod \"authentication-operator-69f744f599-s6vbc\" (UID: \"245513eb-a40a-4eff-80ed-c7070eb94f8a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s6vbc" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.837638 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c5439fc-734f-4efa-838f-68900d9453ec-client-ca\") pod \"controller-manager-879f6c89f-9mlb7\" (UID: \"2c5439fc-734f-4efa-838f-68900d9453ec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9mlb7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.837679 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8309b769-d214-4620-a891-ae8ae36630cd-etcd-client\") pod \"apiserver-7bbb656c7d-lsq8g\" (UID: \"8309b769-d214-4620-a891-ae8ae36630cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsq8g" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.837700 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/896cc150-3871-46e2-b1f5-c31c25c54014-config\") pod \"route-controller-manager-6576b87f9c-pnzr7\" (UID: \"896cc150-3871-46e2-b1f5-c31c25c54014\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pnzr7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.837722 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/46bb090b-9216-49d5-91d7-43cf3ee3bf4a-node-pullsecrets\") pod \"apiserver-76f77b778f-ffb4r\" (UID: \"46bb090b-9216-49d5-91d7-43cf3ee3bf4a\") " pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.837744 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8309b769-d214-4620-a891-ae8ae36630cd-audit-policies\") pod \"apiserver-7bbb656c7d-lsq8g\" (UID: \"8309b769-d214-4620-a891-ae8ae36630cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsq8g" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.837768 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/46bb090b-9216-49d5-91d7-43cf3ee3bf4a-audit\") pod \"apiserver-76f77b778f-ffb4r\" (UID: \"46bb090b-9216-49d5-91d7-43cf3ee3bf4a\") " pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.837929 4903 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1c4b26c-f83a-45c7-92e9-da974b729b51-config\") pod \"openshift-apiserver-operator-796bbdcf4f-hwh9f\" (UID: \"e1c4b26c-f83a-45c7-92e9-da974b729b51\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwh9f" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.838015 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9mt4\" (UniqueName: \"kubernetes.io/projected/da60db71-3425-41be-8c51-99e7326f559f-kube-api-access-d9mt4\") pod \"machine-approver-56656f9798-m4vjn\" (UID: \"da60db71-3425-41be-8c51-99e7326f559f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m4vjn" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.838079 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/896cc150-3871-46e2-b1f5-c31c25c54014-serving-cert\") pod \"route-controller-manager-6576b87f9c-pnzr7\" (UID: \"896cc150-3871-46e2-b1f5-c31c25c54014\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pnzr7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.838101 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a8d2e09c-a22c-4581-830c-ad25ff946f4a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kjsgz\" (UID: \"a8d2e09c-a22c-4581-830c-ad25ff946f4a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kjsgz" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.838466 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p7r9\" (UniqueName: \"kubernetes.io/projected/8309b769-d214-4620-a891-ae8ae36630cd-kube-api-access-9p7r9\") pod \"apiserver-7bbb656c7d-lsq8g\" (UID: \"8309b769-d214-4620-a891-ae8ae36630cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsq8g" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.838499 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e2656003-d0f9-4d65-8744-1e394226a359-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-wvzlq\" (UID: \"e2656003-d0f9-4d65-8744-1e394226a359\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvzlq" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.838554 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/245513eb-a40a-4eff-80ed-c7070eb94f8a-service-ca-bundle\") pod \"authentication-operator-69f744f599-s6vbc\" (UID: \"245513eb-a40a-4eff-80ed-c7070eb94f8a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s6vbc" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.838583 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g5q5\" (UniqueName: \"kubernetes.io/projected/929c87de-7dc4-40bf-9fe9-85229a13dca1-kube-api-access-8g5q5\") pod \"console-operator-58897d9998-kvzf7\" (UID: \"929c87de-7dc4-40bf-9fe9-85229a13dca1\") " pod="openshift-console-operator/console-operator-58897d9998-kvzf7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.839518 4903 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/da60db71-3425-41be-8c51-99e7326f559f-auth-proxy-config\") pod \"machine-approver-56656f9798-m4vjn\" (UID: \"da60db71-3425-41be-8c51-99e7326f559f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m4vjn" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.839963 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ec790a4a-c562-4035-ba10-9ac0c8baf6c6-images\") pod \"machine-api-operator-5694c8668f-lvnkb\" (UID: \"ec790a4a-c562-4035-ba10-9ac0c8baf6c6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lvnkb" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.840708 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2c5439fc-734f-4efa-838f-68900d9453ec-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9mlb7\" (UID: \"2c5439fc-734f-4efa-838f-68900d9453ec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9mlb7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.841412 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8309b769-d214-4620-a891-ae8ae36630cd-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lsq8g\" (UID: \"8309b769-d214-4620-a891-ae8ae36630cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsq8g" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.841841 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8309b769-d214-4620-a891-ae8ae36630cd-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lsq8g\" (UID: \"8309b769-d214-4620-a891-ae8ae36630cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsq8g" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.841996 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8309b769-d214-4620-a891-ae8ae36630cd-audit-dir\") pod \"apiserver-7bbb656c7d-lsq8g\" (UID: \"8309b769-d214-4620-a891-ae8ae36630cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsq8g" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.842381 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/929c87de-7dc4-40bf-9fe9-85229a13dca1-config\") pod \"console-operator-58897d9998-kvzf7\" (UID: \"929c87de-7dc4-40bf-9fe9-85229a13dca1\") " pod="openshift-console-operator/console-operator-58897d9998-kvzf7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.842446 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2656003-d0f9-4d65-8744-1e394226a359-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-wvzlq\" (UID: \"e2656003-d0f9-4d65-8744-1e394226a359\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvzlq" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.842959 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/929c87de-7dc4-40bf-9fe9-85229a13dca1-trusted-ca\") pod \"console-operator-58897d9998-kvzf7\" (UID: \"929c87de-7dc4-40bf-9fe9-85229a13dca1\") " 
pod="openshift-console-operator/console-operator-58897d9998-kvzf7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.844087 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c5439fc-734f-4efa-838f-68900d9453ec-client-ca\") pod \"controller-manager-879f6c89f-9mlb7\" (UID: \"2c5439fc-734f-4efa-838f-68900d9453ec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9mlb7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.845081 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/896cc150-3871-46e2-b1f5-c31c25c54014-config\") pod \"route-controller-manager-6576b87f9c-pnzr7\" (UID: \"896cc150-3871-46e2-b1f5-c31c25c54014\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pnzr7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.845750 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/896cc150-3871-46e2-b1f5-c31c25c54014-client-ca\") pod \"route-controller-manager-6576b87f9c-pnzr7\" (UID: \"896cc150-3871-46e2-b1f5-c31c25c54014\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pnzr7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.845809 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-svc7d"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.845936 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec790a4a-c562-4035-ba10-9ac0c8baf6c6-config\") pod \"machine-api-operator-5694c8668f-lvnkb\" (UID: \"ec790a4a-c562-4035-ba10-9ac0c8baf6c6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lvnkb" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.846333 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8309b769-d214-4620-a891-ae8ae36630cd-audit-policies\") pod \"apiserver-7bbb656c7d-lsq8g\" (UID: \"8309b769-d214-4620-a891-ae8ae36630cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsq8g" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.846606 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-svc7d" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.847964 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8309b769-d214-4620-a891-ae8ae36630cd-etcd-client\") pod \"apiserver-7bbb656c7d-lsq8g\" (UID: \"8309b769-d214-4620-a891-ae8ae36630cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsq8g" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.848036 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec790a4a-c562-4035-ba10-9ac0c8baf6c6-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lvnkb\" (UID: \"ec790a4a-c562-4035-ba10-9ac0c8baf6c6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lvnkb" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.848971 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/da60db71-3425-41be-8c51-99e7326f559f-machine-approver-tls\") pod \"machine-approver-56656f9798-m4vjn\" (UID: \"da60db71-3425-41be-8c51-99e7326f559f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m4vjn" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.849905 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-s2ftx"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.850573 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2ftx" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.850806 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5ltff"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.851384 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5ltff" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.851779 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/929c87de-7dc4-40bf-9fe9-85229a13dca1-serving-cert\") pod \"console-operator-58897d9998-kvzf7\" (UID: \"929c87de-7dc4-40bf-9fe9-85229a13dca1\") " pod="openshift-console-operator/console-operator-58897d9998-kvzf7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.852950 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cttnd"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.853333 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cttnd" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.853906 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e2656003-d0f9-4d65-8744-1e394226a359-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-wvzlq\" (UID: \"e2656003-d0f9-4d65-8744-1e394226a359\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvzlq" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.854637 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.854867 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-n67tl"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.855231 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8309b769-d214-4620-a891-ae8ae36630cd-serving-cert\") pod \"apiserver-7bbb656c7d-lsq8g\" (UID: \"8309b769-d214-4620-a891-ae8ae36630cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsq8g" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.856424 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gdr6w"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.857096 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gdr6w" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.857318 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-n67tl" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.858102 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7pd8"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.858658 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7pd8" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.863459 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8309b769-d214-4620-a891-ae8ae36630cd-encryption-config\") pod \"apiserver-7bbb656c7d-lsq8g\" (UID: \"8309b769-d214-4620-a891-ae8ae36630cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsq8g" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.872817 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c5439fc-734f-4efa-838f-68900d9453ec-serving-cert\") pod \"controller-manager-879f6c89f-9mlb7\" (UID: \"2c5439fc-734f-4efa-838f-68900d9453ec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9mlb7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.873046 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-lsg46"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.875145 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-lsg46" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.876265 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vvxf5"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.876984 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvzlq"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.877170 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vvxf5" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.878222 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kjsgz"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.878786 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/896cc150-3871-46e2-b1f5-c31c25c54014-serving-cert\") pod \"route-controller-manager-6576b87f9c-pnzr7\" (UID: \"896cc150-3871-46e2-b1f5-c31c25c54014\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pnzr7" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.879537 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-tjt6x"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.880261 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-jr8l4"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.881315 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-wdkzh"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.882351 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pnzr7"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.883350 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-blqhv"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.884394 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9mlb7"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.885466 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-zzfgm"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.887079 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-flrnk"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.888314 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7lfrz"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.889270 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-wwsfb"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.890402 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-m4d4c"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.890601 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-wwsfb" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.891038 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-m4d4c" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.891470 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-9spjq"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.892882 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzc84"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.894271 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tbhzh"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.895222 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.895926 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vvxf5"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.897743 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411925-h29nj"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.898872 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lgdbg"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.900180 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-kvzf7"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.901351 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-svc7d"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.902495 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5ltff"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.905193 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f9gdf"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.906388 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw8gk"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.907677 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-lsg46"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.908868 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-n67tl"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.910316 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gdr6w"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.911869 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7pd8"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.912937 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l7hl6"] Dec 02 
22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.914009 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-s2ftx"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.914965 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.915005 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xkmcq"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.916191 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lq9hn"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.917152 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cttnd"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.918410 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zh8xk"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.920318 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-m4d4c"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.921529 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wwsfb"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.922536 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-hhjwp"] Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.923225 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-hhjwp" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.935191 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.940002 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/46bb090b-9216-49d5-91d7-43cf3ee3bf4a-etcd-client\") pod \"apiserver-76f77b778f-ffb4r\" (UID: \"46bb090b-9216-49d5-91d7-43cf3ee3bf4a\") " pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.940041 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e426530-7180-403a-9810-6612e19b1110-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-l7hl6\" (UID: \"2e426530-7180-403a-9810-6612e19b1110\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l7hl6" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.940072 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2fec0cf-8c59-4ec1-ae18-91b0081c60bb-serving-cert\") pod \"openshift-config-operator-7777fb866f-jr8l4\" (UID: \"f2fec0cf-8c59-4ec1-ae18-91b0081c60bb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jr8l4" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.940094 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2ljq\" (UniqueName: 
\"kubernetes.io/projected/f2fec0cf-8c59-4ec1-ae18-91b0081c60bb-kube-api-access-t2ljq\") pod \"openshift-config-operator-7777fb866f-jr8l4\" (UID: \"f2fec0cf-8c59-4ec1-ae18-91b0081c60bb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jr8l4" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.940138 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7nvk\" (UniqueName: \"kubernetes.io/projected/c964b5fc-f02f-481c-abd3-2f567b9e98e6-kube-api-access-d7nvk\") pod \"dns-operator-744455d44c-wdkzh\" (UID: \"c964b5fc-f02f-481c-abd3-2f567b9e98e6\") " pod="openshift-dns-operator/dns-operator-744455d44c-wdkzh" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.940166 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nqjz\" (UniqueName: \"kubernetes.io/projected/2e426530-7180-403a-9810-6612e19b1110-kube-api-access-2nqjz\") pod \"openshift-controller-manager-operator-756b6f6bc6-l7hl6\" (UID: \"2e426530-7180-403a-9810-6612e19b1110\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l7hl6" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.940194 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5j2f\" (UniqueName: \"kubernetes.io/projected/d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745-kube-api-access-r5j2f\") pod \"router-default-5444994796-pkxw5\" (UID: \"d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745\") " pod="openshift-ingress/router-default-5444994796-pkxw5" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.940220 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/245513eb-a40a-4eff-80ed-c7070eb94f8a-serving-cert\") pod \"authentication-operator-69f744f599-s6vbc\" (UID: \"245513eb-a40a-4eff-80ed-c7070eb94f8a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s6vbc" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.940282 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/46bb090b-9216-49d5-91d7-43cf3ee3bf4a-node-pullsecrets\") pod \"apiserver-76f77b778f-ffb4r\" (UID: \"46bb090b-9216-49d5-91d7-43cf3ee3bf4a\") " pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.940309 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d131fd77-f36a-4c9a-8578-5b2c62e5d356-console-serving-cert\") pod \"console-f9d7485db-tjt6x\" (UID: \"d131fd77-f36a-4c9a-8578-5b2c62e5d356\") " pod="openshift-console/console-f9d7485db-tjt6x" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.940334 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/46bb090b-9216-49d5-91d7-43cf3ee3bf4a-audit\") pod \"apiserver-76f77b778f-ffb4r\" (UID: \"46bb090b-9216-49d5-91d7-43cf3ee3bf4a\") " pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.940360 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1c4b26c-f83a-45c7-92e9-da974b729b51-config\") pod \"openshift-apiserver-operator-796bbdcf4f-hwh9f\" 
(UID: \"e1c4b26c-f83a-45c7-92e9-da974b729b51\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwh9f" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.940391 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a8d2e09c-a22c-4581-830c-ad25ff946f4a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kjsgz\" (UID: \"a8d2e09c-a22c-4581-830c-ad25ff946f4a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kjsgz" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.940398 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/46bb090b-9216-49d5-91d7-43cf3ee3bf4a-node-pullsecrets\") pod \"apiserver-76f77b778f-ffb4r\" (UID: \"46bb090b-9216-49d5-91d7-43cf3ee3bf4a\") " pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.940422 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/245513eb-a40a-4eff-80ed-c7070eb94f8a-service-ca-bundle\") pod \"authentication-operator-69f744f599-s6vbc\" (UID: \"245513eb-a40a-4eff-80ed-c7070eb94f8a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s6vbc" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.940464 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1c4b26c-f83a-45c7-92e9-da974b729b51-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-hwh9f\" (UID: \"e1c4b26c-f83a-45c7-92e9-da974b729b51\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwh9f" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.940488 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e426530-7180-403a-9810-6612e19b1110-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-l7hl6\" (UID: \"2e426530-7180-403a-9810-6612e19b1110\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l7hl6" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.940516 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/245513eb-a40a-4eff-80ed-c7070eb94f8a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-s6vbc\" (UID: \"245513eb-a40a-4eff-80ed-c7070eb94f8a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s6vbc" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.940543 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d131fd77-f36a-4c9a-8578-5b2c62e5d356-service-ca\") pod \"console-f9d7485db-tjt6x\" (UID: \"d131fd77-f36a-4c9a-8578-5b2c62e5d356\") " pod="openshift-console/console-f9d7485db-tjt6x" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.940565 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d131fd77-f36a-4c9a-8578-5b2c62e5d356-oauth-serving-cert\") pod \"console-f9d7485db-tjt6x\" (UID: \"d131fd77-f36a-4c9a-8578-5b2c62e5d356\") " 
pod="openshift-console/console-f9d7485db-tjt6x" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.940590 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/46bb090b-9216-49d5-91d7-43cf3ee3bf4a-encryption-config\") pod \"apiserver-76f77b778f-ffb4r\" (UID: \"46bb090b-9216-49d5-91d7-43cf3ee3bf4a\") " pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.940614 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdk7t\" (UniqueName: \"kubernetes.io/projected/245513eb-a40a-4eff-80ed-c7070eb94f8a-kube-api-access-zdk7t\") pod \"authentication-operator-69f744f599-s6vbc\" (UID: \"245513eb-a40a-4eff-80ed-c7070eb94f8a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s6vbc" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.940641 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d2d0b975-a970-41c7-a703-d2771bc3fcc8-srv-cert\") pod \"catalog-operator-68c6474976-flrnk\" (UID: \"d2d0b975-a970-41c7-a703-d2771bc3fcc8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-flrnk" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.940682 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/46bb090b-9216-49d5-91d7-43cf3ee3bf4a-etcd-serving-ca\") pod \"apiserver-76f77b778f-ffb4r\" (UID: \"46bb090b-9216-49d5-91d7-43cf3ee3bf4a\") " pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.940706 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745-service-ca-bundle\") pod \"router-default-5444994796-pkxw5\" (UID: \"d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745\") " pod="openshift-ingress/router-default-5444994796-pkxw5" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.940739 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbh9c\" (UniqueName: \"kubernetes.io/projected/46bb090b-9216-49d5-91d7-43cf3ee3bf4a-kube-api-access-hbh9c\") pod \"apiserver-76f77b778f-ffb4r\" (UID: \"46bb090b-9216-49d5-91d7-43cf3ee3bf4a\") " pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.940762 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d131fd77-f36a-4c9a-8578-5b2c62e5d356-trusted-ca-bundle\") pod \"console-f9d7485db-tjt6x\" (UID: \"d131fd77-f36a-4c9a-8578-5b2c62e5d356\") " pod="openshift-console/console-f9d7485db-tjt6x" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.940829 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d131fd77-f36a-4c9a-8578-5b2c62e5d356-console-oauth-config\") pod \"console-f9d7485db-tjt6x\" (UID: \"d131fd77-f36a-4c9a-8578-5b2c62e5d356\") " pod="openshift-console/console-f9d7485db-tjt6x" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.941383 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/46bb090b-9216-49d5-91d7-43cf3ee3bf4a-audit\") pod \"apiserver-76f77b778f-ffb4r\" (UID: \"46bb090b-9216-49d5-91d7-43cf3ee3bf4a\") " pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.941580 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1c4b26c-f83a-45c7-92e9-da974b729b51-config\") pod \"openshift-apiserver-operator-796bbdcf4f-hwh9f\" (UID: \"e1c4b26c-f83a-45c7-92e9-da974b729b51\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwh9f" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.941640 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745-stats-auth\") pod \"router-default-5444994796-pkxw5\" (UID: \"d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745\") " pod="openshift-ingress/router-default-5444994796-pkxw5" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.941768 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46bb090b-9216-49d5-91d7-43cf3ee3bf4a-config\") pod \"apiserver-76f77b778f-ffb4r\" (UID: \"46bb090b-9216-49d5-91d7-43cf3ee3bf4a\") " pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.942207 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/245513eb-a40a-4eff-80ed-c7070eb94f8a-service-ca-bundle\") pod \"authentication-operator-69f744f599-s6vbc\" (UID: \"245513eb-a40a-4eff-80ed-c7070eb94f8a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s6vbc" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.942402 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/245513eb-a40a-4eff-80ed-c7070eb94f8a-config\") pod \"authentication-operator-69f744f599-s6vbc\" (UID: \"245513eb-a40a-4eff-80ed-c7070eb94f8a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s6vbc" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.942434 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46bb090b-9216-49d5-91d7-43cf3ee3bf4a-config\") pod \"apiserver-76f77b778f-ffb4r\" (UID: \"46bb090b-9216-49d5-91d7-43cf3ee3bf4a\") " pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.942444 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/245513eb-a40a-4eff-80ed-c7070eb94f8a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-s6vbc\" (UID: \"245513eb-a40a-4eff-80ed-c7070eb94f8a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s6vbc" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.942446 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d131fd77-f36a-4c9a-8578-5b2c62e5d356-console-config\") pod \"console-f9d7485db-tjt6x\" (UID: \"d131fd77-f36a-4c9a-8578-5b2c62e5d356\") " pod="openshift-console/console-f9d7485db-tjt6x" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.942510 4903 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltsts\" (UniqueName: \"kubernetes.io/projected/a8d2e09c-a22c-4581-830c-ad25ff946f4a-kube-api-access-ltsts\") pod \"cluster-samples-operator-665b6dd947-kjsgz\" (UID: \"a8d2e09c-a22c-4581-830c-ad25ff946f4a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kjsgz" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.942677 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f2fec0cf-8c59-4ec1-ae18-91b0081c60bb-available-featuregates\") pod \"openshift-config-operator-7777fb866f-jr8l4\" (UID: \"f2fec0cf-8c59-4ec1-ae18-91b0081c60bb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jr8l4" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.942785 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k8bv\" (UniqueName: \"kubernetes.io/projected/e1c4b26c-f83a-45c7-92e9-da974b729b51-kube-api-access-7k8bv\") pod \"openshift-apiserver-operator-796bbdcf4f-hwh9f\" (UID: \"e1c4b26c-f83a-45c7-92e9-da974b729b51\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwh9f" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.942866 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46bb090b-9216-49d5-91d7-43cf3ee3bf4a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ffb4r\" (UID: \"46bb090b-9216-49d5-91d7-43cf3ee3bf4a\") " pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.942902 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d2d0b975-a970-41c7-a703-d2771bc3fcc8-profile-collector-cert\") pod \"catalog-operator-68c6474976-flrnk\" (UID: \"d2d0b975-a970-41c7-a703-d2771bc3fcc8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-flrnk" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.942930 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46bb090b-9216-49d5-91d7-43cf3ee3bf4a-serving-cert\") pod \"apiserver-76f77b778f-ffb4r\" (UID: \"46bb090b-9216-49d5-91d7-43cf3ee3bf4a\") " pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.942947 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/46bb090b-9216-49d5-91d7-43cf3ee3bf4a-audit-dir\") pod \"apiserver-76f77b778f-ffb4r\" (UID: \"46bb090b-9216-49d5-91d7-43cf3ee3bf4a\") " pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.942964 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnvqt\" (UniqueName: \"kubernetes.io/projected/d2d0b975-a970-41c7-a703-d2771bc3fcc8-kube-api-access-dnvqt\") pod \"catalog-operator-68c6474976-flrnk\" (UID: \"d2d0b975-a970-41c7-a703-d2771bc3fcc8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-flrnk" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.942992 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tbxg\" 
(UniqueName: \"kubernetes.io/projected/e2656003-d0f9-4d65-8744-1e394226a359-kube-api-access-5tbxg\") pod \"cluster-image-registry-operator-dc59b4c8b-wvzlq\" (UID: \"e2656003-d0f9-4d65-8744-1e394226a359\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvzlq" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.943011 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f8sn\" (UniqueName: \"kubernetes.io/projected/d131fd77-f36a-4c9a-8578-5b2c62e5d356-kube-api-access-2f8sn\") pod \"console-f9d7485db-tjt6x\" (UID: \"d131fd77-f36a-4c9a-8578-5b2c62e5d356\") " pod="openshift-console/console-f9d7485db-tjt6x" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.943035 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/46bb090b-9216-49d5-91d7-43cf3ee3bf4a-image-import-ca\") pod \"apiserver-76f77b778f-ffb4r\" (UID: \"46bb090b-9216-49d5-91d7-43cf3ee3bf4a\") " pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.943053 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745-default-certificate\") pod \"router-default-5444994796-pkxw5\" (UID: \"d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745\") " pod="openshift-ingress/router-default-5444994796-pkxw5" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.943069 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745-metrics-certs\") pod \"router-default-5444994796-pkxw5\" (UID: \"d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745\") " pod="openshift-ingress/router-default-5444994796-pkxw5" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.943084 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c964b5fc-f02f-481c-abd3-2f567b9e98e6-metrics-tls\") pod \"dns-operator-744455d44c-wdkzh\" (UID: \"c964b5fc-f02f-481c-abd3-2f567b9e98e6\") " pod="openshift-dns-operator/dns-operator-744455d44c-wdkzh" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.943108 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f2fec0cf-8c59-4ec1-ae18-91b0081c60bb-available-featuregates\") pod \"openshift-config-operator-7777fb866f-jr8l4\" (UID: \"f2fec0cf-8c59-4ec1-ae18-91b0081c60bb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jr8l4" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.943165 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/46bb090b-9216-49d5-91d7-43cf3ee3bf4a-audit-dir\") pod \"apiserver-76f77b778f-ffb4r\" (UID: \"46bb090b-9216-49d5-91d7-43cf3ee3bf4a\") " pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.943200 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/245513eb-a40a-4eff-80ed-c7070eb94f8a-config\") pod \"authentication-operator-69f744f599-s6vbc\" (UID: \"245513eb-a40a-4eff-80ed-c7070eb94f8a\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-s6vbc" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.943697 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2fec0cf-8c59-4ec1-ae18-91b0081c60bb-serving-cert\") pod \"openshift-config-operator-7777fb866f-jr8l4\" (UID: \"f2fec0cf-8c59-4ec1-ae18-91b0081c60bb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jr8l4" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.943975 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/46bb090b-9216-49d5-91d7-43cf3ee3bf4a-image-import-ca\") pod \"apiserver-76f77b778f-ffb4r\" (UID: \"46bb090b-9216-49d5-91d7-43cf3ee3bf4a\") " pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.944403 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/46bb090b-9216-49d5-91d7-43cf3ee3bf4a-etcd-serving-ca\") pod \"apiserver-76f77b778f-ffb4r\" (UID: \"46bb090b-9216-49d5-91d7-43cf3ee3bf4a\") " pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.946040 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/46bb090b-9216-49d5-91d7-43cf3ee3bf4a-encryption-config\") pod \"apiserver-76f77b778f-ffb4r\" (UID: \"46bb090b-9216-49d5-91d7-43cf3ee3bf4a\") " pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.946543 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/245513eb-a40a-4eff-80ed-c7070eb94f8a-serving-cert\") pod \"authentication-operator-69f744f599-s6vbc\" (UID: \"245513eb-a40a-4eff-80ed-c7070eb94f8a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s6vbc" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.946634 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1c4b26c-f83a-45c7-92e9-da974b729b51-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-hwh9f\" (UID: \"e1c4b26c-f83a-45c7-92e9-da974b729b51\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwh9f" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.946769 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a8d2e09c-a22c-4581-830c-ad25ff946f4a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kjsgz\" (UID: \"a8d2e09c-a22c-4581-830c-ad25ff946f4a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kjsgz" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.947438 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46bb090b-9216-49d5-91d7-43cf3ee3bf4a-serving-cert\") pod \"apiserver-76f77b778f-ffb4r\" (UID: \"46bb090b-9216-49d5-91d7-43cf3ee3bf4a\") " pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.948049 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/46bb090b-9216-49d5-91d7-43cf3ee3bf4a-etcd-client\") pod \"apiserver-76f77b778f-ffb4r\" (UID: \"46bb090b-9216-49d5-91d7-43cf3ee3bf4a\") " pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.948251 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46bb090b-9216-49d5-91d7-43cf3ee3bf4a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ffb4r\" (UID: \"46bb090b-9216-49d5-91d7-43cf3ee3bf4a\") " pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.955212 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.974442 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 02 22:59:57 crc kubenswrapper[4903]: I1202 22:59:57.995125 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.015066 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.035619 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.044337 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e426530-7180-403a-9810-6612e19b1110-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-l7hl6\" (UID: \"2e426530-7180-403a-9810-6612e19b1110\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l7hl6" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.044388 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d131fd77-f36a-4c9a-8578-5b2c62e5d356-service-ca\") pod \"console-f9d7485db-tjt6x\" (UID: \"d131fd77-f36a-4c9a-8578-5b2c62e5d356\") " pod="openshift-console/console-f9d7485db-tjt6x" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.044413 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d131fd77-f36a-4c9a-8578-5b2c62e5d356-oauth-serving-cert\") pod \"console-f9d7485db-tjt6x\" (UID: \"d131fd77-f36a-4c9a-8578-5b2c62e5d356\") " pod="openshift-console/console-f9d7485db-tjt6x" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.044447 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d2d0b975-a970-41c7-a703-d2771bc3fcc8-srv-cert\") pod \"catalog-operator-68c6474976-flrnk\" (UID: \"d2d0b975-a970-41c7-a703-d2771bc3fcc8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-flrnk" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.044469 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745-service-ca-bundle\") pod \"router-default-5444994796-pkxw5\" (UID: \"d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745\") " 
pod="openshift-ingress/router-default-5444994796-pkxw5" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.044494 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d131fd77-f36a-4c9a-8578-5b2c62e5d356-trusted-ca-bundle\") pod \"console-f9d7485db-tjt6x\" (UID: \"d131fd77-f36a-4c9a-8578-5b2c62e5d356\") " pod="openshift-console/console-f9d7485db-tjt6x" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.044523 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d131fd77-f36a-4c9a-8578-5b2c62e5d356-console-oauth-config\") pod \"console-f9d7485db-tjt6x\" (UID: \"d131fd77-f36a-4c9a-8578-5b2c62e5d356\") " pod="openshift-console/console-f9d7485db-tjt6x" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.044547 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745-stats-auth\") pod \"router-default-5444994796-pkxw5\" (UID: \"d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745\") " pod="openshift-ingress/router-default-5444994796-pkxw5" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.044582 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d131fd77-f36a-4c9a-8578-5b2c62e5d356-console-config\") pod \"console-f9d7485db-tjt6x\" (UID: \"d131fd77-f36a-4c9a-8578-5b2c62e5d356\") " pod="openshift-console/console-f9d7485db-tjt6x" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.044634 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d2d0b975-a970-41c7-a703-d2771bc3fcc8-profile-collector-cert\") pod \"catalog-operator-68c6474976-flrnk\" (UID: \"d2d0b975-a970-41c7-a703-d2771bc3fcc8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-flrnk" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.044701 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnvqt\" (UniqueName: \"kubernetes.io/projected/d2d0b975-a970-41c7-a703-d2771bc3fcc8-kube-api-access-dnvqt\") pod \"catalog-operator-68c6474976-flrnk\" (UID: \"d2d0b975-a970-41c7-a703-d2771bc3fcc8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-flrnk" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.044732 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f8sn\" (UniqueName: \"kubernetes.io/projected/d131fd77-f36a-4c9a-8578-5b2c62e5d356-kube-api-access-2f8sn\") pod \"console-f9d7485db-tjt6x\" (UID: \"d131fd77-f36a-4c9a-8578-5b2c62e5d356\") " pod="openshift-console/console-f9d7485db-tjt6x" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.044753 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745-default-certificate\") pod \"router-default-5444994796-pkxw5\" (UID: \"d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745\") " pod="openshift-ingress/router-default-5444994796-pkxw5" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.044772 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745-metrics-certs\") pod \"router-default-5444994796-pkxw5\" (UID: \"d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745\") " pod="openshift-ingress/router-default-5444994796-pkxw5" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.044789 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c964b5fc-f02f-481c-abd3-2f567b9e98e6-metrics-tls\") pod \"dns-operator-744455d44c-wdkzh\" (UID: \"c964b5fc-f02f-481c-abd3-2f567b9e98e6\") " pod="openshift-dns-operator/dns-operator-744455d44c-wdkzh" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.044826 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e426530-7180-403a-9810-6612e19b1110-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-l7hl6\" (UID: \"2e426530-7180-403a-9810-6612e19b1110\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l7hl6" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.044871 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7nvk\" (UniqueName: \"kubernetes.io/projected/c964b5fc-f02f-481c-abd3-2f567b9e98e6-kube-api-access-d7nvk\") pod \"dns-operator-744455d44c-wdkzh\" (UID: \"c964b5fc-f02f-481c-abd3-2f567b9e98e6\") " pod="openshift-dns-operator/dns-operator-744455d44c-wdkzh" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.044898 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nqjz\" (UniqueName: \"kubernetes.io/projected/2e426530-7180-403a-9810-6612e19b1110-kube-api-access-2nqjz\") pod \"openshift-controller-manager-operator-756b6f6bc6-l7hl6\" (UID: \"2e426530-7180-403a-9810-6612e19b1110\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l7hl6" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.044918 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5j2f\" (UniqueName: \"kubernetes.io/projected/d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745-kube-api-access-r5j2f\") pod \"router-default-5444994796-pkxw5\" (UID: \"d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745\") " pod="openshift-ingress/router-default-5444994796-pkxw5" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.044958 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d131fd77-f36a-4c9a-8578-5b2c62e5d356-console-serving-cert\") pod \"console-f9d7485db-tjt6x\" (UID: \"d131fd77-f36a-4c9a-8578-5b2c62e5d356\") " pod="openshift-console/console-f9d7485db-tjt6x" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.046268 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d131fd77-f36a-4c9a-8578-5b2c62e5d356-service-ca\") pod \"console-f9d7485db-tjt6x\" (UID: \"d131fd77-f36a-4c9a-8578-5b2c62e5d356\") " pod="openshift-console/console-f9d7485db-tjt6x" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.046600 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745-service-ca-bundle\") pod \"router-default-5444994796-pkxw5\" (UID: \"d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745\") " 
pod="openshift-ingress/router-default-5444994796-pkxw5" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.047500 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d131fd77-f36a-4c9a-8578-5b2c62e5d356-oauth-serving-cert\") pod \"console-f9d7485db-tjt6x\" (UID: \"d131fd77-f36a-4c9a-8578-5b2c62e5d356\") " pod="openshift-console/console-f9d7485db-tjt6x" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.048626 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d131fd77-f36a-4c9a-8578-5b2c62e5d356-trusted-ca-bundle\") pod \"console-f9d7485db-tjt6x\" (UID: \"d131fd77-f36a-4c9a-8578-5b2c62e5d356\") " pod="openshift-console/console-f9d7485db-tjt6x" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.049028 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d131fd77-f36a-4c9a-8578-5b2c62e5d356-console-config\") pod \"console-f9d7485db-tjt6x\" (UID: \"d131fd77-f36a-4c9a-8578-5b2c62e5d356\") " pod="openshift-console/console-f9d7485db-tjt6x" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.049248 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d131fd77-f36a-4c9a-8578-5b2c62e5d356-console-serving-cert\") pod \"console-f9d7485db-tjt6x\" (UID: \"d131fd77-f36a-4c9a-8578-5b2c62e5d356\") " pod="openshift-console/console-f9d7485db-tjt6x" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.049926 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d2d0b975-a970-41c7-a703-d2771bc3fcc8-profile-collector-cert\") pod \"catalog-operator-68c6474976-flrnk\" (UID: \"d2d0b975-a970-41c7-a703-d2771bc3fcc8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-flrnk" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.050922 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745-default-certificate\") pod \"router-default-5444994796-pkxw5\" (UID: \"d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745\") " pod="openshift-ingress/router-default-5444994796-pkxw5" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.051182 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745-metrics-certs\") pod \"router-default-5444994796-pkxw5\" (UID: \"d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745\") " pod="openshift-ingress/router-default-5444994796-pkxw5" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.051439 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d131fd77-f36a-4c9a-8578-5b2c62e5d356-console-oauth-config\") pod \"console-f9d7485db-tjt6x\" (UID: \"d131fd77-f36a-4c9a-8578-5b2c62e5d356\") " pod="openshift-console/console-f9d7485db-tjt6x" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.051441 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c964b5fc-f02f-481c-abd3-2f567b9e98e6-metrics-tls\") pod \"dns-operator-744455d44c-wdkzh\" (UID: \"c964b5fc-f02f-481c-abd3-2f567b9e98e6\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-wdkzh" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.052018 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745-stats-auth\") pod \"router-default-5444994796-pkxw5\" (UID: \"d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745\") " pod="openshift-ingress/router-default-5444994796-pkxw5" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.052672 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d2d0b975-a970-41c7-a703-d2771bc3fcc8-srv-cert\") pod \"catalog-operator-68c6474976-flrnk\" (UID: \"d2d0b975-a970-41c7-a703-d2771bc3fcc8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-flrnk" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.054960 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.075511 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.094724 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.114431 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.135196 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.156116 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.175358 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.195432 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.214913 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.235227 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.255339 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.275563 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.295136 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.316528 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.335934 4903 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.355228 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.375021 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.395906 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.415094 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.435123 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.438239 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e426530-7180-403a-9810-6612e19b1110-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-l7hl6\" (UID: \"2e426530-7180-403a-9810-6612e19b1110\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l7hl6" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.455176 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.475500 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.480779 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e426530-7180-403a-9810-6612e19b1110-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-l7hl6\" (UID: \"2e426530-7180-403a-9810-6612e19b1110\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l7hl6" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.496253 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.514901 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.534813 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.555349 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.576483 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.605833 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 
02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.616052 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.635313 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.655380 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.676554 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.696067 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.733053 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tkpl\" (UniqueName: \"kubernetes.io/projected/2c5439fc-734f-4efa-838f-68900d9453ec-kube-api-access-7tkpl\") pod \"controller-manager-879f6c89f-9mlb7\" (UID: \"2c5439fc-734f-4efa-838f-68900d9453ec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9mlb7" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.750851 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g5q5\" (UniqueName: \"kubernetes.io/projected/929c87de-7dc4-40bf-9fe9-85229a13dca1-kube-api-access-8g5q5\") pod \"console-operator-58897d9998-kvzf7\" (UID: \"929c87de-7dc4-40bf-9fe9-85229a13dca1\") " pod="openshift-console-operator/console-operator-58897d9998-kvzf7" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.776669 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2656003-d0f9-4d65-8744-1e394226a359-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-wvzlq\" (UID: \"e2656003-d0f9-4d65-8744-1e394226a359\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvzlq" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.790641 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9mt4\" (UniqueName: \"kubernetes.io/projected/da60db71-3425-41be-8c51-99e7326f559f-kube-api-access-d9mt4\") pod \"machine-approver-56656f9798-m4vjn\" (UID: \"da60db71-3425-41be-8c51-99e7326f559f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m4vjn" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.815348 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wds8q\" (UniqueName: \"kubernetes.io/projected/7c7544e0-8c46-4f13-b8e4-a8aa2071a9f5-kube-api-access-wds8q\") pod \"downloads-7954f5f757-zzfgm\" (UID: \"7c7544e0-8c46-4f13-b8e4-a8aa2071a9f5\") " pod="openshift-console/downloads-7954f5f757-zzfgm" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.815427 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-kvzf7" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.823128 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-zzfgm" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.852466 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j672h\" (UniqueName: \"kubernetes.io/projected/ec790a4a-c562-4035-ba10-9ac0c8baf6c6-kube-api-access-j672h\") pod \"machine-api-operator-5694c8668f-lvnkb\" (UID: \"ec790a4a-c562-4035-ba10-9ac0c8baf6c6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lvnkb" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.853163 4903 request.go:700] Waited for 1.007515307s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/serviceaccounts/route-controller-manager-sa/token Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.859011 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p7r9\" (UniqueName: \"kubernetes.io/projected/8309b769-d214-4620-a891-ae8ae36630cd-kube-api-access-9p7r9\") pod \"apiserver-7bbb656c7d-lsq8g\" (UID: \"8309b769-d214-4620-a891-ae8ae36630cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsq8g" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.876347 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.883458 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skx5p\" (UniqueName: \"kubernetes.io/projected/896cc150-3871-46e2-b1f5-c31c25c54014-kube-api-access-skx5p\") pod \"route-controller-manager-6576b87f9c-pnzr7\" (UID: \"896cc150-3871-46e2-b1f5-c31c25c54014\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pnzr7" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.896445 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.916386 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.956541 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.961426 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9mlb7" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.976132 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 02 22:59:58 crc kubenswrapper[4903]: I1202 22:59:58.997028 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.009926 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsq8g" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.014990 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.033984 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-lvnkb" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.035007 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pnzr7" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.036770 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.042583 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m4vjn" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.055482 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.077861 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.082702 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-kvzf7"] Dec 02 22:59:59 crc kubenswrapper[4903]: W1202 22:59:59.092926 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod929c87de_7dc4_40bf_9fe9_85229a13dca1.slice/crio-9877364d4cb6fd96272038d9281057f3a953ad9d0133c709fba963ff322d92d6 WatchSource:0}: Error finding container 9877364d4cb6fd96272038d9281057f3a953ad9d0133c709fba963ff322d92d6: Status 404 returned error can't find the container with id 9877364d4cb6fd96272038d9281057f3a953ad9d0133c709fba963ff322d92d6 Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.095000 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.104818 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-zzfgm"] Dec 02 22:59:59 crc kubenswrapper[4903]: W1202 22:59:59.128942 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c7544e0_8c46_4f13_b8e4_a8aa2071a9f5.slice/crio-2f9e67eaee5a9854ee5798b6769f27caf7f8488a8b7e73186e75a165fa813d7c WatchSource:0}: Error finding container 2f9e67eaee5a9854ee5798b6769f27caf7f8488a8b7e73186e75a165fa813d7c: Status 404 returned error can't find the container with id 2f9e67eaee5a9854ee5798b6769f27caf7f8488a8b7e73186e75a165fa813d7c Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.137515 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.144482 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.158107 
4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.174124 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.182836 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9mlb7"] Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.209337 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.217898 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.241265 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.255908 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.274341 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.297990 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.315365 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.335069 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.335495 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pnzr7"] Dec 02 22:59:59 crc kubenswrapper[4903]: W1202 22:59:59.341755 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod896cc150_3871_46e2_b1f5_c31c25c54014.slice/crio-91af0c717c1c25bb2e045696d2cb66627ca939c648aee25c3fc54bb502a39644 WatchSource:0}: Error finding container 91af0c717c1c25bb2e045696d2cb66627ca939c648aee25c3fc54bb502a39644: Status 404 returned error can't find the container with id 91af0c717c1c25bb2e045696d2cb66627ca939c648aee25c3fc54bb502a39644 Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.354448 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.374183 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.395142 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.414970 4903 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.435275 4903 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.455575 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.476805 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.498735 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.509196 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lsq8g"] Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.514948 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lvnkb"] Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.516576 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.535907 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.555411 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.575068 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.595260 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.615526 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.635925 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.654694 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.676854 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.716189 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.735581 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.755787 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 02 22:59:59 
crc kubenswrapper[4903]: I1202 22:59:59.785176 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.795408 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.816984 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.834565 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.855020 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.873289 4903 request.go:700] Waited for 1.982045308s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/secrets?fieldSelector=metadata.name%3Dcanary-serving-cert&limit=500&resourceVersion=0 Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.875577 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.908813 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.915413 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.934787 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.955029 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pnzr7" event={"ID":"896cc150-3871-46e2-b1f5-c31c25c54014","Type":"ContainerStarted","Data":"91af0c717c1c25bb2e045696d2cb66627ca939c648aee25c3fc54bb502a39644"} Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.955613 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.957681 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9mlb7" event={"ID":"2c5439fc-734f-4efa-838f-68900d9453ec","Type":"ContainerStarted","Data":"d1590f5d08d5021f7c4653ff6d53411599ec2a40aa5f6db4e0019af55c730e6e"} Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.959271 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-zzfgm" event={"ID":"7c7544e0-8c46-4f13-b8e4-a8aa2071a9f5","Type":"ContainerStarted","Data":"2f9e67eaee5a9854ee5798b6769f27caf7f8488a8b7e73186e75a165fa813d7c"} Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.961159 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-kvzf7" event={"ID":"929c87de-7dc4-40bf-9fe9-85229a13dca1","Type":"ContainerStarted","Data":"9877364d4cb6fd96272038d9281057f3a953ad9d0133c709fba963ff322d92d6"} Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.962918 4903 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m4vjn" event={"ID":"da60db71-3425-41be-8c51-99e7326f559f","Type":"ContainerStarted","Data":"78fdd8047f06a65d049e8b1c39a43cf3a62489515a7a34c4734292c13d72901a"} Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.974768 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 02 22:59:59 crc kubenswrapper[4903]: I1202 22:59:59.996790 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 02 23:00:00 crc kubenswrapper[4903]: I1202 23:00:00.047463 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2ljq\" (UniqueName: \"kubernetes.io/projected/f2fec0cf-8c59-4ec1-ae18-91b0081c60bb-kube-api-access-t2ljq\") pod \"openshift-config-operator-7777fb866f-jr8l4\" (UID: \"f2fec0cf-8c59-4ec1-ae18-91b0081c60bb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jr8l4" Dec 02 23:00:00 crc kubenswrapper[4903]: I1202 23:00:00.066195 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdk7t\" (UniqueName: \"kubernetes.io/projected/245513eb-a40a-4eff-80ed-c7070eb94f8a-kube-api-access-zdk7t\") pod \"authentication-operator-69f744f599-s6vbc\" (UID: \"245513eb-a40a-4eff-80ed-c7070eb94f8a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s6vbc" Dec 02 23:00:00 crc kubenswrapper[4903]: I1202 23:00:00.083943 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbh9c\" (UniqueName: \"kubernetes.io/projected/46bb090b-9216-49d5-91d7-43cf3ee3bf4a-kube-api-access-hbh9c\") pod \"apiserver-76f77b778f-ffb4r\" (UID: \"46bb090b-9216-49d5-91d7-43cf3ee3bf4a\") " pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 23:00:00 crc kubenswrapper[4903]: I1202 23:00:00.103216 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltsts\" (UniqueName: \"kubernetes.io/projected/a8d2e09c-a22c-4581-830c-ad25ff946f4a-kube-api-access-ltsts\") pod \"cluster-samples-operator-665b6dd947-kjsgz\" (UID: \"a8d2e09c-a22c-4581-830c-ad25ff946f4a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kjsgz" Dec 02 23:00:00 crc kubenswrapper[4903]: I1202 23:00:00.127911 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k8bv\" (UniqueName: \"kubernetes.io/projected/e1c4b26c-f83a-45c7-92e9-da974b729b51-kube-api-access-7k8bv\") pod \"openshift-apiserver-operator-796bbdcf4f-hwh9f\" (UID: \"e1c4b26c-f83a-45c7-92e9-da974b729b51\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwh9f" Dec 02 23:00:00 crc kubenswrapper[4903]: I1202 23:00:00.139607 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411925-h29nj"] Dec 02 23:00:00 crc kubenswrapper[4903]: E1202 23:00:00.140122 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config-volume kube-api-access-hv6c8 secret-volume], unattached volumes=[], failed to process volumes=[config-volume kube-api-access-hv6c8 secret-volume]: context canceled" pod="openshift-operator-lifecycle-manager/collect-profiles-29411925-h29nj" podUID="9a385a5a-8a80-4d4d-8d00-b2543dfaf3fc" Dec 02 23:00:00 crc kubenswrapper[4903]: I1202 23:00:00.151071 4903 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tbxg\" (UniqueName: \"kubernetes.io/projected/e2656003-d0f9-4d65-8744-1e394226a359-kube-api-access-5tbxg\") pod \"cluster-image-registry-operator-dc59b4c8b-wvzlq\" (UID: \"e2656003-d0f9-4d65-8744-1e394226a359\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvzlq" Dec 02 23:00:00 crc kubenswrapper[4903]: I1202 23:00:00.161926 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411940-vlzpv"] Dec 02 23:00:00 crc kubenswrapper[4903]: I1202 23:00:00.163050 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411940-vlzpv" Dec 02 23:00:00 crc kubenswrapper[4903]: I1202 23:00:00.168365 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f8sn\" (UniqueName: \"kubernetes.io/projected/d131fd77-f36a-4c9a-8578-5b2c62e5d356-kube-api-access-2f8sn\") pod \"console-f9d7485db-tjt6x\" (UID: \"d131fd77-f36a-4c9a-8578-5b2c62e5d356\") " pod="openshift-console/console-f9d7485db-tjt6x" Dec 02 23:00:00 crc kubenswrapper[4903]: I1202 23:00:00.187404 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnvqt\" (UniqueName: \"kubernetes.io/projected/d2d0b975-a970-41c7-a703-d2771bc3fcc8-kube-api-access-dnvqt\") pod \"catalog-operator-68c6474976-flrnk\" (UID: \"d2d0b975-a970-41c7-a703-d2771bc3fcc8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-flrnk" Dec 02 23:00:00 crc kubenswrapper[4903]: I1202 23:00:00.188815 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 23:00:00 crc kubenswrapper[4903]: I1202 23:00:00.196243 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7nvk\" (UniqueName: \"kubernetes.io/projected/c964b5fc-f02f-481c-abd3-2f567b9e98e6-kube-api-access-d7nvk\") pod \"dns-operator-744455d44c-wdkzh\" (UID: \"c964b5fc-f02f-481c-abd3-2f567b9e98e6\") " pod="openshift-dns-operator/dns-operator-744455d44c-wdkzh" Dec 02 23:00:00 crc kubenswrapper[4903]: I1202 23:00:00.250524 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411940-vlzpv"] Dec 02 23:00:00 crc kubenswrapper[4903]: I1202 23:00:00.251799 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nqjz\" (UniqueName: \"kubernetes.io/projected/2e426530-7180-403a-9810-6612e19b1110-kube-api-access-2nqjz\") pod \"openshift-controller-manager-operator-756b6f6bc6-l7hl6\" (UID: \"2e426530-7180-403a-9810-6612e19b1110\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l7hl6" Dec 02 23:00:00 crc kubenswrapper[4903]: I1202 23:00:00.264449 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5j2f\" (UniqueName: \"kubernetes.io/projected/d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745-kube-api-access-r5j2f\") pod \"router-default-5444994796-pkxw5\" (UID: \"d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745\") " pod="openshift-ingress/router-default-5444994796-pkxw5" Dec 02 23:00:00 crc kubenswrapper[4903]: I1202 23:00:00.968495 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411925-h29nj" Dec 02 23:00:00 crc kubenswrapper[4903]: I1202 23:00:00.981636 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411925-h29nj" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.492350 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kjsgz" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.492537 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-tjt6x" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.492675 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-wdkzh" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.492754 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwh9f" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.493199 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jr8l4" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.493427 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-flrnk" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.492536 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvzlq" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.492877 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-pkxw5" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.494317 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l7hl6" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.494398 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-s6vbc" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.499196 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4652215a-081a-4b64-aa40-1508e18e8a15-bound-sa-token\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.499245 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.499296 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4652215a-081a-4b64-aa40-1508e18e8a15-registry-tls\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.499405 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fv9f\" (UniqueName: \"kubernetes.io/projected/4652215a-081a-4b64-aa40-1508e18e8a15-kube-api-access-7fv9f\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.499771 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4652215a-081a-4b64-aa40-1508e18e8a15-registry-certificates\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:01 crc kubenswrapper[4903]: E1202 23:00:01.499868 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:01.999855483 +0000 UTC m=+140.708409766 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.500061 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4652215a-081a-4b64-aa40-1508e18e8a15-ca-trust-extracted\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.500086 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4652215a-081a-4b64-aa40-1508e18e8a15-trusted-ca\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.500172 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4652215a-081a-4b64-aa40-1508e18e8a15-installation-pull-secrets\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:01 crc kubenswrapper[4903]: W1202 23:00:01.502566 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8309b769_d214_4620_a891_ae8ae36630cd.slice/crio-1f705248195e244c110e0bdb35c26440b3191b6a2ad2a5086c13c7a087616f2f WatchSource:0}: Error finding container 1f705248195e244c110e0bdb35c26440b3191b6a2ad2a5086c13c7a087616f2f: Status 404 returned error can't find the container with id 1f705248195e244c110e0bdb35c26440b3191b6a2ad2a5086c13c7a087616f2f Dec 02 23:00:01 crc kubenswrapper[4903]: W1202 23:00:01.517673 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec790a4a_c562_4035_ba10_9ac0c8baf6c6.slice/crio-c08cac14a47eec24a1abc1822a6b5a7429896ab4d2297d9b8573661ffd39908e WatchSource:0}: Error finding container c08cac14a47eec24a1abc1822a6b5a7429896ab4d2297d9b8573661ffd39908e: Status 404 returned error can't find the container with id c08cac14a47eec24a1abc1822a6b5a7429896ab4d2297d9b8573661ffd39908e Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.601323 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 23:00:01 crc kubenswrapper[4903]: E1202 23:00:01.601548 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-02 23:00:02.101514262 +0000 UTC m=+140.810068555 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.601857 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.601893 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d7c261e-3e52-4595-be2a-23f9f79fa197-config\") pod \"kube-controller-manager-operator-78b949d7b-blqhv\" (UID: \"4d7c261e-3e52-4595-be2a-23f9f79fa197\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-blqhv" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.601913 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d6c199ad-3068-41c9-b8df-8f6b889a8db8-apiservice-cert\") pod \"packageserver-d55dfcdfc-tzc84\" (UID: \"d6c199ad-3068-41c9-b8df-8f6b889a8db8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzc84" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.601930 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c0a08a9-3732-4ba8-b716-3d3e3781e2ca-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f9gdf\" (UID: \"1c0a08a9-3732-4ba8-b716-3d3e3781e2ca\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f9gdf" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.601948 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c3af2ceb-1e23-4887-b647-06b9e0466f1a-proxy-tls\") pod \"machine-config-controller-84d6567774-tbhzh\" (UID: \"c3af2ceb-1e23-4887-b647-06b9e0466f1a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tbhzh" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.601967 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6375f12-a03e-438e-96da-76db81f20764-serving-cert\") pod \"etcd-operator-b45778765-xkmcq\" (UID: \"f6375f12-a03e-438e-96da-76db81f20764\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xkmcq" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.602001 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d6c199ad-3068-41c9-b8df-8f6b889a8db8-tmpfs\") pod \"packageserver-d55dfcdfc-tzc84\" (UID: 
\"d6c199ad-3068-41c9-b8df-8f6b889a8db8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzc84" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.602029 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fv9f\" (UniqueName: \"kubernetes.io/projected/4652215a-081a-4b64-aa40-1508e18e8a15-kube-api-access-7fv9f\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.602049 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/79b12cf8-01f4-4c3d-93ba-52ba4eaff6ec-srv-cert\") pod \"olm-operator-6b444d44fb-sw8gk\" (UID: \"79b12cf8-01f4-4c3d-93ba-52ba4eaff6ec\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw8gk" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.602064 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0554732d-fd39-49f9-a98a-8daab6de9795-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lgdbg\" (UID: \"0554732d-fd39-49f9-a98a-8daab6de9795\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lgdbg" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.602081 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/79b12cf8-01f4-4c3d-93ba-52ba4eaff6ec-profile-collector-cert\") pod \"olm-operator-6b444d44fb-sw8gk\" (UID: \"79b12cf8-01f4-4c3d-93ba-52ba4eaff6ec\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw8gk" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.602099 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrk6j\" (UniqueName: \"kubernetes.io/projected/c3af2ceb-1e23-4887-b647-06b9e0466f1a-kube-api-access-wrk6j\") pod \"machine-config-controller-84d6567774-tbhzh\" (UID: \"c3af2ceb-1e23-4887-b647-06b9e0466f1a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tbhzh" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.602187 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c0a08a9-3732-4ba8-b716-3d3e3781e2ca-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f9gdf\" (UID: \"1c0a08a9-3732-4ba8-b716-3d3e3781e2ca\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f9gdf" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.602222 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/40098fa6-978c-4d83-920e-ca922d2fbefb-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zh8xk\" (UID: \"40098fa6-978c-4d83-920e-ca922d2fbefb\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zh8xk" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.602247 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/84d43c75-0f92-433c-a555-6d854b8ff0c4-signing-cabundle\") pod \"service-ca-9c57cc56f-9spjq\" (UID: \"84d43c75-0f92-433c-a555-6d854b8ff0c4\") " pod="openshift-service-ca/service-ca-9c57cc56f-9spjq" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.602297 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d7c261e-3e52-4595-be2a-23f9f79fa197-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-blqhv\" (UID: \"4d7c261e-3e52-4595-be2a-23f9f79fa197\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-blqhv" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.602322 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f6375f12-a03e-438e-96da-76db81f20764-etcd-client\") pod \"etcd-operator-b45778765-xkmcq\" (UID: \"f6375f12-a03e-438e-96da-76db81f20764\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xkmcq" Dec 02 23:00:01 crc kubenswrapper[4903]: E1202 23:00:01.602343 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:02.102331173 +0000 UTC m=+140.810885456 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.602375 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v6x9\" (UniqueName: \"kubernetes.io/projected/84d43c75-0f92-433c-a555-6d854b8ff0c4-kube-api-access-7v6x9\") pod \"service-ca-9c57cc56f-9spjq\" (UID: \"84d43c75-0f92-433c-a555-6d854b8ff0c4\") " pod="openshift-service-ca/service-ca-9c57cc56f-9spjq" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.602409 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/029452cc-db9a-4761-b35d-17e7b11d6f84-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lq9hn\" (UID: \"029452cc-db9a-4761-b35d-17e7b11d6f84\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lq9hn" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.602428 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c0a08a9-3732-4ba8-b716-3d3e3781e2ca-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f9gdf\" (UID: \"1c0a08a9-3732-4ba8-b716-3d3e3781e2ca\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f9gdf" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.602446 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkqjc\" (UniqueName: \"kubernetes.io/projected/029452cc-db9a-4761-b35d-17e7b11d6f84-kube-api-access-dkqjc\") pod 
\"ingress-operator-5b745b69d9-lq9hn\" (UID: \"029452cc-db9a-4761-b35d-17e7b11d6f84\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lq9hn" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.602494 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6375f12-a03e-438e-96da-76db81f20764-config\") pod \"etcd-operator-b45778765-xkmcq\" (UID: \"f6375f12-a03e-438e-96da-76db81f20764\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xkmcq" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.602538 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f6375f12-a03e-438e-96da-76db81f20764-etcd-service-ca\") pod \"etcd-operator-b45778765-xkmcq\" (UID: \"f6375f12-a03e-438e-96da-76db81f20764\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xkmcq" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.602580 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/029452cc-db9a-4761-b35d-17e7b11d6f84-trusted-ca\") pod \"ingress-operator-5b745b69d9-lq9hn\" (UID: \"029452cc-db9a-4761-b35d-17e7b11d6f84\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lq9hn" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.602599 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4652215a-081a-4b64-aa40-1508e18e8a15-registry-tls\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.602669 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4652215a-081a-4b64-aa40-1508e18e8a15-registry-certificates\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.602694 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6hwx\" (UniqueName: \"kubernetes.io/projected/79b12cf8-01f4-4c3d-93ba-52ba4eaff6ec-kube-api-access-x6hwx\") pod \"olm-operator-6b444d44fb-sw8gk\" (UID: \"79b12cf8-01f4-4c3d-93ba-52ba4eaff6ec\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw8gk" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.602715 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4652215a-081a-4b64-aa40-1508e18e8a15-ca-trust-extracted\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.602730 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4652215a-081a-4b64-aa40-1508e18e8a15-trusted-ca\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:01 crc 
kubenswrapper[4903]: I1202 23:00:01.602771 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcpcz\" (UniqueName: \"kubernetes.io/projected/d6c199ad-3068-41c9-b8df-8f6b889a8db8-kube-api-access-wcpcz\") pod \"packageserver-d55dfcdfc-tzc84\" (UID: \"d6c199ad-3068-41c9-b8df-8f6b889a8db8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzc84" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.602788 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/84d43c75-0f92-433c-a555-6d854b8ff0c4-signing-key\") pod \"service-ca-9c57cc56f-9spjq\" (UID: \"84d43c75-0f92-433c-a555-6d854b8ff0c4\") " pod="openshift-service-ca/service-ca-9c57cc56f-9spjq" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.602805 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4652215a-081a-4b64-aa40-1508e18e8a15-installation-pull-secrets\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.602821 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/029452cc-db9a-4761-b35d-17e7b11d6f84-metrics-tls\") pod \"ingress-operator-5b745b69d9-lq9hn\" (UID: \"029452cc-db9a-4761-b35d-17e7b11d6f84\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lq9hn" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.602846 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d7c261e-3e52-4595-be2a-23f9f79fa197-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-blqhv\" (UID: \"4d7c261e-3e52-4595-be2a-23f9f79fa197\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-blqhv" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.602863 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c3af2ceb-1e23-4887-b647-06b9e0466f1a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tbhzh\" (UID: \"c3af2ceb-1e23-4887-b647-06b9e0466f1a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tbhzh" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.602883 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd4mm\" (UniqueName: \"kubernetes.io/projected/40098fa6-978c-4d83-920e-ca922d2fbefb-kube-api-access-hd4mm\") pod \"package-server-manager-789f6589d5-zh8xk\" (UID: \"40098fa6-978c-4d83-920e-ca922d2fbefb\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zh8xk" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.602900 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f6375f12-a03e-438e-96da-76db81f20764-etcd-ca\") pod \"etcd-operator-b45778765-xkmcq\" (UID: \"f6375f12-a03e-438e-96da-76db81f20764\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xkmcq" Dec 02 23:00:01 crc 
kubenswrapper[4903]: I1202 23:00:01.602937 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d6c199ad-3068-41c9-b8df-8f6b889a8db8-webhook-cert\") pod \"packageserver-d55dfcdfc-tzc84\" (UID: \"d6c199ad-3068-41c9-b8df-8f6b889a8db8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzc84" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.602964 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4652215a-081a-4b64-aa40-1508e18e8a15-bound-sa-token\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.602983 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz67d\" (UniqueName: \"kubernetes.io/projected/0554732d-fd39-49f9-a98a-8daab6de9795-kube-api-access-dz67d\") pod \"multus-admission-controller-857f4d67dd-lgdbg\" (UID: \"0554732d-fd39-49f9-a98a-8daab6de9795\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lgdbg" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.603000 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2lw2\" (UniqueName: \"kubernetes.io/projected/f6375f12-a03e-438e-96da-76db81f20764-kube-api-access-x2lw2\") pod \"etcd-operator-b45778765-xkmcq\" (UID: \"f6375f12-a03e-438e-96da-76db81f20764\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xkmcq" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.603540 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4652215a-081a-4b64-aa40-1508e18e8a15-registry-certificates\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.604372 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4652215a-081a-4b64-aa40-1508e18e8a15-ca-trust-extracted\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.606071 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4652215a-081a-4b64-aa40-1508e18e8a15-trusted-ca\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.611700 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4652215a-081a-4b64-aa40-1508e18e8a15-registry-tls\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.613706 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/4652215a-081a-4b64-aa40-1508e18e8a15-installation-pull-secrets\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.625169 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4652215a-081a-4b64-aa40-1508e18e8a15-bound-sa-token\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.628366 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fv9f\" (UniqueName: \"kubernetes.io/projected/4652215a-081a-4b64-aa40-1508e18e8a15-kube-api-access-7fv9f\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.704373 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.704616 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.704647 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b2342a1a-f6f4-4b83-8de3-444e3b51642c-proxy-tls\") pod \"machine-config-operator-74547568cd-s2ftx\" (UID: \"b2342a1a-f6f4-4b83-8de3-444e3b51642c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2ftx" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.704697 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/637fedff-192e-4220-958e-ee458ad15bf2-serving-cert\") pod \"service-ca-operator-777779d784-lsg46\" (UID: \"637fedff-192e-4220-958e-ee458ad15bf2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lsg46" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.704744 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6hwx\" (UniqueName: \"kubernetes.io/projected/79b12cf8-01f4-4c3d-93ba-52ba4eaff6ec-kube-api-access-x6hwx\") pod \"olm-operator-6b444d44fb-sw8gk\" (UID: \"79b12cf8-01f4-4c3d-93ba-52ba4eaff6ec\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw8gk" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.704766 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0ca6d54e-b1e8-4be3-8690-7be6ebd279a9-registration-dir\") pod \"csi-hostpathplugin-n67tl\" (UID: 
\"0ca6d54e-b1e8-4be3-8690-7be6ebd279a9\") " pod="hostpath-provisioner/csi-hostpathplugin-n67tl" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.704799 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.704881 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f7e55ca-a590-4992-8402-337ef0a42dcc-config-volume\") pod \"dns-default-wwsfb\" (UID: \"3f7e55ca-a590-4992-8402-337ef0a42dcc\") " pod="openshift-dns/dns-default-wwsfb" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.705045 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/66aef343-d026-4ecd-94dc-2575070f7edc-certs\") pod \"machine-config-server-hhjwp\" (UID: \"66aef343-d026-4ecd-94dc-2575070f7edc\") " pod="openshift-machine-config-operator/machine-config-server-hhjwp" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.705082 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcpcz\" (UniqueName: \"kubernetes.io/projected/d6c199ad-3068-41c9-b8df-8f6b889a8db8-kube-api-access-wcpcz\") pod \"packageserver-d55dfcdfc-tzc84\" (UID: \"d6c199ad-3068-41c9-b8df-8f6b889a8db8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzc84" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.705105 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/84d43c75-0f92-433c-a555-6d854b8ff0c4-signing-key\") pod \"service-ca-9c57cc56f-9spjq\" (UID: \"84d43c75-0f92-433c-a555-6d854b8ff0c4\") " pod="openshift-service-ca/service-ca-9c57cc56f-9spjq" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.705183 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/029452cc-db9a-4761-b35d-17e7b11d6f84-metrics-tls\") pod \"ingress-operator-5b745b69d9-lq9hn\" (UID: \"029452cc-db9a-4761-b35d-17e7b11d6f84\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lq9hn" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.705211 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.705237 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5rt5\" (UniqueName: \"kubernetes.io/projected/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-kube-api-access-f5rt5\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.705282 
4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d7c261e-3e52-4595-be2a-23f9f79fa197-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-blqhv\" (UID: \"4d7c261e-3e52-4595-be2a-23f9f79fa197\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-blqhv" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.705315 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c69b7653-79c2-4182-b7b2-26aef84a054d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-p7pd8\" (UID: \"c69b7653-79c2-4182-b7b2-26aef84a054d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7pd8" Dec 02 23:00:01 crc kubenswrapper[4903]: E1202 23:00:01.705405 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:02.205375848 +0000 UTC m=+140.913930181 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.705441 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c3af2ceb-1e23-4887-b647-06b9e0466f1a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tbhzh\" (UID: \"c3af2ceb-1e23-4887-b647-06b9e0466f1a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tbhzh" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.705467 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0ca6d54e-b1e8-4be3-8690-7be6ebd279a9-mountpoint-dir\") pod \"csi-hostpathplugin-n67tl\" (UID: \"0ca6d54e-b1e8-4be3-8690-7be6ebd279a9\") " pod="hostpath-provisioner/csi-hostpathplugin-n67tl" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.705501 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd4mm\" (UniqueName: \"kubernetes.io/projected/40098fa6-978c-4d83-920e-ca922d2fbefb-kube-api-access-hd4mm\") pod \"package-server-manager-789f6589d5-zh8xk\" (UID: \"40098fa6-978c-4d83-920e-ca922d2fbefb\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zh8xk" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.705554 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs7rd\" (UniqueName: \"kubernetes.io/projected/3f7e55ca-a590-4992-8402-337ef0a42dcc-kube-api-access-hs7rd\") pod \"dns-default-wwsfb\" (UID: \"3f7e55ca-a590-4992-8402-337ef0a42dcc\") " pod="openshift-dns/dns-default-wwsfb" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.705594 4903 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c69b7653-79c2-4182-b7b2-26aef84a054d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-p7pd8\" (UID: \"c69b7653-79c2-4182-b7b2-26aef84a054d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7pd8" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.705632 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f6375f12-a03e-438e-96da-76db81f20764-etcd-ca\") pod \"etcd-operator-b45778765-xkmcq\" (UID: \"f6375f12-a03e-438e-96da-76db81f20764\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xkmcq" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.705676 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d6c199ad-3068-41c9-b8df-8f6b889a8db8-webhook-cert\") pod \"packageserver-d55dfcdfc-tzc84\" (UID: \"d6c199ad-3068-41c9-b8df-8f6b889a8db8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzc84" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.707785 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c3af2ceb-1e23-4887-b647-06b9e0466f1a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tbhzh\" (UID: \"c3af2ceb-1e23-4887-b647-06b9e0466f1a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tbhzh" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.710947 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dz67d\" (UniqueName: \"kubernetes.io/projected/0554732d-fd39-49f9-a98a-8daab6de9795-kube-api-access-dz67d\") pod \"multus-admission-controller-857f4d67dd-lgdbg\" (UID: \"0554732d-fd39-49f9-a98a-8daab6de9795\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lgdbg" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.710990 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2lw2\" (UniqueName: \"kubernetes.io/projected/f6375f12-a03e-438e-96da-76db81f20764-kube-api-access-x2lw2\") pod \"etcd-operator-b45778765-xkmcq\" (UID: \"f6375f12-a03e-438e-96da-76db81f20764\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xkmcq" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.711529 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f7e55ca-a590-4992-8402-337ef0a42dcc-metrics-tls\") pod \"dns-default-wwsfb\" (UID: \"3f7e55ca-a590-4992-8402-337ef0a42dcc\") " pod="openshift-dns/dns-default-wwsfb" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.711614 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0ca6d54e-b1e8-4be3-8690-7be6ebd279a9-plugins-dir\") pod \"csi-hostpathplugin-n67tl\" (UID: \"0ca6d54e-b1e8-4be3-8690-7be6ebd279a9\") " pod="hostpath-provisioner/csi-hostpathplugin-n67tl" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.711666 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.711726 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d7c261e-3e52-4595-be2a-23f9f79fa197-config\") pod \"kube-controller-manager-operator-78b949d7b-blqhv\" (UID: \"4d7c261e-3e52-4595-be2a-23f9f79fa197\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-blqhv" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.711750 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.711864 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dd1b463e-15a8-425d-872e-d1f9683747c2-config-volume\") pod \"collect-profiles-29411940-vlzpv\" (UID: \"dd1b463e-15a8-425d-872e-d1f9683747c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411940-vlzpv" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.711905 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d6c199ad-3068-41c9-b8df-8f6b889a8db8-apiservice-cert\") pod \"packageserver-d55dfcdfc-tzc84\" (UID: \"d6c199ad-3068-41c9-b8df-8f6b889a8db8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzc84" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.711929 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c0a08a9-3732-4ba8-b716-3d3e3781e2ca-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f9gdf\" (UID: \"1c0a08a9-3732-4ba8-b716-3d3e3781e2ca\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f9gdf" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.711954 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c3af2ceb-1e23-4887-b647-06b9e0466f1a-proxy-tls\") pod \"machine-config-controller-84d6567774-tbhzh\" (UID: \"c3af2ceb-1e23-4887-b647-06b9e0466f1a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tbhzh" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.711990 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6375f12-a03e-438e-96da-76db81f20764-serving-cert\") pod \"etcd-operator-b45778765-xkmcq\" (UID: \"f6375f12-a03e-438e-96da-76db81f20764\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xkmcq" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.712125 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2fm4\" (UniqueName: 
\"kubernetes.io/projected/c69b7653-79c2-4182-b7b2-26aef84a054d-kube-api-access-j2fm4\") pod \"kube-storage-version-migrator-operator-b67b599dd-p7pd8\" (UID: \"c69b7653-79c2-4182-b7b2-26aef84a054d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7pd8" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.712160 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d6c199ad-3068-41c9-b8df-8f6b889a8db8-tmpfs\") pod \"packageserver-d55dfcdfc-tzc84\" (UID: \"d6c199ad-3068-41c9-b8df-8f6b889a8db8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzc84" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.713211 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d7c261e-3e52-4595-be2a-23f9f79fa197-config\") pod \"kube-controller-manager-operator-78b949d7b-blqhv\" (UID: \"4d7c261e-3e52-4595-be2a-23f9f79fa197\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-blqhv" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.713238 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d6c199ad-3068-41c9-b8df-8f6b889a8db8-webhook-cert\") pod \"packageserver-d55dfcdfc-tzc84\" (UID: \"d6c199ad-3068-41c9-b8df-8f6b889a8db8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzc84" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.714532 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f6375f12-a03e-438e-96da-76db81f20764-etcd-ca\") pod \"etcd-operator-b45778765-xkmcq\" (UID: \"f6375f12-a03e-438e-96da-76db81f20764\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xkmcq" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.719018 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc8wq\" (UniqueName: \"kubernetes.io/projected/0ca6d54e-b1e8-4be3-8690-7be6ebd279a9-kube-api-access-kc8wq\") pod \"csi-hostpathplugin-n67tl\" (UID: \"0ca6d54e-b1e8-4be3-8690-7be6ebd279a9\") " pod="hostpath-provisioner/csi-hostpathplugin-n67tl" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.719067 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bsk8\" (UniqueName: \"kubernetes.io/projected/4b25a40b-8bba-423f-b3fa-5b58e3d18423-kube-api-access-8bsk8\") pod \"marketplace-operator-79b997595-vvxf5\" (UID: \"4b25a40b-8bba-423f-b3fa-5b58e3d18423\") " pod="openshift-marketplace/marketplace-operator-79b997595-vvxf5" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.722038 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0ca6d54e-b1e8-4be3-8690-7be6ebd279a9-csi-data-dir\") pod \"csi-hostpathplugin-n67tl\" (UID: \"0ca6d54e-b1e8-4be3-8690-7be6ebd279a9\") " pod="hostpath-provisioner/csi-hostpathplugin-n67tl" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.722085 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl4vj\" (UniqueName: \"kubernetes.io/projected/c186ee58-2d5d-4c2e-aa90-912192529da9-kube-api-access-hl4vj\") pod \"ingress-canary-m4d4c\" (UID: 
\"c186ee58-2d5d-4c2e-aa90-912192529da9\") " pod="openshift-ingress-canary/ingress-canary-m4d4c" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.722112 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.722553 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcx2r\" (UniqueName: \"kubernetes.io/projected/ea93257f-fe2e-4062-9b5d-3cd6f53f6fdc-kube-api-access-jcx2r\") pod \"control-plane-machine-set-operator-78cbb6b69f-gdr6w\" (UID: \"ea93257f-fe2e-4062-9b5d-3cd6f53f6fdc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gdr6w" Dec 02 23:00:01 crc kubenswrapper[4903]: E1202 23:00:01.722585 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:02.222561352 +0000 UTC m=+140.931115755 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.722945 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/79b12cf8-01f4-4c3d-93ba-52ba4eaff6ec-srv-cert\") pod \"olm-operator-6b444d44fb-sw8gk\" (UID: \"79b12cf8-01f4-4c3d-93ba-52ba4eaff6ec\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw8gk" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.722996 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0554732d-fd39-49f9-a98a-8daab6de9795-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lgdbg\" (UID: \"0554732d-fd39-49f9-a98a-8daab6de9795\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lgdbg" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.723024 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c0a08a9-3732-4ba8-b716-3d3e3781e2ca-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f9gdf\" (UID: \"1c0a08a9-3732-4ba8-b716-3d3e3781e2ca\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f9gdf" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.723485 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/029452cc-db9a-4761-b35d-17e7b11d6f84-metrics-tls\") pod \"ingress-operator-5b745b69d9-lq9hn\" (UID: \"029452cc-db9a-4761-b35d-17e7b11d6f84\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lq9hn" Dec 02 23:00:01 crc 
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.722945 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/79b12cf8-01f4-4c3d-93ba-52ba4eaff6ec-srv-cert\") pod \"olm-operator-6b444d44fb-sw8gk\" (UID: \"79b12cf8-01f4-4c3d-93ba-52ba4eaff6ec\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw8gk"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.722996 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0554732d-fd39-49f9-a98a-8daab6de9795-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lgdbg\" (UID: \"0554732d-fd39-49f9-a98a-8daab6de9795\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lgdbg"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.723024 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c0a08a9-3732-4ba8-b716-3d3e3781e2ca-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f9gdf\" (UID: \"1c0a08a9-3732-4ba8-b716-3d3e3781e2ca\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f9gdf"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.723485 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/029452cc-db9a-4761-b35d-17e7b11d6f84-metrics-tls\") pod \"ingress-operator-5b745b69d9-lq9hn\" (UID: \"029452cc-db9a-4761-b35d-17e7b11d6f84\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lq9hn"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.724330 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c3af2ceb-1e23-4887-b647-06b9e0466f1a-proxy-tls\") pod \"machine-config-controller-84d6567774-tbhzh\" (UID: \"c3af2ceb-1e23-4887-b647-06b9e0466f1a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tbhzh"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.724792 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd4mm\" (UniqueName: \"kubernetes.io/projected/40098fa6-978c-4d83-920e-ca922d2fbefb-kube-api-access-hd4mm\") pod \"package-server-manager-789f6589d5-zh8xk\" (UID: \"40098fa6-978c-4d83-920e-ca922d2fbefb\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zh8xk"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.729281 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/79b12cf8-01f4-4c3d-93ba-52ba4eaff6ec-profile-collector-cert\") pod \"olm-operator-6b444d44fb-sw8gk\" (UID: \"79b12cf8-01f4-4c3d-93ba-52ba4eaff6ec\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw8gk"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.729356 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrk6j\" (UniqueName: \"kubernetes.io/projected/c3af2ceb-1e23-4887-b647-06b9e0466f1a-kube-api-access-wrk6j\") pod \"machine-config-controller-84d6567774-tbhzh\" (UID: \"c3af2ceb-1e23-4887-b647-06b9e0466f1a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tbhzh"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.729422 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/66aef343-d026-4ecd-94dc-2575070f7edc-node-bootstrap-token\") pod \"machine-config-server-hhjwp\" (UID: \"66aef343-d026-4ecd-94dc-2575070f7edc\") " pod="openshift-machine-config-operator/machine-config-server-hhjwp"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.730444 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0ca6d54e-b1e8-4be3-8690-7be6ebd279a9-socket-dir\") pod \"csi-hostpathplugin-n67tl\" (UID: \"0ca6d54e-b1e8-4be3-8690-7be6ebd279a9\") " pod="hostpath-provisioner/csi-hostpathplugin-n67tl"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.730511 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c0a08a9-3732-4ba8-b716-3d3e3781e2ca-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f9gdf\" (UID: \"1c0a08a9-3732-4ba8-b716-3d3e3781e2ca\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f9gdf"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.730564 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/40098fa6-978c-4d83-920e-ca922d2fbefb-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zh8xk\" (UID: \"40098fa6-978c-4d83-920e-ca922d2fbefb\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zh8xk"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.730594 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnczl\" (UniqueName: \"kubernetes.io/projected/dd1b463e-15a8-425d-872e-d1f9683747c2-kube-api-access-jnczl\") pod \"collect-profiles-29411940-vlzpv\" (UID: \"dd1b463e-15a8-425d-872e-d1f9683747c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411940-vlzpv"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.730641 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/84d43c75-0f92-433c-a555-6d854b8ff0c4-signing-cabundle\") pod \"service-ca-9c57cc56f-9spjq\" (UID: \"84d43c75-0f92-433c-a555-6d854b8ff0c4\") " pod="openshift-service-ca/service-ca-9c57cc56f-9spjq"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.730735 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.730761 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.730810 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d7c261e-3e52-4595-be2a-23f9f79fa197-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-blqhv\" (UID: \"4d7c261e-3e52-4595-be2a-23f9f79fa197\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-blqhv"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.730835 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.730875 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dd1b463e-15a8-425d-872e-d1f9683747c2-secret-volume\") pod \"collect-profiles-29411940-vlzpv\" (UID: \"dd1b463e-15a8-425d-872e-d1f9683747c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411940-vlzpv"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.730933 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c186ee58-2d5d-4c2e-aa90-912192529da9-cert\") pod \"ingress-canary-m4d4c\" (UID: \"c186ee58-2d5d-4c2e-aa90-912192529da9\") " pod="openshift-ingress-canary/ingress-canary-m4d4c"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.730937 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2lw2\" (UniqueName: \"kubernetes.io/projected/f6375f12-a03e-438e-96da-76db81f20764-kube-api-access-x2lw2\") pod \"etcd-operator-b45778765-xkmcq\" (UID: \"f6375f12-a03e-438e-96da-76db81f20764\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xkmcq"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.730953 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.730996 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v6x9\" (UniqueName: \"kubernetes.io/projected/84d43c75-0f92-433c-a555-6d854b8ff0c4-kube-api-access-7v6x9\") pod \"service-ca-9c57cc56f-9spjq\" (UID: \"84d43c75-0f92-433c-a555-6d854b8ff0c4\") " pod="openshift-service-ca/service-ca-9c57cc56f-9spjq"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.731019 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f6375f12-a03e-438e-96da-76db81f20764-etcd-client\") pod \"etcd-operator-b45778765-xkmcq\" (UID: \"f6375f12-a03e-438e-96da-76db81f20764\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xkmcq"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.731525 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcvnf\" (UniqueName: \"kubernetes.io/projected/b2342a1a-f6f4-4b83-8de3-444e3b51642c-kube-api-access-rcvnf\") pod \"machine-config-operator-74547568cd-s2ftx\" (UID: \"b2342a1a-f6f4-4b83-8de3-444e3b51642c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2ftx"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.731611 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/029452cc-db9a-4761-b35d-17e7b11d6f84-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lq9hn\" (UID: \"029452cc-db9a-4761-b35d-17e7b11d6f84\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lq9hn"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.731665 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3d5c817-d25d-4a5e-8878-8080aaafa956-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-cttnd\" (UID: \"b3d5c817-d25d-4a5e-8878-8080aaafa956\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cttnd"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.731963 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d7c261e-3e52-4595-be2a-23f9f79fa197-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-blqhv\" (UID: \"4d7c261e-3e52-4595-be2a-23f9f79fa197\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-blqhv"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.732344 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c0a08a9-3732-4ba8-b716-3d3e3781e2ca-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f9gdf\" (UID: \"1c0a08a9-3732-4ba8-b716-3d3e3781e2ca\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f9gdf"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.732420 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b25a40b-8bba-423f-b3fa-5b58e3d18423-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vvxf5\" (UID: \"4b25a40b-8bba-423f-b3fa-5b58e3d18423\") " pod="openshift-marketplace/marketplace-operator-79b997595-vvxf5"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.732455 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58g4q\" (UniqueName: \"kubernetes.io/projected/66aef343-d026-4ecd-94dc-2575070f7edc-kube-api-access-58g4q\") pod \"machine-config-server-hhjwp\" (UID: \"66aef343-d026-4ecd-94dc-2575070f7edc\") " pod="openshift-machine-config-operator/machine-config-server-hhjwp"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.732477 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkqjc\" (UniqueName: \"kubernetes.io/projected/029452cc-db9a-4761-b35d-17e7b11d6f84-kube-api-access-dkqjc\") pod \"ingress-operator-5b745b69d9-lq9hn\" (UID: \"029452cc-db9a-4761-b35d-17e7b11d6f84\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lq9hn"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.732494 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b3d5c817-d25d-4a5e-8878-8080aaafa956-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-cttnd\" (UID: \"b3d5c817-d25d-4a5e-8878-8080aaafa956\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cttnd"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.732554 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6375f12-a03e-438e-96da-76db81f20764-config\") pod \"etcd-operator-b45778765-xkmcq\" (UID: \"f6375f12-a03e-438e-96da-76db81f20764\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xkmcq"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.732588 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4b25a40b-8bba-423f-b3fa-5b58e3d18423-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vvxf5\" (UID: \"4b25a40b-8bba-423f-b3fa-5b58e3d18423\") " pod="openshift-marketplace/marketplace-operator-79b997595-vvxf5"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.732614 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wrwp\" (UniqueName: \"kubernetes.io/projected/bfcecdcd-dc1a-4dbb-9225-2bd922502fcf-kube-api-access-9wrwp\") pod \"migrator-59844c95c7-svc7d\" (UID: \"bfcecdcd-dc1a-4dbb-9225-2bd922502fcf\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-svc7d"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.732744 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/637fedff-192e-4220-958e-ee458ad15bf2-config\") pod \"service-ca-operator-777779d784-lsg46\" (UID: \"637fedff-192e-4220-958e-ee458ad15bf2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lsg46"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.732799 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b2342a1a-f6f4-4b83-8de3-444e3b51642c-images\") pod \"machine-config-operator-74547568cd-s2ftx\" (UID: \"b2342a1a-f6f4-4b83-8de3-444e3b51642c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2ftx"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.732823 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.732865 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ea93257f-fe2e-4062-9b5d-3cd6f53f6fdc-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-gdr6w\" (UID: \"ea93257f-fe2e-4062-9b5d-3cd6f53f6fdc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gdr6w"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.732994 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f6375f12-a03e-438e-96da-76db81f20764-etcd-service-ca\") pod \"etcd-operator-b45778765-xkmcq\" (UID: \"f6375f12-a03e-438e-96da-76db81f20764\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xkmcq"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.733020 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-audit-policies\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.733043 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-audit-dir\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.733081 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.733125 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxrgt\" (UniqueName: \"kubernetes.io/projected/637fedff-192e-4220-958e-ee458ad15bf2-kube-api-access-zxrgt\") pod \"service-ca-operator-777779d784-lsg46\" (UID: \"637fedff-192e-4220-958e-ee458ad15bf2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lsg46"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.733151 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/029452cc-db9a-4761-b35d-17e7b11d6f84-trusted-ca\") pod \"ingress-operator-5b745b69d9-lq9hn\" (UID: \"029452cc-db9a-4761-b35d-17e7b11d6f84\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lq9hn"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.733170 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b2342a1a-f6f4-4b83-8de3-444e3b51642c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-s2ftx\" (UID: \"b2342a1a-f6f4-4b83-8de3-444e3b51642c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2ftx"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.733204 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3d5c817-d25d-4a5e-8878-8080aaafa956-config\") pod \"kube-apiserver-operator-766d6c64bb-cttnd\" (UID: \"b3d5c817-d25d-4a5e-8878-8080aaafa956\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cttnd"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.734116 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c0a08a9-3732-4ba8-b716-3d3e3781e2ca-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f9gdf\" (UID: \"1c0a08a9-3732-4ba8-b716-3d3e3781e2ca\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f9gdf"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.734382 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6375f12-a03e-438e-96da-76db81f20764-config\") pod \"etcd-operator-b45778765-xkmcq\" (UID: \"f6375f12-a03e-438e-96da-76db81f20764\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xkmcq"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.734962 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz67d\" (UniqueName: \"kubernetes.io/projected/0554732d-fd39-49f9-a98a-8daab6de9795-kube-api-access-dz67d\") pod \"multus-admission-controller-857f4d67dd-lgdbg\" (UID: \"0554732d-fd39-49f9-a98a-8daab6de9795\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lgdbg"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.735099 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/79b12cf8-01f4-4c3d-93ba-52ba4eaff6ec-profile-collector-cert\") pod \"olm-operator-6b444d44fb-sw8gk\" (UID: \"79b12cf8-01f4-4c3d-93ba-52ba4eaff6ec\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw8gk"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.736093 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f6375f12-a03e-438e-96da-76db81f20764-etcd-service-ca\") pod \"etcd-operator-b45778765-xkmcq\" (UID: \"f6375f12-a03e-438e-96da-76db81f20764\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xkmcq"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.736438 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/029452cc-db9a-4761-b35d-17e7b11d6f84-trusted-ca\") pod \"ingress-operator-5b745b69d9-lq9hn\" (UID: \"029452cc-db9a-4761-b35d-17e7b11d6f84\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lq9hn"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.736628 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6hwx\" (UniqueName: \"kubernetes.io/projected/79b12cf8-01f4-4c3d-93ba-52ba4eaff6ec-kube-api-access-x6hwx\") pod \"olm-operator-6b444d44fb-sw8gk\" (UID: \"79b12cf8-01f4-4c3d-93ba-52ba4eaff6ec\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw8gk"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.737101 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/40098fa6-978c-4d83-920e-ca922d2fbefb-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zh8xk\" (UID: \"40098fa6-978c-4d83-920e-ca922d2fbefb\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zh8xk"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.740189 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d7c261e-3e52-4595-be2a-23f9f79fa197-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-blqhv\" (UID: \"4d7c261e-3e52-4595-be2a-23f9f79fa197\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-blqhv"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.748777 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrk6j\" (UniqueName: \"kubernetes.io/projected/c3af2ceb-1e23-4887-b647-06b9e0466f1a-kube-api-access-wrk6j\") pod \"machine-config-controller-84d6567774-tbhzh\" (UID: \"c3af2ceb-1e23-4887-b647-06b9e0466f1a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tbhzh"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.753048 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c0a08a9-3732-4ba8-b716-3d3e3781e2ca-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f9gdf\" (UID: \"1c0a08a9-3732-4ba8-b716-3d3e3781e2ca\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f9gdf"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.768950 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0554732d-fd39-49f9-a98a-8daab6de9795-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lgdbg\" (UID: \"0554732d-fd39-49f9-a98a-8daab6de9795\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lgdbg"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.770141 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/79b12cf8-01f4-4c3d-93ba-52ba4eaff6ec-srv-cert\") pod \"olm-operator-6b444d44fb-sw8gk\" (UID: \"79b12cf8-01f4-4c3d-93ba-52ba4eaff6ec\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw8gk"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.833944 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.834201 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-audit-dir\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.834222 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.834248 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b2342a1a-f6f4-4b83-8de3-444e3b51642c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-s2ftx\" (UID: \"b2342a1a-f6f4-4b83-8de3-444e3b51642c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2ftx"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.834271 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxrgt\" (UniqueName: \"kubernetes.io/projected/637fedff-192e-4220-958e-ee458ad15bf2-kube-api-access-zxrgt\") pod \"service-ca-operator-777779d784-lsg46\" (UID: \"637fedff-192e-4220-958e-ee458ad15bf2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lsg46"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.834287 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3d5c817-d25d-4a5e-8878-8080aaafa956-config\") pod \"kube-apiserver-operator-766d6c64bb-cttnd\" (UID: \"b3d5c817-d25d-4a5e-8878-8080aaafa956\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cttnd"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.834302 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.834322 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b2342a1a-f6f4-4b83-8de3-444e3b51642c-proxy-tls\") pod \"machine-config-operator-74547568cd-s2ftx\" (UID: \"b2342a1a-f6f4-4b83-8de3-444e3b51642c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2ftx"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.834347 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/637fedff-192e-4220-958e-ee458ad15bf2-serving-cert\") pod \"service-ca-operator-777779d784-lsg46\" (UID: \"637fedff-192e-4220-958e-ee458ad15bf2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lsg46"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.834383 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0ca6d54e-b1e8-4be3-8690-7be6ebd279a9-registration-dir\") pod \"csi-hostpathplugin-n67tl\" (UID: \"0ca6d54e-b1e8-4be3-8690-7be6ebd279a9\") " pod="hostpath-provisioner/csi-hostpathplugin-n67tl"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.834400 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.834418 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f7e55ca-a590-4992-8402-337ef0a42dcc-config-volume\") pod \"dns-default-wwsfb\" (UID: \"3f7e55ca-a590-4992-8402-337ef0a42dcc\") " pod="openshift-dns/dns-default-wwsfb"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.834441 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/66aef343-d026-4ecd-94dc-2575070f7edc-certs\") pod \"machine-config-server-hhjwp\" (UID: \"66aef343-d026-4ecd-94dc-2575070f7edc\") " pod="openshift-machine-config-operator/machine-config-server-hhjwp"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.834479 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.834498 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5rt5\" (UniqueName: \"kubernetes.io/projected/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-kube-api-access-f5rt5\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.834519 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0ca6d54e-b1e8-4be3-8690-7be6ebd279a9-mountpoint-dir\") pod \"csi-hostpathplugin-n67tl\" (UID: \"0ca6d54e-b1e8-4be3-8690-7be6ebd279a9\") " pod="hostpath-provisioner/csi-hostpathplugin-n67tl"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.834540 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c69b7653-79c2-4182-b7b2-26aef84a054d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-p7pd8\" (UID: \"c69b7653-79c2-4182-b7b2-26aef84a054d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7pd8"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.834564 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs7rd\" (UniqueName: \"kubernetes.io/projected/3f7e55ca-a590-4992-8402-337ef0a42dcc-kube-api-access-hs7rd\") pod \"dns-default-wwsfb\" (UID: \"3f7e55ca-a590-4992-8402-337ef0a42dcc\") " pod="openshift-dns/dns-default-wwsfb"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.834584 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c69b7653-79c2-4182-b7b2-26aef84a054d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-p7pd8\" (UID: \"c69b7653-79c2-4182-b7b2-26aef84a054d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7pd8"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.834612 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f7e55ca-a590-4992-8402-337ef0a42dcc-metrics-tls\") pod \"dns-default-wwsfb\" (UID: \"3f7e55ca-a590-4992-8402-337ef0a42dcc\") " pod="openshift-dns/dns-default-wwsfb"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.834631 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0ca6d54e-b1e8-4be3-8690-7be6ebd279a9-plugins-dir\") pod \"csi-hostpathplugin-n67tl\" (UID: \"0ca6d54e-b1e8-4be3-8690-7be6ebd279a9\") " pod="hostpath-provisioner/csi-hostpathplugin-n67tl"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.834678 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.834700 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dd1b463e-15a8-425d-872e-d1f9683747c2-config-volume\") pod \"collect-profiles-29411940-vlzpv\" (UID: \"dd1b463e-15a8-425d-872e-d1f9683747c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411940-vlzpv"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.834990 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2fm4\" (UniqueName: \"kubernetes.io/projected/c69b7653-79c2-4182-b7b2-26aef84a054d-kube-api-access-j2fm4\") pod \"kube-storage-version-migrator-operator-b67b599dd-p7pd8\" (UID: \"c69b7653-79c2-4182-b7b2-26aef84a054d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7pd8"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.835117 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc8wq\" (UniqueName: \"kubernetes.io/projected/0ca6d54e-b1e8-4be3-8690-7be6ebd279a9-kube-api-access-kc8wq\") pod \"csi-hostpathplugin-n67tl\" (UID: \"0ca6d54e-b1e8-4be3-8690-7be6ebd279a9\") " pod="hostpath-provisioner/csi-hostpathplugin-n67tl"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.835221 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bsk8\" (UniqueName: \"kubernetes.io/projected/4b25a40b-8bba-423f-b3fa-5b58e3d18423-kube-api-access-8bsk8\") pod \"marketplace-operator-79b997595-vvxf5\" (UID: \"4b25a40b-8bba-423f-b3fa-5b58e3d18423\") " pod="openshift-marketplace/marketplace-operator-79b997595-vvxf5"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.835249 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0ca6d54e-b1e8-4be3-8690-7be6ebd279a9-csi-data-dir\") pod \"csi-hostpathplugin-n67tl\" (UID: \"0ca6d54e-b1e8-4be3-8690-7be6ebd279a9\") " pod="hostpath-provisioner/csi-hostpathplugin-n67tl"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.835278 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl4vj\" (UniqueName: \"kubernetes.io/projected/c186ee58-2d5d-4c2e-aa90-912192529da9-kube-api-access-hl4vj\") pod \"ingress-canary-m4d4c\" (UID: \"c186ee58-2d5d-4c2e-aa90-912192529da9\") " pod="openshift-ingress-canary/ingress-canary-m4d4c"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.835298 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.835324 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcx2r\" (UniqueName: \"kubernetes.io/projected/ea93257f-fe2e-4062-9b5d-3cd6f53f6fdc-kube-api-access-jcx2r\") pod \"control-plane-machine-set-operator-78cbb6b69f-gdr6w\" (UID: \"ea93257f-fe2e-4062-9b5d-3cd6f53f6fdc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gdr6w"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.835346 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0ca6d54e-b1e8-4be3-8690-7be6ebd279a9-socket-dir\") pod \"csi-hostpathplugin-n67tl\" (UID: \"0ca6d54e-b1e8-4be3-8690-7be6ebd279a9\") " pod="hostpath-provisioner/csi-hostpathplugin-n67tl"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.835361 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/66aef343-d026-4ecd-94dc-2575070f7edc-node-bootstrap-token\") pod \"machine-config-server-hhjwp\" (UID: \"66aef343-d026-4ecd-94dc-2575070f7edc\") " pod="openshift-machine-config-operator/machine-config-server-hhjwp"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.835376 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnczl\" (UniqueName: \"kubernetes.io/projected/dd1b463e-15a8-425d-872e-d1f9683747c2-kube-api-access-jnczl\") pod \"collect-profiles-29411940-vlzpv\" (UID: \"dd1b463e-15a8-425d-872e-d1f9683747c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411940-vlzpv"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.835399 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.835414 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.835429 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.835446 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dd1b463e-15a8-425d-872e-d1f9683747c2-secret-volume\") pod \"collect-profiles-29411940-vlzpv\" (UID: \"dd1b463e-15a8-425d-872e-d1f9683747c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411940-vlzpv"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.835462 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c186ee58-2d5d-4c2e-aa90-912192529da9-cert\") pod \"ingress-canary-m4d4c\" (UID: \"c186ee58-2d5d-4c2e-aa90-912192529da9\") " pod="openshift-ingress-canary/ingress-canary-m4d4c"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.835477 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.835521 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcvnf\" (UniqueName: \"kubernetes.io/projected/b2342a1a-f6f4-4b83-8de3-444e3b51642c-kube-api-access-rcvnf\") pod \"machine-config-operator-74547568cd-s2ftx\" (UID: \"b2342a1a-f6f4-4b83-8de3-444e3b51642c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2ftx"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.835539 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3d5c817-d25d-4a5e-8878-8080aaafa956-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-cttnd\" (UID: \"b3d5c817-d25d-4a5e-8878-8080aaafa956\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cttnd"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.835556 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b25a40b-8bba-423f-b3fa-5b58e3d18423-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vvxf5\" (UID: \"4b25a40b-8bba-423f-b3fa-5b58e3d18423\") " pod="openshift-marketplace/marketplace-operator-79b997595-vvxf5"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.835572 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58g4q\" (UniqueName: \"kubernetes.io/projected/66aef343-d026-4ecd-94dc-2575070f7edc-kube-api-access-58g4q\") pod \"machine-config-server-hhjwp\" (UID: \"66aef343-d026-4ecd-94dc-2575070f7edc\") " pod="openshift-machine-config-operator/machine-config-server-hhjwp"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.835591 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b3d5c817-d25d-4a5e-8878-8080aaafa956-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-cttnd\" (UID: \"b3d5c817-d25d-4a5e-8878-8080aaafa956\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cttnd"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.835607 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4b25a40b-8bba-423f-b3fa-5b58e3d18423-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vvxf5\" (UID: \"4b25a40b-8bba-423f-b3fa-5b58e3d18423\") " pod="openshift-marketplace/marketplace-operator-79b997595-vvxf5"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.835622 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wrwp\" (UniqueName: \"kubernetes.io/projected/bfcecdcd-dc1a-4dbb-9225-2bd922502fcf-kube-api-access-9wrwp\") pod \"migrator-59844c95c7-svc7d\" (UID: \"bfcecdcd-dc1a-4dbb-9225-2bd922502fcf\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-svc7d"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.835639 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/637fedff-192e-4220-958e-ee458ad15bf2-config\") pod \"service-ca-operator-777779d784-lsg46\" (UID: \"637fedff-192e-4220-958e-ee458ad15bf2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lsg46"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.835694 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b2342a1a-f6f4-4b83-8de3-444e3b51642c-images\") pod \"machine-config-operator-74547568cd-s2ftx\" (UID: \"b2342a1a-f6f4-4b83-8de3-444e3b51642c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2ftx"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.835711 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.835726 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ea93257f-fe2e-4062-9b5d-3cd6f53f6fdc-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-gdr6w\" (UID: \"ea93257f-fe2e-4062-9b5d-3cd6f53f6fdc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gdr6w"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.835744 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-audit-policies\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff"
Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.837575 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-audit-policies\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff"
Dec 02 23:00:01 crc kubenswrapper[4903]: E1202 23:00:01.837664 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:02.337635061 +0000 UTC m=+141.046189344 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
\"kubernetes.io/host-path/0ca6d54e-b1e8-4be3-8690-7be6ebd279a9-plugins-dir\") pod \"csi-hostpathplugin-n67tl\" (UID: \"0ca6d54e-b1e8-4be3-8690-7be6ebd279a9\") " pod="hostpath-provisioner/csi-hostpathplugin-n67tl" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.843879 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3d5c817-d25d-4a5e-8878-8080aaafa956-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-cttnd\" (UID: \"b3d5c817-d25d-4a5e-8878-8080aaafa956\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cttnd" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.844345 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b2342a1a-f6f4-4b83-8de3-444e3b51642c-images\") pod \"machine-config-operator-74547568cd-s2ftx\" (UID: \"b2342a1a-f6f4-4b83-8de3-444e3b51642c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2ftx" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.844921 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/637fedff-192e-4220-958e-ee458ad15bf2-config\") pod \"service-ca-operator-777779d784-lsg46\" (UID: \"637fedff-192e-4220-958e-ee458ad15bf2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lsg46" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.844962 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0ca6d54e-b1e8-4be3-8690-7be6ebd279a9-registration-dir\") pod \"csi-hostpathplugin-n67tl\" (UID: \"0ca6d54e-b1e8-4be3-8690-7be6ebd279a9\") " pod="hostpath-provisioner/csi-hostpathplugin-n67tl" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.845310 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0ca6d54e-b1e8-4be3-8690-7be6ebd279a9-mountpoint-dir\") pod \"csi-hostpathplugin-n67tl\" (UID: \"0ca6d54e-b1e8-4be3-8690-7be6ebd279a9\") " pod="hostpath-provisioner/csi-hostpathplugin-n67tl" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.847580 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c69b7653-79c2-4182-b7b2-26aef84a054d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-p7pd8\" (UID: \"c69b7653-79c2-4182-b7b2-26aef84a054d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7pd8" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.848307 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c69b7653-79c2-4182-b7b2-26aef84a054d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-p7pd8\" (UID: \"c69b7653-79c2-4182-b7b2-26aef84a054d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7pd8" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.851271 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f7e55ca-a590-4992-8402-337ef0a42dcc-metrics-tls\") pod \"dns-default-wwsfb\" (UID: \"3f7e55ca-a590-4992-8402-337ef0a42dcc\") " pod="openshift-dns/dns-default-wwsfb" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 
23:00:01.852955 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.853683 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dd1b463e-15a8-425d-872e-d1f9683747c2-config-volume\") pod \"collect-profiles-29411940-vlzpv\" (UID: \"dd1b463e-15a8-425d-872e-d1f9683747c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411940-vlzpv" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.854050 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b2342a1a-f6f4-4b83-8de3-444e3b51642c-proxy-tls\") pod \"machine-config-operator-74547568cd-s2ftx\" (UID: \"b2342a1a-f6f4-4b83-8de3-444e3b51642c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2ftx" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.854584 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b25a40b-8bba-423f-b3fa-5b58e3d18423-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vvxf5\" (UID: \"4b25a40b-8bba-423f-b3fa-5b58e3d18423\") " pod="openshift-marketplace/marketplace-operator-79b997595-vvxf5" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.854583 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.855148 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.856017 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.856105 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/637fedff-192e-4220-958e-ee458ad15bf2-serving-cert\") pod \"service-ca-operator-777779d784-lsg46\" (UID: \"637fedff-192e-4220-958e-ee458ad15bf2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lsg46" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.855744 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/3f7e55ca-a590-4992-8402-337ef0a42dcc-config-volume\") pod \"dns-default-wwsfb\" (UID: \"3f7e55ca-a590-4992-8402-337ef0a42dcc\") " pod="openshift-dns/dns-default-wwsfb" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.856303 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dd1b463e-15a8-425d-872e-d1f9683747c2-secret-volume\") pod \"collect-profiles-29411940-vlzpv\" (UID: \"dd1b463e-15a8-425d-872e-d1f9683747c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411940-vlzpv" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.856593 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b2342a1a-f6f4-4b83-8de3-444e3b51642c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-s2ftx\" (UID: \"b2342a1a-f6f4-4b83-8de3-444e3b51642c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2ftx" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.857679 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c186ee58-2d5d-4c2e-aa90-912192529da9-cert\") pod \"ingress-canary-m4d4c\" (UID: \"c186ee58-2d5d-4c2e-aa90-912192529da9\") " pod="openshift-ingress-canary/ingress-canary-m4d4c" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.858070 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ea93257f-fe2e-4062-9b5d-3cd6f53f6fdc-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-gdr6w\" (UID: \"ea93257f-fe2e-4062-9b5d-3cd6f53f6fdc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gdr6w" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.862870 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcx2r\" (UniqueName: \"kubernetes.io/projected/ea93257f-fe2e-4062-9b5d-3cd6f53f6fdc-kube-api-access-jcx2r\") pod \"control-plane-machine-set-operator-78cbb6b69f-gdr6w\" (UID: \"ea93257f-fe2e-4062-9b5d-3cd6f53f6fdc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gdr6w" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.862969 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcvnf\" (UniqueName: \"kubernetes.io/projected/b2342a1a-f6f4-4b83-8de3-444e3b51642c-kube-api-access-rcvnf\") pod \"machine-config-operator-74547568cd-s2ftx\" (UID: \"b2342a1a-f6f4-4b83-8de3-444e3b51642c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2ftx" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.864085 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl4vj\" (UniqueName: \"kubernetes.io/projected/c186ee58-2d5d-4c2e-aa90-912192529da9-kube-api-access-hl4vj\") pod \"ingress-canary-m4d4c\" (UID: \"c186ee58-2d5d-4c2e-aa90-912192529da9\") " pod="openshift-ingress-canary/ingress-canary-m4d4c" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.866353 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnczl\" (UniqueName: \"kubernetes.io/projected/dd1b463e-15a8-425d-872e-d1f9683747c2-kube-api-access-jnczl\") pod \"collect-profiles-29411940-vlzpv\" (UID: \"dd1b463e-15a8-425d-872e-d1f9683747c2\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29411940-vlzpv" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.880245 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-blqhv" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.886263 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs7rd\" (UniqueName: \"kubernetes.io/projected/3f7e55ca-a590-4992-8402-337ef0a42dcc-kube-api-access-hs7rd\") pod \"dns-default-wwsfb\" (UID: \"3f7e55ca-a590-4992-8402-337ef0a42dcc\") " pod="openshift-dns/dns-default-wwsfb" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.890261 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bsk8\" (UniqueName: \"kubernetes.io/projected/4b25a40b-8bba-423f-b3fa-5b58e3d18423-kube-api-access-8bsk8\") pod \"marketplace-operator-79b997595-vvxf5\" (UID: \"4b25a40b-8bba-423f-b3fa-5b58e3d18423\") " pod="openshift-marketplace/marketplace-operator-79b997595-vvxf5" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.892926 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-lgdbg" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.907953 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tbhzh" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.910360 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5rt5\" (UniqueName: \"kubernetes.io/projected/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-kube-api-access-f5rt5\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.910668 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wrwp\" (UniqueName: \"kubernetes.io/projected/bfcecdcd-dc1a-4dbb-9225-2bd922502fcf-kube-api-access-9wrwp\") pod \"migrator-59844c95c7-svc7d\" (UID: \"bfcecdcd-dc1a-4dbb-9225-2bd922502fcf\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-svc7d" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.911390 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/66aef343-d026-4ecd-94dc-2575070f7edc-node-bootstrap-token\") pod \"machine-config-server-hhjwp\" (UID: \"66aef343-d026-4ecd-94dc-2575070f7edc\") " pod="openshift-machine-config-operator/machine-config-server-hhjwp" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.912086 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d6c199ad-3068-41c9-b8df-8f6b889a8db8-tmpfs\") pod \"packageserver-d55dfcdfc-tzc84\" (UID: \"d6c199ad-3068-41c9-b8df-8f6b889a8db8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzc84" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.915051 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/84d43c75-0f92-433c-a555-6d854b8ff0c4-signing-key\") pod \"service-ca-9c57cc56f-9spjq\" (UID: \"84d43c75-0f92-433c-a555-6d854b8ff0c4\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-9spjq" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.917749 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/66aef343-d026-4ecd-94dc-2575070f7edc-certs\") pod \"machine-config-server-hhjwp\" (UID: \"66aef343-d026-4ecd-94dc-2575070f7edc\") " pod="openshift-machine-config-operator/machine-config-server-hhjwp" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.918603 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.920318 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.920515 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/029452cc-db9a-4761-b35d-17e7b11d6f84-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lq9hn\" (UID: \"029452cc-db9a-4761-b35d-17e7b11d6f84\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lq9hn" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.924889 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58g4q\" (UniqueName: \"kubernetes.io/projected/66aef343-d026-4ecd-94dc-2575070f7edc-kube-api-access-58g4q\") pod \"machine-config-server-hhjwp\" (UID: \"66aef343-d026-4ecd-94dc-2575070f7edc\") " pod="openshift-machine-config-operator/machine-config-server-hhjwp" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.925957 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b3d5c817-d25d-4a5e-8878-8080aaafa956-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-cttnd\" (UID: \"b3d5c817-d25d-4a5e-8878-8080aaafa956\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cttnd" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.926308 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d6c199ad-3068-41c9-b8df-8f6b889a8db8-apiservice-cert\") pod \"packageserver-d55dfcdfc-tzc84\" (UID: \"d6c199ad-3068-41c9-b8df-8f6b889a8db8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzc84" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.926909 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/84d43c75-0f92-433c-a555-6d854b8ff0c4-signing-cabundle\") pod \"service-ca-9c57cc56f-9spjq\" (UID: \"84d43c75-0f92-433c-a555-6d854b8ff0c4\") " pod="openshift-service-ca/service-ca-9c57cc56f-9spjq" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.929943 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.930353 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v6x9\" (UniqueName: \"kubernetes.io/projected/84d43c75-0f92-433c-a555-6d854b8ff0c4-kube-api-access-7v6x9\") pod \"service-ca-9c57cc56f-9spjq\" (UID: \"84d43c75-0f92-433c-a555-6d854b8ff0c4\") " pod="openshift-service-ca/service-ca-9c57cc56f-9spjq" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.931103 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3d5c817-d25d-4a5e-8878-8080aaafa956-config\") pod \"kube-apiserver-operator-766d6c64bb-cttnd\" (UID: \"b3d5c817-d25d-4a5e-8878-8080aaafa956\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cttnd" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.931313 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxrgt\" (UniqueName: \"kubernetes.io/projected/637fedff-192e-4220-958e-ee458ad15bf2-kube-api-access-zxrgt\") pod \"service-ca-operator-777779d784-lsg46\" (UID: \"637fedff-192e-4220-958e-ee458ad15bf2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lsg46" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.931720 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcpcz\" (UniqueName: \"kubernetes.io/projected/d6c199ad-3068-41c9-b8df-8f6b889a8db8-kube-api-access-wcpcz\") pod \"packageserver-d55dfcdfc-tzc84\" (UID: \"d6c199ad-3068-41c9-b8df-8f6b889a8db8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzc84" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.931770 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.932268 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2fm4\" (UniqueName: \"kubernetes.io/projected/c69b7653-79c2-4182-b7b2-26aef84a054d-kube-api-access-j2fm4\") pod \"kube-storage-version-migrator-operator-b67b599dd-p7pd8\" (UID: \"c69b7653-79c2-4182-b7b2-26aef84a054d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7pd8" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.939286 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:01 crc kubenswrapper[4903]: E1202 23:00:01.939731 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-02 23:00:02.43971308 +0000 UTC m=+141.148267363 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.940209 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zh8xk" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.941266 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkqjc\" (UniqueName: \"kubernetes.io/projected/029452cc-db9a-4761-b35d-17e7b11d6f84-kube-api-access-dkqjc\") pod \"ingress-operator-5b745b69d9-lq9hn\" (UID: \"029452cc-db9a-4761-b35d-17e7b11d6f84\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lq9hn" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.941739 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6375f12-a03e-438e-96da-76db81f20764-serving-cert\") pod \"etcd-operator-b45778765-xkmcq\" (UID: \"f6375f12-a03e-438e-96da-76db81f20764\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xkmcq" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.941984 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f6375f12-a03e-438e-96da-76db81f20764-etcd-client\") pod \"etcd-operator-b45778765-xkmcq\" (UID: \"f6375f12-a03e-438e-96da-76db81f20764\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xkmcq" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.943156 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.943468 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.944136 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-5ltff\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " pod="openshift-authentication/oauth-openshift-558db77b4-5ltff" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.944137 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc8wq\" (UniqueName: 
\"kubernetes.io/projected/0ca6d54e-b1e8-4be3-8690-7be6ebd279a9-kube-api-access-kc8wq\") pod \"csi-hostpathplugin-n67tl\" (UID: \"0ca6d54e-b1e8-4be3-8690-7be6ebd279a9\") " pod="hostpath-provisioner/csi-hostpathplugin-n67tl" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.965796 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw8gk" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.990028 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lq9hn" Dec 02 23:00:01 crc kubenswrapper[4903]: I1202 23:00:01.992410 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f9gdf" Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.006005 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-svc7d" Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.013101 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2ftx" Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.017815 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5ltff" Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.022995 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lvnkb" event={"ID":"ec790a4a-c562-4035-ba10-9ac0c8baf6c6","Type":"ContainerStarted","Data":"c08cac14a47eec24a1abc1822a6b5a7429896ab4d2297d9b8573661ffd39908e"} Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.023241 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cttnd" Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.039853 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gdr6w" Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.040733 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-pkxw5" event={"ID":"d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745","Type":"ContainerStarted","Data":"86a63502dd568220f5a524ede314d8021a64a5f34e6ad008ff35ddb3b93f2153"} Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.040968 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 23:00:02 crc kubenswrapper[4903]: E1202 23:00:02.041348 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:02.541323619 +0000 UTC m=+141.249877902 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:02 crc kubenswrapper[4903]: E1202 23:00:02.044532 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:02.54452107 +0000 UTC m=+141.253075353 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.044618 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.058106 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411940-vlzpv" Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.063046 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-n67tl" Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.069218 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsq8g" event={"ID":"8309b769-d214-4620-a891-ae8ae36630cd","Type":"ContainerStarted","Data":"1f705248195e244c110e0bdb35c26440b3191b6a2ad2a5086c13c7a087616f2f"} Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.069531 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7pd8" Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.081505 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-lsg46" Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.085882 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vvxf5" Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.086949 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwh9f"] Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.089787 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411925-h29nj" Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.090069 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-kvzf7" event={"ID":"929c87de-7dc4-40bf-9fe9-85229a13dca1","Type":"ContainerStarted","Data":"f7ce19ad3f6d28c5d2ad07fb43277cfc8aeb08e1d6eba5ae81a5288ed92207e6"} Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.090931 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-jr8l4"] Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.097761 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wwsfb" Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.112475 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-m4d4c" Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.124984 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-hhjwp" Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.147393 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 23:00:02 crc kubenswrapper[4903]: E1202 23:00:02.148085 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:02.648056987 +0000 UTC m=+141.356611270 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.218541 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzc84" Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.227357 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-9spjq" Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.230233 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-xkmcq" Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.249819 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:02 crc kubenswrapper[4903]: E1202 23:00:02.250755 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:02.750740993 +0000 UTC m=+141.459295276 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:02 crc kubenswrapper[4903]: W1202 23:00:02.266168 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1c4b26c_f83a_45c7_92e9_da974b729b51.slice/crio-fdfeae958852bdb46f231fefd473b821e3095ee8a6372f942dbd66642c356a3c WatchSource:0}: Error finding container fdfeae958852bdb46f231fefd473b821e3095ee8a6372f942dbd66642c356a3c: Status 404 returned error can't find the container with id fdfeae958852bdb46f231fefd473b821e3095ee8a6372f942dbd66642c356a3c Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.347268 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-tjt6x"] Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.355150 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 23:00:02 crc kubenswrapper[4903]: E1202 23:00:02.355585 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:02.855563941 +0000 UTC m=+141.564118224 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:02 crc kubenswrapper[4903]: W1202 23:00:02.441554 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd131fd77_f36a_4c9a_8578_5b2c62e5d356.slice/crio-be6e2815e91f92633317158e5e37dbb6d4a4a6f7662e148ecbdd15bb5eb50caa WatchSource:0}: Error finding container be6e2815e91f92633317158e5e37dbb6d4a4a6f7662e148ecbdd15bb5eb50caa: Status 404 returned error can't find the container with id be6e2815e91f92633317158e5e37dbb6d4a4a6f7662e148ecbdd15bb5eb50caa Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.457224 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:02 crc kubenswrapper[4903]: E1202 23:00:02.457588 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:02.9575765 +0000 UTC m=+141.666130783 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.500259 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-wdkzh"] Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.537978 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l7hl6"] Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.554109 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kjsgz"] Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.557975 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 23:00:02 crc kubenswrapper[4903]: E1202 23:00:02.558354 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-02 23:00:03.058330817 +0000 UTC m=+141.766885090 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.572964 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-flrnk"] Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.638336 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411925-h29nj"] Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.645242 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411925-h29nj"] Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.663171 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:02 crc kubenswrapper[4903]: E1202 23:00:02.663622 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:03.163608678 +0000 UTC m=+141.872162961 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.728091 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-s6vbc"] Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.765227 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 23:00:02 crc kubenswrapper[4903]: E1202 23:00:02.765412 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:03.26537861 +0000 UTC m=+141.973932893 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.765454 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:02 crc kubenswrapper[4903]: E1202 23:00:02.765953 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:03.265946075 +0000 UTC m=+141.974500358 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.870227 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 23:00:02 crc kubenswrapper[4903]: E1202 23:00:02.870343 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:03.370312053 +0000 UTC m=+142.078866336 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.870529 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:02 crc kubenswrapper[4903]: E1202 23:00:02.871105 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:03.371090273 +0000 UTC m=+142.079644556 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:02 crc kubenswrapper[4903]: I1202 23:00:02.971392 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 23:00:02 crc kubenswrapper[4903]: E1202 23:00:02.971813 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:03.471791648 +0000 UTC m=+142.180345931 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.072381 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:03 crc kubenswrapper[4903]: E1202 23:00:03.072702 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:03.572687598 +0000 UTC m=+142.281241881 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.098834 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l7hl6" event={"ID":"2e426530-7180-403a-9810-6612e19b1110","Type":"ContainerStarted","Data":"72bf2d060853494054c30f35d681b44595a53e7038604bd0597273ba36b92ed6"} Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.101235 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pnzr7" event={"ID":"896cc150-3871-46e2-b1f5-c31c25c54014","Type":"ContainerStarted","Data":"6fe2a4ab72f9039606d0642d08e6686b3942fdb7a81766cb7b30b3f3532dbea4"} Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.103044 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pnzr7" Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.110291 4903 generic.go:334] "Generic (PLEG): container finished" podID="f2fec0cf-8c59-4ec1-ae18-91b0081c60bb" containerID="1f88b5426c43467742fd0d90c45909db8a2a0b1e57eeabd0b586f0cef97b4980" exitCode=0 Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.110576 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jr8l4" event={"ID":"f2fec0cf-8c59-4ec1-ae18-91b0081c60bb","Type":"ContainerDied","Data":"1f88b5426c43467742fd0d90c45909db8a2a0b1e57eeabd0b586f0cef97b4980"} Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.110639 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jr8l4" 
event={"ID":"f2fec0cf-8c59-4ec1-ae18-91b0081c60bb","Type":"ContainerStarted","Data":"b143882dc2b540cc0ba3d7434e7fc5c9fe85e8fc7b5710b7874e36789f18101d"} Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.115794 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-wdkzh" event={"ID":"c964b5fc-f02f-481c-abd3-2f567b9e98e6","Type":"ContainerStarted","Data":"e698189089a19a7b001c7c841a1f7579af8ae5b201ab33244d9e1e03156f0cb7"} Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.120239 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-tjt6x" event={"ID":"d131fd77-f36a-4c9a-8578-5b2c62e5d356","Type":"ContainerStarted","Data":"be6e2815e91f92633317158e5e37dbb6d4a4a6f7662e148ecbdd15bb5eb50caa"} Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.120429 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pnzr7" podStartSLOduration=122.120397964 podStartE2EDuration="2m2.120397964s" podCreationTimestamp="2025-12-02 22:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:03.117738417 +0000 UTC m=+141.826292700" watchObservedRunningTime="2025-12-02 23:00:03.120397964 +0000 UTC m=+141.828952247" Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.139479 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kjsgz" event={"ID":"a8d2e09c-a22c-4581-830c-ad25ff946f4a","Type":"ContainerStarted","Data":"7669a91eab50a9323bbf1f56d9f7cd96d77bc39cbb68a19fdbb7e22aa7c94e96"} Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.148693 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9mlb7" event={"ID":"2c5439fc-734f-4efa-838f-68900d9453ec","Type":"ContainerStarted","Data":"99882b84c258285cb6dbb70a2017e661d6b30f59ae5662a7db17d88fa193b212"} Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.149177 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-9mlb7" Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.152594 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-hhjwp" event={"ID":"66aef343-d026-4ecd-94dc-2575070f7edc","Type":"ContainerStarted","Data":"2cf6f0da7511e7886cf3ca18d6151111dde0a9f96bb78c1bff85e2095da49f21"} Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.156480 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-9mlb7" Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.159318 4903 generic.go:334] "Generic (PLEG): container finished" podID="8309b769-d214-4620-a891-ae8ae36630cd" containerID="b279288bfd454b164398e489bd7cacf73ea76fe7a467e717623f81850869bffe" exitCode=0 Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.159419 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsq8g" event={"ID":"8309b769-d214-4620-a891-ae8ae36630cd","Type":"ContainerDied","Data":"b279288bfd454b164398e489bd7cacf73ea76fe7a467e717623f81850869bffe"} Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.166038 4903 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-9mlb7" podStartSLOduration=122.166019517 podStartE2EDuration="2m2.166019517s" podCreationTimestamp="2025-12-02 22:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:03.161985795 +0000 UTC m=+141.870540078" watchObservedRunningTime="2025-12-02 23:00:03.166019517 +0000 UTC m=+141.874573800" Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.173142 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 23:00:03 crc kubenswrapper[4903]: E1202 23:00:03.173887 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:03.673870555 +0000 UTC m=+142.382424838 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.175360 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-zzfgm" event={"ID":"7c7544e0-8c46-4f13-b8e4-a8aa2071a9f5","Type":"ContainerStarted","Data":"2cd678706f5764567d5e195f1e4447705635272bc65425d9c0f6f0b2c91eb43c"} Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.175676 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-zzfgm" Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.177986 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-pkxw5" event={"ID":"d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745","Type":"ContainerStarted","Data":"a00a77cc36bdda135e7e671b8caf2b5756ce0a30789c1ed311f4f6005243ae67"} Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.182642 4903 patch_prober.go:28] interesting pod/downloads-7954f5f757-zzfgm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.182711 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zzfgm" podUID="7c7544e0-8c46-4f13-b8e4-a8aa2071a9f5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.199915 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwh9f" 
event={"ID":"e1c4b26c-f83a-45c7-92e9-da974b729b51","Type":"ContainerStarted","Data":"e77841e0f04e5b4ddc64a86669f54255ea45e5b938e613d704b13e4099be3376"} Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.200059 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwh9f" event={"ID":"e1c4b26c-f83a-45c7-92e9-da974b729b51","Type":"ContainerStarted","Data":"fdfeae958852bdb46f231fefd473b821e3095ee8a6372f942dbd66642c356a3c"} Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.222115 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pnzr7" Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.222477 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lvnkb" event={"ID":"ec790a4a-c562-4035-ba10-9ac0c8baf6c6","Type":"ContainerStarted","Data":"897ea576191cbcf63fcd1952462eeab4367e3f87e59833b2657ee150dc0123f7"} Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.222564 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lvnkb" event={"ID":"ec790a4a-c562-4035-ba10-9ac0c8baf6c6","Type":"ContainerStarted","Data":"51a9193ad2176d1a936e9d963c9b88b681a2d9ab1a575b61a721466eb4b541a5"} Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.237297 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m4vjn" event={"ID":"da60db71-3425-41be-8c51-99e7326f559f","Type":"ContainerStarted","Data":"00dda8c9c2595f5eb79d2873f8a6828354bdf98ca810d2172605562aa4b93bcb"} Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.258634 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-flrnk" event={"ID":"d2d0b975-a970-41c7-a703-d2771bc3fcc8","Type":"ContainerStarted","Data":"1bdd827d9bc886b9a4ab9812a7be7f194e216d8fc6458781ea01bc90b138065a"} Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.259046 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-kvzf7" Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.277308 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:03 crc kubenswrapper[4903]: E1202 23:00:03.280115 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:03.780101541 +0000 UTC m=+142.488655824 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.324772 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw8gk"] Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.333752 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-blqhv"] Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.361192 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwh9f" podStartSLOduration=123.36117745 podStartE2EDuration="2m3.36117745s" podCreationTimestamp="2025-12-02 22:58:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:03.359330323 +0000 UTC m=+142.067884606" watchObservedRunningTime="2025-12-02 23:00:03.36117745 +0000 UTC m=+142.069731733" Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.385236 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-lvnkb" podStartSLOduration=122.385217948 podStartE2EDuration="2m2.385217948s" podCreationTimestamp="2025-12-02 22:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:03.384697354 +0000 UTC m=+142.093251647" watchObservedRunningTime="2025-12-02 23:00:03.385217948 +0000 UTC m=+142.093772231" Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.387482 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 23:00:03 crc kubenswrapper[4903]: E1202 23:00:03.390753 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:03.890728867 +0000 UTC m=+142.599283150 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.425988 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-kvzf7" podStartSLOduration=122.425961337 podStartE2EDuration="2m2.425961337s" podCreationTimestamp="2025-12-02 22:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:03.40191934 +0000 UTC m=+142.110473623" watchObservedRunningTime="2025-12-02 23:00:03.425961337 +0000 UTC m=+142.134515620" Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.433305 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-zzfgm" podStartSLOduration=122.433280482 podStartE2EDuration="2m2.433280482s" podCreationTimestamp="2025-12-02 22:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:03.426150742 +0000 UTC m=+142.134705035" watchObservedRunningTime="2025-12-02 23:00:03.433280482 +0000 UTC m=+142.141834765" Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.456671 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvzlq"] Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.480939 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ffb4r"] Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.484504 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-pkxw5" podStartSLOduration=122.484486467 podStartE2EDuration="2m2.484486467s" podCreationTimestamp="2025-12-02 22:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:03.483819989 +0000 UTC m=+142.192374272" watchObservedRunningTime="2025-12-02 23:00:03.484486467 +0000 UTC m=+142.193040750" Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.487731 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lgdbg"] Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.494199 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-pkxw5" Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.494279 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-kvzf7" Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.503840 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:03 crc kubenswrapper[4903]: E1202 23:00:03.504304 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:04.004284407 +0000 UTC m=+142.712838690 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.516310 4903 patch_prober.go:28] interesting pod/router-default-5444994796-pkxw5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 23:00:03 crc kubenswrapper[4903]: [-]has-synced failed: reason withheld Dec 02 23:00:03 crc kubenswrapper[4903]: [+]process-running ok Dec 02 23:00:03 crc kubenswrapper[4903]: healthz check failed Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.516364 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pkxw5" podUID="d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.593253 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cttnd"] Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.604482 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 23:00:03 crc kubenswrapper[4903]: E1202 23:00:03.604935 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:04.104914721 +0000 UTC m=+142.813469004 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.653227 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a385a5a-8a80-4d4d-8d00-b2543dfaf3fc" path="/var/lib/kubelet/pods/9a385a5a-8a80-4d4d-8d00-b2543dfaf3fc/volumes" Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.654584 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tbhzh"] Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.705759 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:03 crc kubenswrapper[4903]: E1202 23:00:03.706100 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:04.206086768 +0000 UTC m=+142.914641051 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.728822 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-svc7d"] Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.736676 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zh8xk"] Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.788359 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gdr6w"] Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.790607 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lq9hn"] Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.808264 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 23:00:03 crc kubenswrapper[4903]: E1202 23:00:03.808483 4903 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:04.308443845 +0000 UTC m=+143.016998138 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.837893 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-lsg46"] Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.856106 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f9gdf"] Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.896571 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-s2ftx"] Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.919639 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:03 crc kubenswrapper[4903]: E1202 23:00:03.920055 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:04.420042456 +0000 UTC m=+143.128596729 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.928953 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wwsfb"] Dec 02 23:00:03 crc kubenswrapper[4903]: W1202 23:00:03.936087 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea93257f_fe2e_4062_9b5d_3cd6f53f6fdc.slice/crio-bf74d500f74edd1dbc077124db9aa2f7746138db4d52e87c29ed2af77d46da6f WatchSource:0}: Error finding container bf74d500f74edd1dbc077124db9aa2f7746138db4d52e87c29ed2af77d46da6f: Status 404 returned error can't find the container with id bf74d500f74edd1dbc077124db9aa2f7746138db4d52e87c29ed2af77d46da6f Dec 02 23:00:03 crc kubenswrapper[4903]: I1202 23:00:03.938675 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5ltff"] Dec 02 23:00:03 crc kubenswrapper[4903]: W1202 23:00:03.997273 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c0a08a9_3732_4ba8_b716_3d3e3781e2ca.slice/crio-ccd568a078d2262c39bde7cf78d2f01dab50ee540d1bb77735d463667035a2a5 WatchSource:0}: Error finding container ccd568a078d2262c39bde7cf78d2f01dab50ee540d1bb77735d463667035a2a5: Status 404 returned error can't find the container with id ccd568a078d2262c39bde7cf78d2f01dab50ee540d1bb77735d463667035a2a5 Dec 02 23:00:04 crc kubenswrapper[4903]: W1202 23:00:04.002388 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c5aaf2f_377b_4d66_b9b6_671f831e0af1.slice/crio-af14b9d2f0819e8119454193efda106b0e52add0cbdbb4e93bf86a92c007f4c2 WatchSource:0}: Error finding container af14b9d2f0819e8119454193efda106b0e52add0cbdbb4e93bf86a92c007f4c2: Status 404 returned error can't find the container with id af14b9d2f0819e8119454193efda106b0e52add0cbdbb4e93bf86a92c007f4c2 Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.021193 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 23:00:04 crc kubenswrapper[4903]: E1202 23:00:04.021538 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:04.52151889 +0000 UTC m=+143.230073173 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:04 crc kubenswrapper[4903]: W1202 23:00:04.030416 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f7e55ca_a590_4992_8402_337ef0a42dcc.slice/crio-27bb9347ae381644d4217a7bb0d920810f0525aa92147e117d69e04e3b26ba14 WatchSource:0}: Error finding container 27bb9347ae381644d4217a7bb0d920810f0525aa92147e117d69e04e3b26ba14: Status 404 returned error can't find the container with id 27bb9347ae381644d4217a7bb0d920810f0525aa92147e117d69e04e3b26ba14 Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.049934 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vvxf5"] Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.083621 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xkmcq"] Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.104451 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-m4d4c"] Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.139629 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:04 crc kubenswrapper[4903]: E1202 23:00:04.140009 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:04.639990825 +0000 UTC m=+143.348545108 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.151875 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7pd8"] Dec 02 23:00:04 crc kubenswrapper[4903]: W1202 23:00:04.178027 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6375f12_a03e_438e_96da_76db81f20764.slice/crio-838988d027bf9ea2060c1749402759b6a9d596d25b269307b7d425f0cf4d82d7 WatchSource:0}: Error finding container 838988d027bf9ea2060c1749402759b6a9d596d25b269307b7d425f0cf4d82d7: Status 404 returned error can't find the container with id 838988d027bf9ea2060c1749402759b6a9d596d25b269307b7d425f0cf4d82d7 Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.203522 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzc84"] Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.214486 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-n67tl"] Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.242162 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411940-vlzpv"] Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.242957 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 23:00:04 crc kubenswrapper[4903]: E1202 23:00:04.243428 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:04.743407629 +0000 UTC m=+143.451961912 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.297626 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-9spjq"] Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.330373 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-s6vbc" event={"ID":"245513eb-a40a-4eff-80ed-c7070eb94f8a","Type":"ContainerStarted","Data":"41f0d57edff515f64f0f8f5cab2540f7987bdb8caceb95ff8266340117dee931"} Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.330438 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-s6vbc" event={"ID":"245513eb-a40a-4eff-80ed-c7070eb94f8a","Type":"ContainerStarted","Data":"40b0dc93d41b654a228d6386f50e3551c905f383a4ea48081dfa0c03978e81b9"} Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.335054 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l7hl6" event={"ID":"2e426530-7180-403a-9810-6612e19b1110","Type":"ContainerStarted","Data":"12e9d872ec6e3b478a0308c152e730898ad571d0f0a36e1808921b5842fa222f"} Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.339427 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kjsgz" event={"ID":"a8d2e09c-a22c-4581-830c-ad25ff946f4a","Type":"ContainerStarted","Data":"57d15d480afa3af3bbd88a09178c3919b9fad911c9aa564e8808f8d1cb2de73c"} Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.341116 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsq8g" event={"ID":"8309b769-d214-4620-a891-ae8ae36630cd","Type":"ContainerStarted","Data":"8af9cec3c256df959edcd12acf0bf70d48caa21b3c50df7e6dae28410ebe8cfc"} Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.348010 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:04 crc kubenswrapper[4903]: E1202 23:00:04.348386 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:04.848368372 +0000 UTC m=+143.556922655 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.350712 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-s6vbc" podStartSLOduration=124.35069776 podStartE2EDuration="2m4.35069776s" podCreationTimestamp="2025-12-02 22:58:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:04.349166652 +0000 UTC m=+143.057720955" watchObservedRunningTime="2025-12-02 23:00:04.35069776 +0000 UTC m=+143.059252043" Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.351335 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m4vjn" event={"ID":"da60db71-3425-41be-8c51-99e7326f559f","Type":"ContainerStarted","Data":"bcb248729b216a899932115849ccadc825cf182198a1aee3f0e119c03825f242"} Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.364236 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-lsg46" event={"ID":"637fedff-192e-4220-958e-ee458ad15bf2","Type":"ContainerStarted","Data":"5a10b78fbf0a089ecc5290577e44d791e60c286f10eefff3ed9984f89478b256"} Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.385036 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsq8g" podStartSLOduration=123.385016138 podStartE2EDuration="2m3.385016138s" podCreationTimestamp="2025-12-02 22:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:04.379396626 +0000 UTC m=+143.087950909" watchObservedRunningTime="2025-12-02 23:00:04.385016138 +0000 UTC m=+143.093570421" Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.404529 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvzlq" event={"ID":"e2656003-d0f9-4d65-8744-1e394226a359","Type":"ContainerStarted","Data":"07cbd7cbb4f9a104bdf85016972f1f867b6b1b5b9f804604a485ab60c009bb4e"} Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.404709 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvzlq" event={"ID":"e2656003-d0f9-4d65-8744-1e394226a359","Type":"ContainerStarted","Data":"5e9d7aa94d126872ef0279f8a5ddbdb006e563d158b9a64d3ac915e40d218aaa"} Dec 02 23:00:04 crc kubenswrapper[4903]: W1202 23:00:04.424084 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc69b7653_79c2_4182_b7b2_26aef84a054d.slice/crio-064b91e33ca1a90b4ec1a11f6c39402edf9684ee11d6e39cb2c9807edeafa095 WatchSource:0}: Error finding container 064b91e33ca1a90b4ec1a11f6c39402edf9684ee11d6e39cb2c9807edeafa095: Status 404 returned error can't find the container with id 
064b91e33ca1a90b4ec1a11f6c39402edf9684ee11d6e39cb2c9807edeafa095 Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.429721 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2ftx" event={"ID":"b2342a1a-f6f4-4b83-8de3-444e3b51642c","Type":"ContainerStarted","Data":"c90cb454ba9acbb67cd59ab609aa1d927727088a7351c9dfca01d0e685cf5b3d"} Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.448049 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-flrnk" event={"ID":"d2d0b975-a970-41c7-a703-d2771bc3fcc8","Type":"ContainerStarted","Data":"171e38b96c3f8af75dfc7fe20794e80f925e7c4076f35d744beaa2b9ee189ec7"} Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.449035 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-flrnk" Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.449443 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.449814 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l7hl6" podStartSLOduration=123.449798666 podStartE2EDuration="2m3.449798666s" podCreationTimestamp="2025-12-02 22:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:04.403069095 +0000 UTC m=+143.111623378" watchObservedRunningTime="2025-12-02 23:00:04.449798666 +0000 UTC m=+143.158352949" Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.450145 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m4vjn" podStartSLOduration=125.450142224 podStartE2EDuration="2m5.450142224s" podCreationTimestamp="2025-12-02 22:57:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:04.44799528 +0000 UTC m=+143.156549563" watchObservedRunningTime="2025-12-02 23:00:04.450142224 +0000 UTC m=+143.158696507" Dec 02 23:00:04 crc kubenswrapper[4903]: E1202 23:00:04.450957 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:04.950936384 +0000 UTC m=+143.659490667 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.470492 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lq9hn" event={"ID":"029452cc-db9a-4761-b35d-17e7b11d6f84","Type":"ContainerStarted","Data":"e8982e9468cfe0fa66420f756a3ab6c8488037f813d32e5fc051f14538ddaf4b"} Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.477495 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-flrnk" Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.479373 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f9gdf" event={"ID":"1c0a08a9-3732-4ba8-b716-3d3e3781e2ca","Type":"ContainerStarted","Data":"ccd568a078d2262c39bde7cf78d2f01dab50ee540d1bb77735d463667035a2a5"} Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.483977 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wvzlq" podStartSLOduration=123.48395657 podStartE2EDuration="2m3.48395657s" podCreationTimestamp="2025-12-02 22:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:04.482270856 +0000 UTC m=+143.190825139" watchObservedRunningTime="2025-12-02 23:00:04.48395657 +0000 UTC m=+143.192510853" Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.502406 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jr8l4" event={"ID":"f2fec0cf-8c59-4ec1-ae18-91b0081c60bb","Type":"ContainerStarted","Data":"7c73249d07f18ae075d46b6c278ee27f5b578474a0d66163cc9a9b045f745d7c"} Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.504312 4903 patch_prober.go:28] interesting pod/router-default-5444994796-pkxw5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 23:00:04 crc kubenswrapper[4903]: [-]has-synced failed: reason withheld Dec 02 23:00:04 crc kubenswrapper[4903]: [+]process-running ok Dec 02 23:00:04 crc kubenswrapper[4903]: healthz check failed Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.504496 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pkxw5" podUID="d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.505742 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jr8l4" Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.510754 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-flrnk" podStartSLOduration=123.510736146 podStartE2EDuration="2m3.510736146s" podCreationTimestamp="2025-12-02 22:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:04.510727596 +0000 UTC m=+143.219281879" watchObservedRunningTime="2025-12-02 23:00:04.510736146 +0000 UTC m=+143.219290419" Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.518833 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-wdkzh" event={"ID":"c964b5fc-f02f-481c-abd3-2f567b9e98e6","Type":"ContainerStarted","Data":"74dc01f11578d992cc53e5d28d109393deec9967b76072b7a0dd15ad456be704"} Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.551299 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:04 crc kubenswrapper[4903]: E1202 23:00:04.552973 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:05.052953363 +0000 UTC m=+143.761507646 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.575163 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-hhjwp" event={"ID":"66aef343-d026-4ecd-94dc-2575070f7edc","Type":"ContainerStarted","Data":"7e5acfe2e81a0556b02f16906e3e4cbc294b5cd0a2585ad2021d97f3a6c3bab4"} Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.584377 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tbhzh" event={"ID":"c3af2ceb-1e23-4887-b647-06b9e0466f1a","Type":"ContainerStarted","Data":"e0b6456ec5d8b4d47bcd8fe448611b373b0bd9250b42647a56360c3f49523f97"} Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.604405 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-svc7d" event={"ID":"bfcecdcd-dc1a-4dbb-9225-2bd922502fcf","Type":"ContainerStarted","Data":"02dcd88e273d7b787948ddda4cbd1ef89a6133e78c05d96cd6ab2640cd96ed37"} Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.610515 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jr8l4" podStartSLOduration=123.610498408 podStartE2EDuration="2m3.610498408s" podCreationTimestamp="2025-12-02 22:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-02 23:00:04.548429699 +0000 UTC m=+143.256983992" watchObservedRunningTime="2025-12-02 23:00:04.610498408 +0000 UTC m=+143.319052691" Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.649061 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cttnd" event={"ID":"b3d5c817-d25d-4a5e-8878-8080aaafa956","Type":"ContainerStarted","Data":"ad6e397cae95f03ba64cc11823799c4849854c22203a0bcf7cb690f529e205fc"} Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.655093 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 23:00:04 crc kubenswrapper[4903]: E1202 23:00:04.656506 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:05.15648251 +0000 UTC m=+143.865036783 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.657314 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vvxf5" event={"ID":"4b25a40b-8bba-423f-b3fa-5b58e3d18423","Type":"ContainerStarted","Data":"2e94ce108d3d9702a07135e7120271834496b55dae4f58b23fe8159975faf6e3"} Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.667870 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gdr6w" event={"ID":"ea93257f-fe2e-4062-9b5d-3cd6f53f6fdc","Type":"ContainerStarted","Data":"bf74d500f74edd1dbc077124db9aa2f7746138db4d52e87c29ed2af77d46da6f"} Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.675103 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wwsfb" event={"ID":"3f7e55ca-a590-4992-8402-337ef0a42dcc","Type":"ContainerStarted","Data":"27bb9347ae381644d4217a7bb0d920810f0525aa92147e117d69e04e3b26ba14"} Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.685522 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" event={"ID":"46bb090b-9216-49d5-91d7-43cf3ee3bf4a","Type":"ContainerStarted","Data":"2f8e449a276a047be2fa81c107e25201f080a89739fcd206aba53fbaaed7df9b"} Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.702637 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5ltff" event={"ID":"8c5aaf2f-377b-4d66-b9b6-671f831e0af1","Type":"ContainerStarted","Data":"af14b9d2f0819e8119454193efda106b0e52add0cbdbb4e93bf86a92c007f4c2"} Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.711818 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/machine-config-server-hhjwp" podStartSLOduration=7.711792018 podStartE2EDuration="7.711792018s" podCreationTimestamp="2025-12-02 22:59:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:04.61217037 +0000 UTC m=+143.320724673" watchObservedRunningTime="2025-12-02 23:00:04.711792018 +0000 UTC m=+143.420346301" Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.713369 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-tjt6x" event={"ID":"d131fd77-f36a-4c9a-8578-5b2c62e5d356","Type":"ContainerStarted","Data":"5cc31766fc632e6393c370a36959eda5aa888d27fc1301951351c6ec5aad1f48"} Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.731131 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lgdbg" event={"ID":"0554732d-fd39-49f9-a98a-8daab6de9795","Type":"ContainerStarted","Data":"44ff00e47b67da6f61a4bed7d97dbfdaa6e9eff82b17aa0eb73238294fd79c3e"} Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.751106 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zh8xk" event={"ID":"40098fa6-978c-4d83-920e-ca922d2fbefb","Type":"ContainerStarted","Data":"79ce099bc73f91ca04afbab9fd73384ff20ebf1749cef79f8080925e93995294"} Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.756873 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:04 crc kubenswrapper[4903]: E1202 23:00:04.758437 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:05.258421557 +0000 UTC m=+143.966975840 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.773886 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-xkmcq" event={"ID":"f6375f12-a03e-438e-96da-76db81f20764","Type":"ContainerStarted","Data":"838988d027bf9ea2060c1749402759b6a9d596d25b269307b7d425f0cf4d82d7"} Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.789574 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-blqhv" event={"ID":"4d7c261e-3e52-4595-be2a-23f9f79fa197","Type":"ContainerStarted","Data":"fd9e646852d442db8eea3cd69d3a400128ea99d239d2586168822138abced99f"} Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.789669 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-blqhv" event={"ID":"4d7c261e-3e52-4595-be2a-23f9f79fa197","Type":"ContainerStarted","Data":"e5b1c450cf3772ab87df34ccc58d8303911cafdc2dd563bfeb8d6a0833265cf0"} Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.853381 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-blqhv" podStartSLOduration=123.853360166 podStartE2EDuration="2m3.853360166s" podCreationTimestamp="2025-12-02 22:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:04.851751945 +0000 UTC m=+143.560306218" watchObservedRunningTime="2025-12-02 23:00:04.853360166 +0000 UTC m=+143.561914449" Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.853998 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-tjt6x" podStartSLOduration=123.853992872 podStartE2EDuration="2m3.853992872s" podCreationTimestamp="2025-12-02 22:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:04.744102564 +0000 UTC m=+143.452656847" watchObservedRunningTime="2025-12-02 23:00:04.853992872 +0000 UTC m=+143.562547155" Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.862490 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 23:00:04 crc kubenswrapper[4903]: E1202 23:00:04.863890 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:05.363865272 +0000 UTC m=+144.072419545 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.869491 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw8gk" event={"ID":"79b12cf8-01f4-4c3d-93ba-52ba4eaff6ec","Type":"ContainerStarted","Data":"d6e77cbf2f0b4e160fb02a43b73b0435a5ad7c7bde3718a931defadd2b040ddc"} Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.869540 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw8gk" event={"ID":"79b12cf8-01f4-4c3d-93ba-52ba4eaff6ec","Type":"ContainerStarted","Data":"904a8db8a5de9030a06bf2d3db18b09beb3747e9cc2dcc177db9de7540e5eb15"} Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.873782 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw8gk" Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.882980 4903 patch_prober.go:28] interesting pod/downloads-7954f5f757-zzfgm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.883070 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zzfgm" podUID="7c7544e0-8c46-4f13-b8e4-a8aa2071a9f5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.910326 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw8gk" podStartSLOduration=123.910302055 podStartE2EDuration="2m3.910302055s" podCreationTimestamp="2025-12-02 22:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:04.908251534 +0000 UTC m=+143.616805817" watchObservedRunningTime="2025-12-02 23:00:04.910302055 +0000 UTC m=+143.618856338" Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.963089 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw8gk" Dec 02 23:00:04 crc kubenswrapper[4903]: I1202 23:00:04.965294 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:04 crc kubenswrapper[4903]: E1202 23:00:04.965634 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-02 23:00:05.465620164 +0000 UTC m=+144.174174447 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 23:00:05 crc kubenswrapper[4903]: I1202 23:00:05.070124 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 23:00:05 crc kubenswrapper[4903]: E1202 23:00:05.070333 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:05.570298539 +0000 UTC m=+144.278852822 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 23:00:05 crc kubenswrapper[4903]: I1202 23:00:05.070603 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz"
Dec 02 23:00:05 crc kubenswrapper[4903]: E1202 23:00:05.071077 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:05.571062089 +0000 UTC m=+144.279616372 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 23:00:05 crc kubenswrapper[4903]: I1202 23:00:05.178316 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 23:00:05 crc kubenswrapper[4903]: E1202 23:00:05.178999 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:05.678980977 +0000 UTC m=+144.387535260 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 23:00:05 crc kubenswrapper[4903]: I1202 23:00:05.280351 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz"
Dec 02 23:00:05 crc kubenswrapper[4903]: E1202 23:00:05.280797 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:05.78078275 +0000 UTC m=+144.489337033 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 23:00:05 crc kubenswrapper[4903]: I1202 23:00:05.383574 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 23:00:05 crc kubenswrapper[4903]: E1202 23:00:05.384019 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:05.884001158 +0000 UTC m=+144.592555441 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 23:00:05 crc kubenswrapper[4903]: I1202 23:00:05.384355 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff"
Dec 02 23:00:05 crc kubenswrapper[4903]: I1202 23:00:05.486463 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz"
Dec 02 23:00:05 crc kubenswrapper[4903]: E1202 23:00:05.492464 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:05.987508295 +0000 UTC m=+144.696062578 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 23:00:05 crc kubenswrapper[4903]: I1202 23:00:05.506693 4903 patch_prober.go:28] interesting pod/router-default-5444994796-pkxw5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 02 23:00:05 crc kubenswrapper[4903]: [-]has-synced failed: reason withheld
Dec 02 23:00:05 crc kubenswrapper[4903]: [+]process-running ok
Dec 02 23:00:05 crc kubenswrapper[4903]: healthz check failed
Dec 02 23:00:05 crc kubenswrapper[4903]: I1202 23:00:05.506767 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pkxw5" podUID="d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 02 23:00:05 crc kubenswrapper[4903]: I1202 23:00:05.596421 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 23:00:05 crc kubenswrapper[4903]: E1202 23:00:05.596958 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:06.096940321 +0000 UTC m=+144.805494604 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 23:00:05 crc kubenswrapper[4903]: I1202 23:00:05.703836 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz"
Dec 02 23:00:05 crc kubenswrapper[4903]: E1202 23:00:05.704256 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:06.204237273 +0000 UTC m=+144.912791556 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 23:00:05 crc kubenswrapper[4903]: I1202 23:00:05.807464 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 23:00:05 crc kubenswrapper[4903]: E1202 23:00:05.808199 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:06.30817533 +0000 UTC m=+145.016729613 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 23:00:05 crc kubenswrapper[4903]: I1202 23:00:05.892710 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kjsgz" event={"ID":"a8d2e09c-a22c-4581-830c-ad25ff946f4a","Type":"ContainerStarted","Data":"246ec97db86cdc704d3872471e32ac8a89ec2881e4d85cd6208ddd21b3d62ebc"}
Dec 02 23:00:05 crc kubenswrapper[4903]: I1202 23:00:05.912347 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz"
Dec 02 23:00:05 crc kubenswrapper[4903]: E1202 23:00:05.912759 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:06.412742643 +0000 UTC m=+145.121296926 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 23:00:05 crc kubenswrapper[4903]: I1202 23:00:05.922952 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f9gdf" event={"ID":"1c0a08a9-3732-4ba8-b716-3d3e3781e2ca","Type":"ContainerStarted","Data":"0729a21392db06f0c8818684461c062dce915f2fa2767ef2234ab31e6b45239d"}
Dec 02 23:00:05 crc kubenswrapper[4903]: I1202 23:00:05.961886 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-svc7d" event={"ID":"bfcecdcd-dc1a-4dbb-9225-2bd922502fcf","Type":"ContainerStarted","Data":"8c727601937422e85d88e2111b1ea509b5f4a5eeba35a581c6d3da54191de945"}
Dec 02 23:00:05 crc kubenswrapper[4903]: I1202 23:00:05.961954 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-svc7d" event={"ID":"bfcecdcd-dc1a-4dbb-9225-2bd922502fcf","Type":"ContainerStarted","Data":"6318a5e7a1d64af2d465bce017d9099031452223a93715545503b8992efb840e"}
Dec 02 23:00:05 crc kubenswrapper[4903]: I1202 23:00:05.975234 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kjsgz" podStartSLOduration=124.975203722 podStartE2EDuration="2m4.975203722s" podCreationTimestamp="2025-12-02 22:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:05.932206355 +0000 UTC m=+144.640760638" watchObservedRunningTime="2025-12-02 23:00:05.975203722 +0000 UTC m=+144.683758015"
Dec 02 23:00:05 crc kubenswrapper[4903]: I1202 23:00:05.981665 4903 generic.go:334] "Generic (PLEG): container finished" podID="46bb090b-9216-49d5-91d7-43cf3ee3bf4a" containerID="cba84842bc10db1388ecf6fab20338c4402907ebe611d8c8c5172c63298bd065" exitCode=0
Dec 02 23:00:05 crc kubenswrapper[4903]: I1202 23:00:05.983532 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" event={"ID":"46bb090b-9216-49d5-91d7-43cf3ee3bf4a","Type":"ContainerDied","Data":"cba84842bc10db1388ecf6fab20338c4402907ebe611d8c8c5172c63298bd065"}
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.000146 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f9gdf" podStartSLOduration=125.000120021 podStartE2EDuration="2m5.000120021s" podCreationTimestamp="2025-12-02 22:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:05.994599503 +0000 UTC m=+144.703153786" watchObservedRunningTime="2025-12-02 23:00:06.000120021 +0000 UTC m=+144.708674314"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.029938 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 23:00:06 crc kubenswrapper[4903]: E1202 23:00:06.031467 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:06.531448164 +0000 UTC m=+145.240002447 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.034944 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-krzz4"]
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.047745 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-krzz4"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.053023 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.056672 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzc84" event={"ID":"d6c199ad-3068-41c9-b8df-8f6b889a8db8","Type":"ContainerStarted","Data":"e16ea2e07df4465da2e134660e0121e291a79b2126bfe9594f68a9ca2b1cc2b6"}
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.057458 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzc84"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.060930 4903 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-tzc84 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:5443/healthz\": dial tcp 10.217.0.22:5443: connect: connection refused" start-of-body=
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.060981 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzc84" podUID="d6c199ad-3068-41c9-b8df-8f6b889a8db8" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.22:5443/healthz\": dial tcp 10.217.0.22:5443: connect: connection refused"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.062988 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-krzz4"]
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.078885 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lq9hn" event={"ID":"029452cc-db9a-4761-b35d-17e7b11d6f84","Type":"ContainerStarted","Data":"ffe7a56c5ce0de5f01ffa2355b276abcd8dc95e733557b250903b96d05373384"}
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.095466 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-n67tl" event={"ID":"0ca6d54e-b1e8-4be3-8690-7be6ebd279a9","Type":"ContainerStarted","Data":"6c833953fac377630cfa4445ede38c91fb60ce5a6b61678c770f63b92c78a451"}
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.132526 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4aae2be-27f2-43df-a96d-9d2fbc198a6f-catalog-content\") pod \"certified-operators-krzz4\" (UID: \"e4aae2be-27f2-43df-a96d-9d2fbc198a6f\") " pod="openshift-marketplace/certified-operators-krzz4"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.132702 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4aae2be-27f2-43df-a96d-9d2fbc198a6f-utilities\") pod \"certified-operators-krzz4\" (UID: \"e4aae2be-27f2-43df-a96d-9d2fbc198a6f\") " pod="openshift-marketplace/certified-operators-krzz4"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.132742 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.132818 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs6fx\" (UniqueName: \"kubernetes.io/projected/e4aae2be-27f2-43df-a96d-9d2fbc198a6f-kube-api-access-qs6fx\") pod \"certified-operators-krzz4\" (UID: \"e4aae2be-27f2-43df-a96d-9d2fbc198a6f\") " pod="openshift-marketplace/certified-operators-krzz4"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.140200 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-svc7d" podStartSLOduration=125.140178271 podStartE2EDuration="2m5.140178271s" podCreationTimestamp="2025-12-02 22:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:06.122606648 +0000 UTC m=+144.831160931" watchObservedRunningTime="2025-12-02 23:00:06.140178271 +0000 UTC m=+144.848732554"
Dec 02 23:00:06 crc kubenswrapper[4903]: E1202 23:00:06.141380 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:06.641366842 +0000 UTC m=+145.349921205 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.198488 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cttnd" event={"ID":"b3d5c817-d25d-4a5e-8878-8080aaafa956","Type":"ContainerStarted","Data":"ebd96979a4fa019f5a3d81bd01d75733610348bc8e610775674c6d12424225b8"}
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.238143 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.238408 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4aae2be-27f2-43df-a96d-9d2fbc198a6f-catalog-content\") pod \"certified-operators-krzz4\" (UID: \"e4aae2be-27f2-43df-a96d-9d2fbc198a6f\") " pod="openshift-marketplace/certified-operators-krzz4"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.238475 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4aae2be-27f2-43df-a96d-9d2fbc198a6f-utilities\") pod \"certified-operators-krzz4\" (UID: \"e4aae2be-27f2-43df-a96d-9d2fbc198a6f\") " pod="openshift-marketplace/certified-operators-krzz4"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.238529 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qs6fx\" (UniqueName: \"kubernetes.io/projected/e4aae2be-27f2-43df-a96d-9d2fbc198a6f-kube-api-access-qs6fx\") pod \"certified-operators-krzz4\" (UID: \"e4aae2be-27f2-43df-a96d-9d2fbc198a6f\") " pod="openshift-marketplace/certified-operators-krzz4"
Dec 02 23:00:06 crc kubenswrapper[4903]: E1202 23:00:06.238942 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:06.738921618 +0000 UTC m=+145.447475901 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.239293 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4aae2be-27f2-43df-a96d-9d2fbc198a6f-catalog-content\") pod \"certified-operators-krzz4\" (UID: \"e4aae2be-27f2-43df-a96d-9d2fbc198a6f\") " pod="openshift-marketplace/certified-operators-krzz4"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.239540 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4aae2be-27f2-43df-a96d-9d2fbc198a6f-utilities\") pod \"certified-operators-krzz4\" (UID: \"e4aae2be-27f2-43df-a96d-9d2fbc198a6f\") " pod="openshift-marketplace/certified-operators-krzz4"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.240955 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzc84" podStartSLOduration=125.240938588 podStartE2EDuration="2m5.240938588s" podCreationTimestamp="2025-12-02 22:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:06.170780065 +0000 UTC m=+144.879334348" watchObservedRunningTime="2025-12-02 23:00:06.240938588 +0000 UTC m=+144.949492871"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.246896 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-m4d4c" event={"ID":"c186ee58-2d5d-4c2e-aa90-912192529da9","Type":"ContainerStarted","Data":"ec7f337f45ee408d32e759b874352d0a6c231040de547702991e58b92f11cfa1"}
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.246947 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-m4d4c" event={"ID":"c186ee58-2d5d-4c2e-aa90-912192529da9","Type":"ContainerStarted","Data":"4bd05fdb209e551f6f2165361a73ef96e83f5f17235bbee2217ba0b39ea3eb69"}
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.288241 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs6fx\" (UniqueName: \"kubernetes.io/projected/e4aae2be-27f2-43df-a96d-9d2fbc198a6f-kube-api-access-qs6fx\") pod \"certified-operators-krzz4\" (UID: \"e4aae2be-27f2-43df-a96d-9d2fbc198a6f\") " pod="openshift-marketplace/certified-operators-krzz4"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.300906 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tbhzh" event={"ID":"c3af2ceb-1e23-4887-b647-06b9e0466f1a","Type":"ContainerStarted","Data":"71f57fe0d9582d2e2690565ff59f9ce3521ee1d3d438b80b5601f20a78931641"}
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.340055 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.344043 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2ftx" event={"ID":"b2342a1a-f6f4-4b83-8de3-444e3b51642c","Type":"ContainerStarted","Data":"38a8138eb4f9a26790f4ea3ddf5aaf6275c19a20818633591504639bb6c851ae"}
Dec 02 23:00:06 crc kubenswrapper[4903]: E1202 23:00:06.345839 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:06.845823349 +0000 UTC m=+145.554377632 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.353426 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lgdbg" event={"ID":"0554732d-fd39-49f9-a98a-8daab6de9795","Type":"ContainerStarted","Data":"9f0183ce134c5f71b09886279a7fda8460a3df918e8fb9d11a89b361a318b818"}
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.355992 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cttnd" podStartSLOduration=125.355979157 podStartE2EDuration="2m5.355979157s" podCreationTimestamp="2025-12-02 22:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:06.310010405 +0000 UTC m=+145.018564688" watchObservedRunningTime="2025-12-02 23:00:06.355979157 +0000 UTC m=+145.064533440"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.390834 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411940-vlzpv" event={"ID":"dd1b463e-15a8-425d-872e-d1f9683747c2","Type":"ContainerStarted","Data":"6f081c20905982f479963eebae3e8a9cad72aace1dd769dfa88e3bf606f82a7d"}
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.390884 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411940-vlzpv" event={"ID":"dd1b463e-15a8-425d-872e-d1f9683747c2","Type":"ContainerStarted","Data":"e294fa2b8057e6dc8fa9cb2a6fc6be5ec9241db486727fcf48e26500304adb9e"}
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.406562 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-krzz4"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.425127 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tbhzh" podStartSLOduration=125.425107484 podStartE2EDuration="2m5.425107484s" podCreationTimestamp="2025-12-02 22:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:06.356546481 +0000 UTC m=+145.065100764" watchObservedRunningTime="2025-12-02 23:00:06.425107484 +0000 UTC m=+145.133661777"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.426092 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-m4d4c" podStartSLOduration=9.426085659 podStartE2EDuration="9.426085659s" podCreationTimestamp="2025-12-02 22:59:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:06.424262592 +0000 UTC m=+145.132816875" watchObservedRunningTime="2025-12-02 23:00:06.426085659 +0000 UTC m=+145.134639942"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.440808 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 23:00:06 crc kubenswrapper[4903]: E1202 23:00:06.442092 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:06.942069292 +0000 UTC m=+145.650623575 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.442947 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-wdkzh" event={"ID":"c964b5fc-f02f-481c-abd3-2f567b9e98e6","Type":"ContainerStarted","Data":"f706fbe28e740dc002d8c467e8d4dd03090549bc545475e44c181759e610224c"}
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.447734 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gqwqz"]
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.449173 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gqwqz"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.455277 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29411940-vlzpv" podStartSLOduration=6.455256256 podStartE2EDuration="6.455256256s" podCreationTimestamp="2025-12-02 23:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:06.441676083 +0000 UTC m=+145.150230366" watchObservedRunningTime="2025-12-02 23:00:06.455256256 +0000 UTC m=+145.163810539"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.455383 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gqwqz"]
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.470663 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vvxf5" event={"ID":"4b25a40b-8bba-423f-b3fa-5b58e3d18423","Type":"ContainerStarted","Data":"425252b3d54d88b331a1195e844b19a8be7315f6698fd26451d22b7e604fb1c4"}
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.471252 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-vvxf5"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.481185 4903 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-vvxf5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body=
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.481231 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-vvxf5" podUID="4b25a40b-8bba-423f-b3fa-5b58e3d18423" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.482478 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2ftx" podStartSLOduration=125.482464854 podStartE2EDuration="2m5.482464854s" podCreationTimestamp="2025-12-02 22:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:06.481335865 +0000 UTC m=+145.189890178" watchObservedRunningTime="2025-12-02 23:00:06.482464854 +0000 UTC m=+145.191019137"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.488215 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-9spjq" event={"ID":"84d43c75-0f92-433c-a555-6d854b8ff0c4","Type":"ContainerStarted","Data":"239a3bd33b1d2c9aed5b3b5966a0ae4523c36f71832ed54240fb4e5742a0d178"}
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.488261 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-9spjq" event={"ID":"84d43c75-0f92-433c-a555-6d854b8ff0c4","Type":"ContainerStarted","Data":"e5dbe16677183e5e3db8fd007e2edcaeb78c966585937323cfa133d3e71da055"}
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.498884 4903 patch_prober.go:28] interesting pod/router-default-5444994796-pkxw5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 02 23:00:06 crc kubenswrapper[4903]: [-]has-synced failed: reason withheld
Dec 02 23:00:06 crc kubenswrapper[4903]: [+]process-running ok
Dec 02 23:00:06 crc kubenswrapper[4903]: healthz check failed
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.498944 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pkxw5" podUID="d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.531401 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7pd8" event={"ID":"c69b7653-79c2-4182-b7b2-26aef84a054d","Type":"ContainerStarted","Data":"a549be56cc5e2d4155cb7d48cdfdf31dc106ae4952ff2657cde1c42314c3304a"}
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.531444 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7pd8" event={"ID":"c69b7653-79c2-4182-b7b2-26aef84a054d","Type":"ContainerStarted","Data":"064b91e33ca1a90b4ec1a11f6c39402edf9684ee11d6e39cb2c9807edeafa095"}
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.542337 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb4f1b53-4bbe-4713-852b-066b5d3fd40c-catalog-content\") pod \"certified-operators-gqwqz\" (UID: \"cb4f1b53-4bbe-4713-852b-066b5d3fd40c\") " pod="openshift-marketplace/certified-operators-gqwqz"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.542441 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.542543 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb4f1b53-4bbe-4713-852b-066b5d3fd40c-utilities\") pod \"certified-operators-gqwqz\" (UID: \"cb4f1b53-4bbe-4713-852b-066b5d3fd40c\") " pod="openshift-marketplace/certified-operators-gqwqz"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.542575 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjswr\" (UniqueName: \"kubernetes.io/projected/cb4f1b53-4bbe-4713-852b-066b5d3fd40c-kube-api-access-qjswr\") pod \"certified-operators-gqwqz\" (UID: \"cb4f1b53-4bbe-4713-852b-066b5d3fd40c\") " pod="openshift-marketplace/certified-operators-gqwqz"
Dec 02 23:00:06 crc kubenswrapper[4903]: E1202 23:00:06.544405 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:07.044391069 +0000 UTC m=+145.752945352 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.555224 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zh8xk" event={"ID":"40098fa6-978c-4d83-920e-ca922d2fbefb","Type":"ContainerStarted","Data":"4ab6b878b4004584d082c1714f441496d12442d7918305bc1c7b58f784d81e7d"}
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.555957 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zh8xk"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.557486 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gdr6w" event={"ID":"ea93257f-fe2e-4062-9b5d-3cd6f53f6fdc","Type":"ContainerStarted","Data":"875842da82e845a01ab14d124484326d5735559a42c50bc9ace868abc6c8e256"}
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.559440 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-lsg46" event={"ID":"637fedff-192e-4220-958e-ee458ad15bf2","Type":"ContainerStarted","Data":"25331391fd73122f39405d03f2870afb7356970022a2a531bb3b1f5f3aec08d2"}
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.573893 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-wdkzh" podStartSLOduration=125.573872954 podStartE2EDuration="2m5.573872954s" podCreationTimestamp="2025-12-02 22:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:06.573068934 +0000 UTC m=+145.281623207" watchObservedRunningTime="2025-12-02 23:00:06.573872954 +0000 UTC m=+145.282427237"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.592681 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jr8l4"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.610820 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-vvxf5" podStartSLOduration=125.610794188 podStartE2EDuration="2m5.610794188s" podCreationTimestamp="2025-12-02 22:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:06.610139281 +0000 UTC m=+145.318693564" watchObservedRunningTime="2025-12-02 23:00:06.610794188 +0000 UTC m=+145.319348471"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.635433 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-v4mbf"]
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.643556 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.643990 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb4f1b53-4bbe-4713-852b-066b5d3fd40c-catalog-content\") pod \"certified-operators-gqwqz\" (UID: \"cb4f1b53-4bbe-4713-852b-066b5d3fd40c\") " pod="openshift-marketplace/certified-operators-gqwqz"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.644786 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb4f1b53-4bbe-4713-852b-066b5d3fd40c-utilities\") pod \"certified-operators-gqwqz\" (UID: \"cb4f1b53-4bbe-4713-852b-066b5d3fd40c\") " pod="openshift-marketplace/certified-operators-gqwqz"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.644936 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjswr\" (UniqueName: \"kubernetes.io/projected/cb4f1b53-4bbe-4713-852b-066b5d3fd40c-kube-api-access-qjswr\") pod \"certified-operators-gqwqz\" (UID: \"cb4f1b53-4bbe-4713-852b-066b5d3fd40c\") " pod="openshift-marketplace/certified-operators-gqwqz"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.645031 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v4mbf"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.649898 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb4f1b53-4bbe-4713-852b-066b5d3fd40c-utilities\") pod \"certified-operators-gqwqz\" (UID: \"cb4f1b53-4bbe-4713-852b-066b5d3fd40c\") " pod="openshift-marketplace/certified-operators-gqwqz"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.650298 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb4f1b53-4bbe-4713-852b-066b5d3fd40c-catalog-content\") pod \"certified-operators-gqwqz\" (UID: \"cb4f1b53-4bbe-4713-852b-066b5d3fd40c\") " pod="openshift-marketplace/certified-operators-gqwqz"
Dec 02 23:00:06 crc kubenswrapper[4903]: E1202 23:00:06.652798 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:07.152776249 +0000 UTC m=+145.861330532 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.668019 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.700928 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v4mbf"]
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.731530 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-9spjq" podStartSLOduration=125.731511399 podStartE2EDuration="2m5.731511399s" podCreationTimestamp="2025-12-02 22:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:06.717403812 +0000 UTC m=+145.425958095" watchObservedRunningTime="2025-12-02 23:00:06.731511399 +0000 UTC m=+145.440065682"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.746467 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80fb141b-48f4-4f70-afd8-78fdb7b3c20c-catalog-content\") pod \"community-operators-v4mbf\" (UID: \"80fb141b-48f4-4f70-afd8-78fdb7b3c20c\") " pod="openshift-marketplace/community-operators-v4mbf"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.747243 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80fb141b-48f4-4f70-afd8-78fdb7b3c20c-utilities\") pod \"community-operators-v4mbf\" (UID: \"80fb141b-48f4-4f70-afd8-78fdb7b3c20c\") " pod="openshift-marketplace/community-operators-v4mbf"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.747340 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kflkk\" (UniqueName: \"kubernetes.io/projected/80fb141b-48f4-4f70-afd8-78fdb7b3c20c-kube-api-access-kflkk\") pod \"community-operators-v4mbf\" (UID: \"80fb141b-48f4-4f70-afd8-78fdb7b3c20c\") " pod="openshift-marketplace/community-operators-v4mbf"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.747451 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz"
Dec 02 23:00:06 crc kubenswrapper[4903]: E1202 23:00:06.747944 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:07.247929684 +0000 UTC m=+145.956483967 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.754683 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjswr\" (UniqueName: \"kubernetes.io/projected/cb4f1b53-4bbe-4713-852b-066b5d3fd40c-kube-api-access-qjswr\") pod \"certified-operators-gqwqz\" (UID: \"cb4f1b53-4bbe-4713-852b-066b5d3fd40c\") " pod="openshift-marketplace/certified-operators-gqwqz"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.799949 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gqwqz"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.827115 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gdr6w" podStartSLOduration=125.827100275 podStartE2EDuration="2m5.827100275s" podCreationTimestamp="2025-12-02 22:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:06.77511023 +0000 UTC m=+145.483664503" watchObservedRunningTime="2025-12-02 23:00:06.827100275 +0000 UTC m=+145.535654548"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.827699 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7pd8" podStartSLOduration=125.82769545 podStartE2EDuration="2m5.82769545s" podCreationTimestamp="2025-12-02 22:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:06.825576736 +0000 UTC m=+145.534131019" watchObservedRunningTime="2025-12-02 23:00:06.82769545 +0000 UTC m=+145.536249733"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.858318 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.858529 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80fb141b-48f4-4f70-afd8-78fdb7b3c20c-catalog-content\") pod \"community-operators-v4mbf\" (UID: \"80fb141b-48f4-4f70-afd8-78fdb7b3c20c\") " pod="openshift-marketplace/community-operators-v4mbf"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.858573 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80fb141b-48f4-4f70-afd8-78fdb7b3c20c-utilities\") pod \"community-operators-v4mbf\" (UID: \"80fb141b-48f4-4f70-afd8-78fdb7b3c20c\") " pod="openshift-marketplace/community-operators-v4mbf"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.858592 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kflkk\" (UniqueName: \"kubernetes.io/projected/80fb141b-48f4-4f70-afd8-78fdb7b3c20c-kube-api-access-kflkk\") pod \"community-operators-v4mbf\" (UID: \"80fb141b-48f4-4f70-afd8-78fdb7b3c20c\") " pod="openshift-marketplace/community-operators-v4mbf"
Dec 02 23:00:06 crc kubenswrapper[4903]: E1202 23:00:06.858904 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:07.358886568 +0000 UTC m=+146.067440851 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.859131 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hhwns"]
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.859188 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80fb141b-48f4-4f70-afd8-78fdb7b3c20c-catalog-content\") pod \"community-operators-v4mbf\" (UID: \"80fb141b-48f4-4f70-afd8-78fdb7b3c20c\") " pod="openshift-marketplace/community-operators-v4mbf"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.859385 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80fb141b-48f4-4f70-afd8-78fdb7b3c20c-utilities\") pod \"community-operators-v4mbf\" (UID: \"80fb141b-48f4-4f70-afd8-78fdb7b3c20c\") " pod="openshift-marketplace/community-operators-v4mbf"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.866273 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hhwns"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.897309 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kflkk\" (UniqueName: \"kubernetes.io/projected/80fb141b-48f4-4f70-afd8-78fdb7b3c20c-kube-api-access-kflkk\") pod \"community-operators-v4mbf\" (UID: \"80fb141b-48f4-4f70-afd8-78fdb7b3c20c\") " pod="openshift-marketplace/community-operators-v4mbf"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.913047 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-lsg46" podStartSLOduration=125.913025006 podStartE2EDuration="2m5.913025006s" podCreationTimestamp="2025-12-02 22:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:06.890915828 +0000 UTC m=+145.599470111" watchObservedRunningTime="2025-12-02 23:00:06.913025006 +0000 UTC m=+145.621579289"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.915990 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hhwns"]
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.964317 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca1fbb27-3875-43c7-8013-f22c6112be2b-catalog-content\") pod \"community-operators-hhwns\" (UID: \"ca1fbb27-3875-43c7-8013-f22c6112be2b\") " pod="openshift-marketplace/community-operators-hhwns"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.964409 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca1fbb27-3875-43c7-8013-f22c6112be2b-utilities\") pod \"community-operators-hhwns\" (UID: \"ca1fbb27-3875-43c7-8013-f22c6112be2b\") " pod="openshift-marketplace/community-operators-hhwns"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.964462 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz"
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.964491 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw225\" (UniqueName: \"kubernetes.io/projected/ca1fbb27-3875-43c7-8013-f22c6112be2b-kube-api-access-bw225\") pod \"community-operators-hhwns\" (UID: \"ca1fbb27-3875-43c7-8013-f22c6112be2b\") " pod="openshift-marketplace/community-operators-hhwns"
Dec 02 23:00:06 crc kubenswrapper[4903]: E1202 23:00:06.964824 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:07.464811025 +0000 UTC m=+146.173365308 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 23:00:06 crc kubenswrapper[4903]: I1202 23:00:06.982998 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v4mbf"
Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.035420 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zh8xk" podStartSLOduration=126.03540158 podStartE2EDuration="2m6.03540158s" podCreationTimestamp="2025-12-02 22:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:06.953582171 +0000 UTC m=+145.662136464" watchObservedRunningTime="2025-12-02 23:00:07.03540158 +0000 UTC m=+145.743955863"
Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.067239 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.067486 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw225\" (UniqueName: \"kubernetes.io/projected/ca1fbb27-3875-43c7-8013-f22c6112be2b-kube-api-access-bw225\") pod \"community-operators-hhwns\" (UID: \"ca1fbb27-3875-43c7-8013-f22c6112be2b\") " pod="openshift-marketplace/community-operators-hhwns"
Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.067513 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca1fbb27-3875-43c7-8013-f22c6112be2b-catalog-content\") pod \"community-operators-hhwns\" (UID: \"ca1fbb27-3875-43c7-8013-f22c6112be2b\") " pod="openshift-marketplace/community-operators-hhwns"
Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.067571 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca1fbb27-3875-43c7-8013-f22c6112be2b-utilities\") pod \"community-operators-hhwns\" (UID: \"ca1fbb27-3875-43c7-8013-f22c6112be2b\") " pod="openshift-marketplace/community-operators-hhwns"
Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.068081 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca1fbb27-3875-43c7-8013-f22c6112be2b-utilities\") pod \"community-operators-hhwns\" (UID: \"ca1fbb27-3875-43c7-8013-f22c6112be2b\") " pod="openshift-marketplace/community-operators-hhwns"
Dec 02 23:00:07 crc kubenswrapper[4903]: E1202 23:00:07.068163 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:07.568135147 +0000 UTC m=+146.276689430 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.068589 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca1fbb27-3875-43c7-8013-f22c6112be2b-catalog-content\") pod \"community-operators-hhwns\" (UID: \"ca1fbb27-3875-43c7-8013-f22c6112be2b\") " pod="openshift-marketplace/community-operators-hhwns"
Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.102626 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw225\" (UniqueName: \"kubernetes.io/projected/ca1fbb27-3875-43c7-8013-f22c6112be2b-kube-api-access-bw225\") pod \"community-operators-hhwns\" (UID: \"ca1fbb27-3875-43c7-8013-f22c6112be2b\") " pod="openshift-marketplace/community-operators-hhwns"
Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.169593 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz"
Dec 02 23:00:07 crc kubenswrapper[4903]: E1202 23:00:07.170231 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:07.670216697 +0000 UTC m=+146.378770980 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.231987 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hhwns"
Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.272190 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 23:00:07 crc kubenswrapper[4903]: E1202 23:00:07.272546 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed.
No retries permitted until 2025-12-02 23:00:07.772528683 +0000 UTC m=+146.481082966 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.375453 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:07 crc kubenswrapper[4903]: E1202 23:00:07.375798 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:07.875784613 +0000 UTC m=+146.584338896 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.396310 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-krzz4"] Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.476973 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 23:00:07 crc kubenswrapper[4903]: E1202 23:00:07.477940 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:07.977919605 +0000 UTC m=+146.686473888 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.503629 4903 patch_prober.go:28] interesting pod/router-default-5444994796-pkxw5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 23:00:07 crc kubenswrapper[4903]: [-]has-synced failed: reason withheld Dec 02 23:00:07 crc kubenswrapper[4903]: [+]process-running ok Dec 02 23:00:07 crc kubenswrapper[4903]: healthz check failed Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.503704 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pkxw5" podUID="d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.581383 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:07 crc kubenswrapper[4903]: E1202 23:00:07.581702 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:08.081690208 +0000 UTC m=+146.790244491 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.601955 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lgdbg" event={"ID":"0554732d-fd39-49f9-a98a-8daab6de9795","Type":"ContainerStarted","Data":"d7cdbb72393503c0aaa5e7857c53f777b20ad3f897d487c8b2e5b64db570d4e2"} Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.619566 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-lgdbg" podStartSLOduration=126.619551525 podStartE2EDuration="2m6.619551525s" podCreationTimestamp="2025-12-02 22:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:07.619113714 +0000 UTC m=+146.327667997" watchObservedRunningTime="2025-12-02 23:00:07.619551525 +0000 UTC m=+146.328105808" Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.630207 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-n67tl" event={"ID":"0ca6d54e-b1e8-4be3-8690-7be6ebd279a9","Type":"ContainerStarted","Data":"70e7cc58c637e31df55aa5176f7d920caa0fd783beb39f56ac22dff550c3e76a"} Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.647096 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zh8xk" event={"ID":"40098fa6-978c-4d83-920e-ca922d2fbefb","Type":"ContainerStarted","Data":"3061b8094613b2e45d83b2c53cf10cd05248b0c916a390febbbb93ec45b78aef"} Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.660295 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" event={"ID":"46bb090b-9216-49d5-91d7-43cf3ee3bf4a","Type":"ContainerStarted","Data":"28412c9f61e6710ddbda493ba5768f793c43645492aee86b10160e143ace9eac"} Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.676010 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wwsfb" event={"ID":"3f7e55ca-a590-4992-8402-337ef0a42dcc","Type":"ContainerStarted","Data":"e3242e2594bc7f581764884bb46ffd81c1a115786ab172800ec73d1be975b9b8"} Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.676058 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wwsfb" event={"ID":"3f7e55ca-a590-4992-8402-337ef0a42dcc","Type":"ContainerStarted","Data":"09391696027fa28715824af5b0fda9a5035641811a913bc4379686678686f3a7"} Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.676499 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-wwsfb" Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.683539 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 23:00:07 crc kubenswrapper[4903]: E1202 23:00:07.683795 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:08.183765998 +0000 UTC m=+146.892320281 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.683882 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:07 crc kubenswrapper[4903]: E1202 23:00:07.685258 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:08.185250455 +0000 UTC m=+146.893804738 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.689061 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krzz4" event={"ID":"e4aae2be-27f2-43df-a96d-9d2fbc198a6f","Type":"ContainerStarted","Data":"475931a9757dbb63a10d586dd974c6f6197c54d82c62298aede2a168768cf6fc"} Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.699304 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-wwsfb" podStartSLOduration=10.69928757 podStartE2EDuration="10.69928757s" podCreationTimestamp="2025-12-02 22:59:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:07.695480783 +0000 UTC m=+146.404035066" watchObservedRunningTime="2025-12-02 23:00:07.69928757 +0000 UTC m=+146.407841853" Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.699370 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gqwqz"] Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.711817 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tbhzh" 
event={"ID":"c3af2ceb-1e23-4887-b647-06b9e0466f1a","Type":"ContainerStarted","Data":"d7e9cee49774126b93b7dedc44610850a7844a4b54825ca9a2394da93a332f90"} Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.725521 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2ftx" event={"ID":"b2342a1a-f6f4-4b83-8de3-444e3b51642c","Type":"ContainerStarted","Data":"3a4c72df6fe7bc1426e8b3b4fcd259282457afcd5f153c32c6d785eab17537cf"} Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.726639 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzc84" event={"ID":"d6c199ad-3068-41c9-b8df-8f6b889a8db8","Type":"ContainerStarted","Data":"6a4116516adf1a4ba0b67940690602a882dcbd5f497cc89373dc5c59c0febcdb"} Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.728582 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5ltff" event={"ID":"8c5aaf2f-377b-4d66-b9b6-671f831e0af1","Type":"ContainerStarted","Data":"f392956756d3add487eaa6413a6e189916b1799a5222f7d3f771d2c5d96654eb"} Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.728830 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-5ltff" Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.732209 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lq9hn" event={"ID":"029452cc-db9a-4761-b35d-17e7b11d6f84","Type":"ContainerStarted","Data":"011e1f42ba1c69955f885f0b49a0e830fc929faffcf78eece606fc845d429e4c"} Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.735249 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-xkmcq" event={"ID":"f6375f12-a03e-438e-96da-76db81f20764","Type":"ContainerStarted","Data":"cdca297ac645edc7dc82bec1d7aa96d29f642483815c75878065f7cb59997905"} Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.740115 4903 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-vvxf5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.740184 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-vvxf5" podUID="4b25a40b-8bba-423f-b3fa-5b58e3d18423" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.766793 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-5ltff" podStartSLOduration=127.766776966 podStartE2EDuration="2m7.766776966s" podCreationTimestamp="2025-12-02 22:58:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:07.765056262 +0000 UTC m=+146.473610535" watchObservedRunningTime="2025-12-02 23:00:07.766776966 +0000 UTC m=+146.475331249" Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.784687 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.784824 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v4mbf"] Dec 02 23:00:07 crc kubenswrapper[4903]: E1202 23:00:07.784832 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:08.284806851 +0000 UTC m=+146.993361134 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.784957 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:07 crc kubenswrapper[4903]: E1202 23:00:07.803195 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:08.303171236 +0000 UTC m=+147.011725509 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.824448 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lq9hn" podStartSLOduration=126.824430393 podStartE2EDuration="2m6.824430393s" podCreationTimestamp="2025-12-02 22:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:07.822298349 +0000 UTC m=+146.530852632" watchObservedRunningTime="2025-12-02 23:00:07.824430393 +0000 UTC m=+146.532984676" Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.848416 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-xkmcq" podStartSLOduration=126.848397489 podStartE2EDuration="2m6.848397489s" podCreationTimestamp="2025-12-02 22:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:07.844224004 +0000 UTC m=+146.552778287" watchObservedRunningTime="2025-12-02 23:00:07.848397489 +0000 UTC m=+146.556951772" Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.886168 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 23:00:07 crc kubenswrapper[4903]: E1202 23:00:07.887402 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:08.387354603 +0000 UTC m=+147.095908876 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.922808 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hhwns"] Dec 02 23:00:07 crc kubenswrapper[4903]: I1202 23:00:07.991920 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:07 crc kubenswrapper[4903]: E1202 23:00:07.992319 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:08.492305036 +0000 UTC m=+147.200859319 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.092682 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 23:00:08 crc kubenswrapper[4903]: E1202 23:00:08.092865 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:08.592833747 +0000 UTC m=+147.301388030 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.093244 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:08 crc kubenswrapper[4903]: E1202 23:00:08.093660 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:08.593628028 +0000 UTC m=+147.302182311 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.112245 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-5ltff" Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.194285 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 23:00:08 crc kubenswrapper[4903]: E1202 23:00:08.194525 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:08.694488907 +0000 UTC m=+147.403043200 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.194747 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:08 crc kubenswrapper[4903]: E1202 23:00:08.195084 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:08.695069951 +0000 UTC m=+147.403624234 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.282395 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzc84" Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.295807 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 23:00:08 crc kubenswrapper[4903]: E1202 23:00:08.296186 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:08.796165536 +0000 UTC m=+147.504719809 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.397493 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:08 crc kubenswrapper[4903]: E1202 23:00:08.397985 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:08.89796759 +0000 UTC m=+147.606521873 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.497575 4903 patch_prober.go:28] interesting pod/router-default-5444994796-pkxw5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 23:00:08 crc kubenswrapper[4903]: [-]has-synced failed: reason withheld Dec 02 23:00:08 crc kubenswrapper[4903]: [+]process-running ok Dec 02 23:00:08 crc kubenswrapper[4903]: healthz check failed Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.497649 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pkxw5" podUID="d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.498417 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 23:00:08 crc kubenswrapper[4903]: E1202 23:00:08.498602 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:08.998573053 +0000 UTC m=+147.707127336 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.498644 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:08 crc kubenswrapper[4903]: E1202 23:00:08.499032 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:08.999022504 +0000 UTC m=+147.707576787 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.599809 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 23:00:08 crc kubenswrapper[4903]: E1202 23:00:08.600268 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 23:00:09.100248002 +0000 UTC m=+147.808802285 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.619690 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-w4ft5"] Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.620577 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w4ft5" Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.625771 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.639302 4903 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.682071 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w4ft5"] Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.701179 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:08 crc kubenswrapper[4903]: E1202 23:00:08.701589 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 23:00:09.201568414 +0000 UTC m=+147.910122697 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7lfrz" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.701642 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.701715 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ef1517f-01d9-44a7-a73c-9ed3d727a8cf-catalog-content\") pod \"redhat-marketplace-w4ft5\" (UID: \"8ef1517f-01d9-44a7-a73c-9ed3d727a8cf\") " pod="openshift-marketplace/redhat-marketplace-w4ft5" Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.701809 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.701922 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ef1517f-01d9-44a7-a73c-9ed3d727a8cf-utilities\") pod \"redhat-marketplace-w4ft5\" (UID: 
\"8ef1517f-01d9-44a7-a73c-9ed3d727a8cf\") " pod="openshift-marketplace/redhat-marketplace-w4ft5" Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.702033 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.702060 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.702133 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr85h\" (UniqueName: \"kubernetes.io/projected/8ef1517f-01d9-44a7-a73c-9ed3d727a8cf-kube-api-access-fr85h\") pod \"redhat-marketplace-w4ft5\" (UID: \"8ef1517f-01d9-44a7-a73c-9ed3d727a8cf\") " pod="openshift-marketplace/redhat-marketplace-w4ft5" Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.702857 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.707484 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.709818 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.714457 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.752052 4903 generic.go:334] "Generic (PLEG): container finished" podID="cb4f1b53-4bbe-4713-852b-066b5d3fd40c" containerID="7dd0ad852920d6338920518e744a0f754fb57810ac79784f4d5e5c4fc324b8ef" exitCode=0 Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.752116 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqwqz" 
event={"ID":"cb4f1b53-4bbe-4713-852b-066b5d3fd40c","Type":"ContainerDied","Data":"7dd0ad852920d6338920518e744a0f754fb57810ac79784f4d5e5c4fc324b8ef"} Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.752142 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqwqz" event={"ID":"cb4f1b53-4bbe-4713-852b-066b5d3fd40c","Type":"ContainerStarted","Data":"e293a80b570e1a96c355d73a1b5c8fd93ff5555b320059f5be30c403a8e5173d"} Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.753573 4903 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.757084 4903 generic.go:334] "Generic (PLEG): container finished" podID="ca1fbb27-3875-43c7-8013-f22c6112be2b" containerID="c15e422100636cd0a6bb6d483fc22d66581a58714ed509d29a242511efb87046" exitCode=0 Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.757189 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhwns" event={"ID":"ca1fbb27-3875-43c7-8013-f22c6112be2b","Type":"ContainerDied","Data":"c15e422100636cd0a6bb6d483fc22d66581a58714ed509d29a242511efb87046"} Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.757222 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhwns" event={"ID":"ca1fbb27-3875-43c7-8013-f22c6112be2b","Type":"ContainerStarted","Data":"0d49bb05c93e759460f98a10bd44ce5262426538c276dd13a4f21d2aabb1cbf6"} Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.760803 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" event={"ID":"46bb090b-9216-49d5-91d7-43cf3ee3bf4a","Type":"ContainerStarted","Data":"c104e180fea702f787410f1c4b0d36a05996601d21efba0809d4005b0af201c4"} Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.764790 4903 generic.go:334] "Generic (PLEG): container finished" podID="e4aae2be-27f2-43df-a96d-9d2fbc198a6f" containerID="09ad82bc3b64914469668271709378b4ed594273dcd03f21a7d93590c57dcb65" exitCode=0 Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.764783 4903 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-02T23:00:08.63932766Z","Handler":null,"Name":""} Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.765222 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krzz4" event={"ID":"e4aae2be-27f2-43df-a96d-9d2fbc198a6f","Type":"ContainerDied","Data":"09ad82bc3b64914469668271709378b4ed594273dcd03f21a7d93590c57dcb65"} Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.769226 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-n67tl" event={"ID":"0ca6d54e-b1e8-4be3-8690-7be6ebd279a9","Type":"ContainerStarted","Data":"b5de022e577dead0aa31f35d7582726f497e2da83534e43529bbb7023ce24398"} Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.778227 4903 generic.go:334] "Generic (PLEG): container finished" podID="80fb141b-48f4-4f70-afd8-78fdb7b3c20c" containerID="d4ad12c577a52b44781468ea48e010b594aa53d53214509bb5da681cd31da604" exitCode=0 Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.778432 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v4mbf" 
event={"ID":"80fb141b-48f4-4f70-afd8-78fdb7b3c20c","Type":"ContainerDied","Data":"d4ad12c577a52b44781468ea48e010b594aa53d53214509bb5da681cd31da604"} Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.778476 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v4mbf" event={"ID":"80fb141b-48f4-4f70-afd8-78fdb7b3c20c","Type":"ContainerStarted","Data":"1c593e2ccc4503ae0edd1569e3d5f4af87d725b730ef9671fd516b5ed6a1250e"} Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.782510 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-vvxf5" Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.783498 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.796930 4903 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.796969 4903 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.803173 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.803440 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr85h\" (UniqueName: \"kubernetes.io/projected/8ef1517f-01d9-44a7-a73c-9ed3d727a8cf-kube-api-access-fr85h\") pod \"redhat-marketplace-w4ft5\" (UID: \"8ef1517f-01d9-44a7-a73c-9ed3d727a8cf\") " pod="openshift-marketplace/redhat-marketplace-w4ft5" Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.803489 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ef1517f-01d9-44a7-a73c-9ed3d727a8cf-catalog-content\") pod \"redhat-marketplace-w4ft5\" (UID: \"8ef1517f-01d9-44a7-a73c-9ed3d727a8cf\") " pod="openshift-marketplace/redhat-marketplace-w4ft5" Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.803539 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ef1517f-01d9-44a7-a73c-9ed3d727a8cf-utilities\") pod \"redhat-marketplace-w4ft5\" (UID: \"8ef1517f-01d9-44a7-a73c-9ed3d727a8cf\") " pod="openshift-marketplace/redhat-marketplace-w4ft5" Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.803983 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ef1517f-01d9-44a7-a73c-9ed3d727a8cf-utilities\") pod \"redhat-marketplace-w4ft5\" (UID: \"8ef1517f-01d9-44a7-a73c-9ed3d727a8cf\") " pod="openshift-marketplace/redhat-marketplace-w4ft5" Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.804101 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8ef1517f-01d9-44a7-a73c-9ed3d727a8cf-catalog-content\") pod \"redhat-marketplace-w4ft5\" (UID: \"8ef1517f-01d9-44a7-a73c-9ed3d727a8cf\") " pod="openshift-marketplace/redhat-marketplace-w4ft5" Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.812276 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.823986 4903 patch_prober.go:28] interesting pod/downloads-7954f5f757-zzfgm container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.824037 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-zzfgm" podUID="7c7544e0-8c46-4f13-b8e4-a8aa2071a9f5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.826876 4903 patch_prober.go:28] interesting pod/downloads-7954f5f757-zzfgm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.826907 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zzfgm" podUID="7c7544e0-8c46-4f13-b8e4-a8aa2071a9f5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.836982 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.837502 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.843771 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr85h\" (UniqueName: \"kubernetes.io/projected/8ef1517f-01d9-44a7-a73c-9ed3d727a8cf-kube-api-access-fr85h\") pod \"redhat-marketplace-w4ft5\" (UID: \"8ef1517f-01d9-44a7-a73c-9ed3d727a8cf\") " pod="openshift-marketplace/redhat-marketplace-w4ft5" Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.852213 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" podStartSLOduration=128.85218824 podStartE2EDuration="2m8.85218824s" podCreationTimestamp="2025-12-02 22:58:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:08.847545963 +0000 UTC m=+147.556100246" watchObservedRunningTime="2025-12-02 23:00:08.85218824 +0000 UTC m=+147.560742523" Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.907371 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.915596 4903 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.915752 4903 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.943431 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w4ft5" Dec 02 23:00:08 crc kubenswrapper[4903]: I1202 23:00:08.974800 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7lfrz\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.011166 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsq8g" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.011208 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsq8g" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.021090 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-t9h5w"] Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.022057 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t9h5w" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.029207 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsq8g" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.030838 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t9h5w"] Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.111614 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bbe2304-240f-4e4e-8d89-5d40d3017568-utilities\") pod \"redhat-marketplace-t9h5w\" (UID: \"0bbe2304-240f-4e4e-8d89-5d40d3017568\") " pod="openshift-marketplace/redhat-marketplace-t9h5w" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.112014 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lztrf\" (UniqueName: \"kubernetes.io/projected/0bbe2304-240f-4e4e-8d89-5d40d3017568-kube-api-access-lztrf\") pod \"redhat-marketplace-t9h5w\" (UID: \"0bbe2304-240f-4e4e-8d89-5d40d3017568\") " pod="openshift-marketplace/redhat-marketplace-t9h5w" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.112068 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bbe2304-240f-4e4e-8d89-5d40d3017568-catalog-content\") pod \"redhat-marketplace-t9h5w\" (UID: \"0bbe2304-240f-4e4e-8d89-5d40d3017568\") " pod="openshift-marketplace/redhat-marketplace-t9h5w" Dec 02 23:00:09 crc kubenswrapper[4903]: W1202 23:00:09.177060 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-6cd7f859751a7c7f932c4215b1ac0f7bf6c9b602157888dcc3fdfe14f4b7042d WatchSource:0}: Error finding container 6cd7f859751a7c7f932c4215b1ac0f7bf6c9b602157888dcc3fdfe14f4b7042d: Status 404 returned error can't find the container with id 6cd7f859751a7c7f932c4215b1ac0f7bf6c9b602157888dcc3fdfe14f4b7042d Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.222130 4903 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-lztrf\" (UniqueName: \"kubernetes.io/projected/0bbe2304-240f-4e4e-8d89-5d40d3017568-kube-api-access-lztrf\") pod \"redhat-marketplace-t9h5w\" (UID: \"0bbe2304-240f-4e4e-8d89-5d40d3017568\") " pod="openshift-marketplace/redhat-marketplace-t9h5w" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.222181 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bbe2304-240f-4e4e-8d89-5d40d3017568-catalog-content\") pod \"redhat-marketplace-t9h5w\" (UID: \"0bbe2304-240f-4e4e-8d89-5d40d3017568\") " pod="openshift-marketplace/redhat-marketplace-t9h5w" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.222250 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bbe2304-240f-4e4e-8d89-5d40d3017568-utilities\") pod \"redhat-marketplace-t9h5w\" (UID: \"0bbe2304-240f-4e4e-8d89-5d40d3017568\") " pod="openshift-marketplace/redhat-marketplace-t9h5w" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.222617 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bbe2304-240f-4e4e-8d89-5d40d3017568-utilities\") pod \"redhat-marketplace-t9h5w\" (UID: \"0bbe2304-240f-4e4e-8d89-5d40d3017568\") " pod="openshift-marketplace/redhat-marketplace-t9h5w" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.222831 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bbe2304-240f-4e4e-8d89-5d40d3017568-catalog-content\") pod \"redhat-marketplace-t9h5w\" (UID: \"0bbe2304-240f-4e4e-8d89-5d40d3017568\") " pod="openshift-marketplace/redhat-marketplace-t9h5w" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.224897 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.248234 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lztrf\" (UniqueName: \"kubernetes.io/projected/0bbe2304-240f-4e4e-8d89-5d40d3017568-kube-api-access-lztrf\") pod \"redhat-marketplace-t9h5w\" (UID: \"0bbe2304-240f-4e4e-8d89-5d40d3017568\") " pod="openshift-marketplace/redhat-marketplace-t9h5w" Dec 02 23:00:09 crc kubenswrapper[4903]: W1202 23:00:09.300307 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-de0c66eef4811cd9741ad0b5433a7199ea07468f03eb2aa089780f32e6c1172c WatchSource:0}: Error finding container de0c66eef4811cd9741ad0b5433a7199ea07468f03eb2aa089780f32e6c1172c: Status 404 returned error can't find the container with id de0c66eef4811cd9741ad0b5433a7199ea07468f03eb2aa089780f32e6c1172c Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.316978 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w4ft5"] Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.357777 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t9h5w" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.424034 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n8bdg"] Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.426920 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n8bdg" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.433158 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.436153 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n8bdg"] Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.506353 4903 patch_prober.go:28] interesting pod/router-default-5444994796-pkxw5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 23:00:09 crc kubenswrapper[4903]: [-]has-synced failed: reason withheld Dec 02 23:00:09 crc kubenswrapper[4903]: [+]process-running ok Dec 02 23:00:09 crc kubenswrapper[4903]: healthz check failed Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.506767 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pkxw5" podUID="d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.530161 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pknkc\" (UniqueName: \"kubernetes.io/projected/e0630099-58c0-41ce-8817-013f7bff4749-kube-api-access-pknkc\") pod \"redhat-operators-n8bdg\" (UID: \"e0630099-58c0-41ce-8817-013f7bff4749\") " pod="openshift-marketplace/redhat-operators-n8bdg" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.530244 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0630099-58c0-41ce-8817-013f7bff4749-catalog-content\") pod \"redhat-operators-n8bdg\" (UID: \"e0630099-58c0-41ce-8817-013f7bff4749\") " pod="openshift-marketplace/redhat-operators-n8bdg" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.530306 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0630099-58c0-41ce-8817-013f7bff4749-utilities\") pod \"redhat-operators-n8bdg\" (UID: \"e0630099-58c0-41ce-8817-013f7bff4749\") " pod="openshift-marketplace/redhat-operators-n8bdg" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.610004 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7lfrz"] Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.631476 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pknkc\" (UniqueName: \"kubernetes.io/projected/e0630099-58c0-41ce-8817-013f7bff4749-kube-api-access-pknkc\") pod \"redhat-operators-n8bdg\" (UID: \"e0630099-58c0-41ce-8817-013f7bff4749\") " pod="openshift-marketplace/redhat-operators-n8bdg" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.631530 4903 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0630099-58c0-41ce-8817-013f7bff4749-catalog-content\") pod \"redhat-operators-n8bdg\" (UID: \"e0630099-58c0-41ce-8817-013f7bff4749\") " pod="openshift-marketplace/redhat-operators-n8bdg" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.631556 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0630099-58c0-41ce-8817-013f7bff4749-utilities\") pod \"redhat-operators-n8bdg\" (UID: \"e0630099-58c0-41ce-8817-013f7bff4749\") " pod="openshift-marketplace/redhat-operators-n8bdg" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.632059 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0630099-58c0-41ce-8817-013f7bff4749-utilities\") pod \"redhat-operators-n8bdg\" (UID: \"e0630099-58c0-41ce-8817-013f7bff4749\") " pod="openshift-marketplace/redhat-operators-n8bdg" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.632859 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0630099-58c0-41ce-8817-013f7bff4749-catalog-content\") pod \"redhat-operators-n8bdg\" (UID: \"e0630099-58c0-41ce-8817-013f7bff4749\") " pod="openshift-marketplace/redhat-operators-n8bdg" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.635180 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.643405 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tqg9j"] Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.644690 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tqg9j" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.654213 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tqg9j"] Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.657549 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pknkc\" (UniqueName: \"kubernetes.io/projected/e0630099-58c0-41ce-8817-013f7bff4749-kube-api-access-pknkc\") pod \"redhat-operators-n8bdg\" (UID: \"e0630099-58c0-41ce-8817-013f7bff4749\") " pod="openshift-marketplace/redhat-operators-n8bdg" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.672894 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t9h5w"] Dec 02 23:00:09 crc kubenswrapper[4903]: W1202 23:00:09.724370 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bbe2304_240f_4e4e_8d89_5d40d3017568.slice/crio-c8920c8d985ab5ebbd7b00a33493abf4357e46686dcfc12a59f01fdafcdef5dc WatchSource:0}: Error finding container c8920c8d985ab5ebbd7b00a33493abf4357e46686dcfc12a59f01fdafcdef5dc: Status 404 returned error can't find the container with id c8920c8d985ab5ebbd7b00a33493abf4357e46686dcfc12a59f01fdafcdef5dc Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.732526 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0a3d9b9-b736-4927-a6b2-9e5f25faf956-utilities\") pod \"redhat-operators-tqg9j\" (UID: \"c0a3d9b9-b736-4927-a6b2-9e5f25faf956\") " pod="openshift-marketplace/redhat-operators-tqg9j" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.732566 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0a3d9b9-b736-4927-a6b2-9e5f25faf956-catalog-content\") pod \"redhat-operators-tqg9j\" (UID: \"c0a3d9b9-b736-4927-a6b2-9e5f25faf956\") " pod="openshift-marketplace/redhat-operators-tqg9j" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.732690 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8flqn\" (UniqueName: \"kubernetes.io/projected/c0a3d9b9-b736-4927-a6b2-9e5f25faf956-kube-api-access-8flqn\") pod \"redhat-operators-tqg9j\" (UID: \"c0a3d9b9-b736-4927-a6b2-9e5f25faf956\") " pod="openshift-marketplace/redhat-operators-tqg9j" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.799374 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c45cb96e736d0540351ae1877dc25bb43c33aff6815a67dbc5c06c101fc52889"} Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.799430 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"de0c66eef4811cd9741ad0b5433a7199ea07468f03eb2aa089780f32e6c1172c"} Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.801750 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"1acd209cfd0374d173205fdada07eb021551aaa92efbe3bf520a2e9500b18bb2"} Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.801775 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"673bd867758ec0658c4675979ad5461c20e73557b581e423c89656e7e8d709c6"} Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.803004 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n8bdg" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.803705 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t9h5w" event={"ID":"0bbe2304-240f-4e4e-8d89-5d40d3017568","Type":"ContainerStarted","Data":"c8920c8d985ab5ebbd7b00a33493abf4357e46686dcfc12a59f01fdafcdef5dc"} Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.807505 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-n67tl" event={"ID":"0ca6d54e-b1e8-4be3-8690-7be6ebd279a9","Type":"ContainerStarted","Data":"f8fc6cbc37a7f6437eca6b9a234c2d4777b252662d98cbfb7f97e1148424661c"} Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.807557 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-n67tl" event={"ID":"0ca6d54e-b1e8-4be3-8690-7be6ebd279a9","Type":"ContainerStarted","Data":"48fc4cc8eff196887f2f088e306dae38858cf61bbcd1ce9c4e3e9fc1d4f091e4"} Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.813563 4903 generic.go:334] "Generic (PLEG): container finished" podID="8ef1517f-01d9-44a7-a73c-9ed3d727a8cf" containerID="a73d6ef0b4490f99b9537684986e7909640b75ed5fa0fc7d41880f3d2764bafd" exitCode=0 Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.813634 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w4ft5" event={"ID":"8ef1517f-01d9-44a7-a73c-9ed3d727a8cf","Type":"ContainerDied","Data":"a73d6ef0b4490f99b9537684986e7909640b75ed5fa0fc7d41880f3d2764bafd"} Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.813677 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w4ft5" event={"ID":"8ef1517f-01d9-44a7-a73c-9ed3d727a8cf","Type":"ContainerStarted","Data":"14c02f36c3b801109d1919df469ac3176bf476d22e9435b9cf85f69fadc6d218"} Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.815483 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" event={"ID":"4652215a-081a-4b64-aa40-1508e18e8a15","Type":"ContainerStarted","Data":"2fc0487559c839ac21c835518b962a7ccf91a68ab48ddf7555dbd1fbf2ed2b38"} Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.824388 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c833074438e7b5189529ae197b5c450ae71e03d36e9b6adc41283f9c6f3ff738"} Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.824448 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"6cd7f859751a7c7f932c4215b1ac0f7bf6c9b602157888dcc3fdfe14f4b7042d"} Dec 
02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.831534 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lsq8g" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.838797 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0a3d9b9-b736-4927-a6b2-9e5f25faf956-utilities\") pod \"redhat-operators-tqg9j\" (UID: \"c0a3d9b9-b736-4927-a6b2-9e5f25faf956\") " pod="openshift-marketplace/redhat-operators-tqg9j" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.838859 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0a3d9b9-b736-4927-a6b2-9e5f25faf956-catalog-content\") pod \"redhat-operators-tqg9j\" (UID: \"c0a3d9b9-b736-4927-a6b2-9e5f25faf956\") " pod="openshift-marketplace/redhat-operators-tqg9j" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.838992 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8flqn\" (UniqueName: \"kubernetes.io/projected/c0a3d9b9-b736-4927-a6b2-9e5f25faf956-kube-api-access-8flqn\") pod \"redhat-operators-tqg9j\" (UID: \"c0a3d9b9-b736-4927-a6b2-9e5f25faf956\") " pod="openshift-marketplace/redhat-operators-tqg9j" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.839902 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0a3d9b9-b736-4927-a6b2-9e5f25faf956-utilities\") pod \"redhat-operators-tqg9j\" (UID: \"c0a3d9b9-b736-4927-a6b2-9e5f25faf956\") " pod="openshift-marketplace/redhat-operators-tqg9j" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.840122 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0a3d9b9-b736-4927-a6b2-9e5f25faf956-catalog-content\") pod \"redhat-operators-tqg9j\" (UID: \"c0a3d9b9-b736-4927-a6b2-9e5f25faf956\") " pod="openshift-marketplace/redhat-operators-tqg9j" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.883375 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8flqn\" (UniqueName: \"kubernetes.io/projected/c0a3d9b9-b736-4927-a6b2-9e5f25faf956-kube-api-access-8flqn\") pod \"redhat-operators-tqg9j\" (UID: \"c0a3d9b9-b736-4927-a6b2-9e5f25faf956\") " pod="openshift-marketplace/redhat-operators-tqg9j" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.893081 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-n67tl" podStartSLOduration=12.89305961 podStartE2EDuration="12.89305961s" podCreationTimestamp="2025-12-02 22:59:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:09.890476625 +0000 UTC m=+148.599030908" watchObservedRunningTime="2025-12-02 23:00:09.89305961 +0000 UTC m=+148.601613893" Dec 02 23:00:09 crc kubenswrapper[4903]: I1202 23:00:09.991419 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tqg9j" Dec 02 23:00:10 crc kubenswrapper[4903]: I1202 23:00:10.189718 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 23:00:10 crc kubenswrapper[4903]: I1202 23:00:10.189782 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 23:00:10 crc kubenswrapper[4903]: I1202 23:00:10.203560 4903 patch_prober.go:28] interesting pod/apiserver-76f77b778f-ffb4r container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 02 23:00:10 crc kubenswrapper[4903]: [+]log ok Dec 02 23:00:10 crc kubenswrapper[4903]: [+]etcd ok Dec 02 23:00:10 crc kubenswrapper[4903]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 02 23:00:10 crc kubenswrapper[4903]: [+]poststarthook/generic-apiserver-start-informers ok Dec 02 23:00:10 crc kubenswrapper[4903]: [+]poststarthook/max-in-flight-filter ok Dec 02 23:00:10 crc kubenswrapper[4903]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 02 23:00:10 crc kubenswrapper[4903]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 02 23:00:10 crc kubenswrapper[4903]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 02 23:00:10 crc kubenswrapper[4903]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Dec 02 23:00:10 crc kubenswrapper[4903]: [+]poststarthook/project.openshift.io-projectcache ok Dec 02 23:00:10 crc kubenswrapper[4903]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 02 23:00:10 crc kubenswrapper[4903]: [+]poststarthook/openshift.io-startinformers ok Dec 02 23:00:10 crc kubenswrapper[4903]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 02 23:00:10 crc kubenswrapper[4903]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 02 23:00:10 crc kubenswrapper[4903]: livez check failed Dec 02 23:00:10 crc kubenswrapper[4903]: I1202 23:00:10.203634 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" podUID="46bb090b-9216-49d5-91d7-43cf3ee3bf4a" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 23:00:10 crc kubenswrapper[4903]: I1202 23:00:10.313862 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 02 23:00:10 crc kubenswrapper[4903]: I1202 23:00:10.315144 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 23:00:10 crc kubenswrapper[4903]: I1202 23:00:10.318330 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 02 23:00:10 crc kubenswrapper[4903]: I1202 23:00:10.325204 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 02 23:00:10 crc kubenswrapper[4903]: I1202 23:00:10.325881 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 02 23:00:10 crc kubenswrapper[4903]: I1202 23:00:10.401949 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tqg9j"] Dec 02 23:00:10 crc kubenswrapper[4903]: I1202 23:00:10.404464 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n8bdg"] Dec 02 23:00:10 crc kubenswrapper[4903]: W1202 23:00:10.431798 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0630099_58c0_41ce_8817_013f7bff4749.slice/crio-1a2c0437c8de03ea66052d0720bc8b8497e556955fb1436924522024c1bef57a WatchSource:0}: Error finding container 1a2c0437c8de03ea66052d0720bc8b8497e556955fb1436924522024c1bef57a: Status 404 returned error can't find the container with id 1a2c0437c8de03ea66052d0720bc8b8497e556955fb1436924522024c1bef57a Dec 02 23:00:10 crc kubenswrapper[4903]: W1202 23:00:10.435830 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0a3d9b9_b736_4927_a6b2_9e5f25faf956.slice/crio-99040a33ced1127e1c817e016baa230493ba36065e27e6c627f4bcab15d90735 WatchSource:0}: Error finding container 99040a33ced1127e1c817e016baa230493ba36065e27e6c627f4bcab15d90735: Status 404 returned error can't find the container with id 99040a33ced1127e1c817e016baa230493ba36065e27e6c627f4bcab15d90735 Dec 02 23:00:10 crc kubenswrapper[4903]: I1202 23:00:10.447304 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a01b7051-a365-4645-a249-43deb5b8de75-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a01b7051-a365-4645-a249-43deb5b8de75\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 23:00:10 crc kubenswrapper[4903]: I1202 23:00:10.447394 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a01b7051-a365-4645-a249-43deb5b8de75-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a01b7051-a365-4645-a249-43deb5b8de75\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 23:00:10 crc kubenswrapper[4903]: I1202 23:00:10.498953 4903 patch_prober.go:28] interesting pod/router-default-5444994796-pkxw5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 23:00:10 crc kubenswrapper[4903]: [-]has-synced failed: reason withheld Dec 02 23:00:10 crc kubenswrapper[4903]: [+]process-running ok Dec 02 23:00:10 crc kubenswrapper[4903]: healthz check failed Dec 02 23:00:10 crc kubenswrapper[4903]: I1202 23:00:10.499012 4903 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-pkxw5" podUID="d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 23:00:10 crc kubenswrapper[4903]: I1202 23:00:10.549152 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a01b7051-a365-4645-a249-43deb5b8de75-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a01b7051-a365-4645-a249-43deb5b8de75\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 23:00:10 crc kubenswrapper[4903]: I1202 23:00:10.549232 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a01b7051-a365-4645-a249-43deb5b8de75-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a01b7051-a365-4645-a249-43deb5b8de75\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 23:00:10 crc kubenswrapper[4903]: I1202 23:00:10.549732 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a01b7051-a365-4645-a249-43deb5b8de75-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a01b7051-a365-4645-a249-43deb5b8de75\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 23:00:10 crc kubenswrapper[4903]: I1202 23:00:10.574066 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a01b7051-a365-4645-a249-43deb5b8de75-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a01b7051-a365-4645-a249-43deb5b8de75\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 23:00:10 crc kubenswrapper[4903]: I1202 23:00:10.662952 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 23:00:10 crc kubenswrapper[4903]: I1202 23:00:10.858813 4903 generic.go:334] "Generic (PLEG): container finished" podID="0bbe2304-240f-4e4e-8d89-5d40d3017568" containerID="7766d2c7408a8279a7ad97f994093dfec89b584f4494f31b02a3e6506cd24492" exitCode=0 Dec 02 23:00:10 crc kubenswrapper[4903]: I1202 23:00:10.859123 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t9h5w" event={"ID":"0bbe2304-240f-4e4e-8d89-5d40d3017568","Type":"ContainerDied","Data":"7766d2c7408a8279a7ad97f994093dfec89b584f4494f31b02a3e6506cd24492"} Dec 02 23:00:10 crc kubenswrapper[4903]: I1202 23:00:10.888282 4903 generic.go:334] "Generic (PLEG): container finished" podID="e0630099-58c0-41ce-8817-013f7bff4749" containerID="c4ab90ab9fbe1c838cab684df58365cbbb7a9fbdc9349ad39af842c2c4e166ab" exitCode=0 Dec 02 23:00:10 crc kubenswrapper[4903]: I1202 23:00:10.888391 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8bdg" event={"ID":"e0630099-58c0-41ce-8817-013f7bff4749","Type":"ContainerDied","Data":"c4ab90ab9fbe1c838cab684df58365cbbb7a9fbdc9349ad39af842c2c4e166ab"} Dec 02 23:00:10 crc kubenswrapper[4903]: I1202 23:00:10.888413 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8bdg" event={"ID":"e0630099-58c0-41ce-8817-013f7bff4749","Type":"ContainerStarted","Data":"1a2c0437c8de03ea66052d0720bc8b8497e556955fb1436924522024c1bef57a"} Dec 02 23:00:10 crc kubenswrapper[4903]: I1202 23:00:10.898018 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" event={"ID":"4652215a-081a-4b64-aa40-1508e18e8a15","Type":"ContainerStarted","Data":"8294d481b8a199be3df286d75b01c16995fa1bbfd9c54a79c08fd2e11d39bbf2"} Dec 02 23:00:10 crc kubenswrapper[4903]: I1202 23:00:10.899437 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:00:10 crc kubenswrapper[4903]: I1202 23:00:10.914723 4903 generic.go:334] "Generic (PLEG): container finished" podID="c0a3d9b9-b736-4927-a6b2-9e5f25faf956" containerID="ebe4e22654cf88f759281dd17ed3061ffb0aeda4f43bf865292f6a86b965b83b" exitCode=0 Dec 02 23:00:10 crc kubenswrapper[4903]: I1202 23:00:10.917639 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tqg9j" event={"ID":"c0a3d9b9-b736-4927-a6b2-9e5f25faf956","Type":"ContainerDied","Data":"ebe4e22654cf88f759281dd17ed3061ffb0aeda4f43bf865292f6a86b965b83b"} Dec 02 23:00:10 crc kubenswrapper[4903]: I1202 23:00:10.917706 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tqg9j" event={"ID":"c0a3d9b9-b736-4927-a6b2-9e5f25faf956","Type":"ContainerStarted","Data":"99040a33ced1127e1c817e016baa230493ba36065e27e6c627f4bcab15d90735"} Dec 02 23:00:10 crc kubenswrapper[4903]: I1202 23:00:10.932724 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 02 23:00:10 crc kubenswrapper[4903]: I1202 23:00:10.933680 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" podStartSLOduration=129.932491542 podStartE2EDuration="2m9.932491542s" podCreationTimestamp="2025-12-02 22:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:10.930107623 +0000 UTC m=+149.638661906" watchObservedRunningTime="2025-12-02 23:00:10.932491542 +0000 UTC m=+149.641045825" Dec 02 23:00:11 crc kubenswrapper[4903]: I1202 23:00:11.494779 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-pkxw5" Dec 02 23:00:11 crc kubenswrapper[4903]: I1202 23:00:11.495171 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-tjt6x" Dec 02 23:00:11 crc kubenswrapper[4903]: I1202 23:00:11.496104 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-tjt6x" Dec 02 23:00:11 crc kubenswrapper[4903]: I1202 23:00:11.497307 4903 patch_prober.go:28] interesting pod/console-f9d7485db-tjt6x container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Dec 02 23:00:11 crc kubenswrapper[4903]: I1202 23:00:11.497353 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-tjt6x" podUID="d131fd77-f36a-4c9a-8578-5b2c62e5d356" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Dec 02 23:00:11 crc kubenswrapper[4903]: I1202 23:00:11.499844 4903 patch_prober.go:28] interesting pod/router-default-5444994796-pkxw5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 23:00:11 crc kubenswrapper[4903]: [-]has-synced failed: reason withheld Dec 02 23:00:11 crc kubenswrapper[4903]: [+]process-running ok Dec 02 23:00:11 crc kubenswrapper[4903]: healthz check failed Dec 02 23:00:11 crc kubenswrapper[4903]: I1202 23:00:11.499870 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pkxw5" podUID="d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 23:00:11 crc kubenswrapper[4903]: I1202 23:00:11.925590 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a01b7051-a365-4645-a249-43deb5b8de75","Type":"ContainerStarted","Data":"974a16da83525ddef6ab7753e595373f9ba868db3614a1cfcf326a1bbb4d8ce6"} Dec 02 23:00:11 crc kubenswrapper[4903]: I1202 23:00:11.925661 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a01b7051-a365-4645-a249-43deb5b8de75","Type":"ContainerStarted","Data":"1cf57ab7acf87037da91fe4f76d0556bc25f88e3c1d69181b3d96657b7283c62"} Dec 02 23:00:11 crc kubenswrapper[4903]: I1202 23:00:11.930081 4903 generic.go:334] "Generic (PLEG): container finished" podID="dd1b463e-15a8-425d-872e-d1f9683747c2" containerID="6f081c20905982f479963eebae3e8a9cad72aace1dd769dfa88e3bf606f82a7d" exitCode=0 Dec 02 23:00:11 crc kubenswrapper[4903]: I1202 23:00:11.930730 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411940-vlzpv" event={"ID":"dd1b463e-15a8-425d-872e-d1f9683747c2","Type":"ContainerDied","Data":"6f081c20905982f479963eebae3e8a9cad72aace1dd769dfa88e3bf606f82a7d"} Dec 
02 23:00:11 crc kubenswrapper[4903]: I1202 23:00:11.939166 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.939143887 podStartE2EDuration="1.939143887s" podCreationTimestamp="2025-12-02 23:00:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:11.936838739 +0000 UTC m=+150.645393022" watchObservedRunningTime="2025-12-02 23:00:11.939143887 +0000 UTC m=+150.647698170" Dec 02 23:00:12 crc kubenswrapper[4903]: I1202 23:00:12.495598 4903 patch_prober.go:28] interesting pod/router-default-5444994796-pkxw5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 23:00:12 crc kubenswrapper[4903]: [-]has-synced failed: reason withheld Dec 02 23:00:12 crc kubenswrapper[4903]: [+]process-running ok Dec 02 23:00:12 crc kubenswrapper[4903]: healthz check failed Dec 02 23:00:12 crc kubenswrapper[4903]: I1202 23:00:12.495673 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pkxw5" podUID="d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 23:00:12 crc kubenswrapper[4903]: I1202 23:00:12.949807 4903 generic.go:334] "Generic (PLEG): container finished" podID="a01b7051-a365-4645-a249-43deb5b8de75" containerID="974a16da83525ddef6ab7753e595373f9ba868db3614a1cfcf326a1bbb4d8ce6" exitCode=0 Dec 02 23:00:12 crc kubenswrapper[4903]: I1202 23:00:12.949898 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a01b7051-a365-4645-a249-43deb5b8de75","Type":"ContainerDied","Data":"974a16da83525ddef6ab7753e595373f9ba868db3614a1cfcf326a1bbb4d8ce6"} Dec 02 23:00:13 crc kubenswrapper[4903]: I1202 23:00:13.186759 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411940-vlzpv" Dec 02 23:00:13 crc kubenswrapper[4903]: I1202 23:00:13.303820 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dd1b463e-15a8-425d-872e-d1f9683747c2-config-volume\") pod \"dd1b463e-15a8-425d-872e-d1f9683747c2\" (UID: \"dd1b463e-15a8-425d-872e-d1f9683747c2\") " Dec 02 23:00:13 crc kubenswrapper[4903]: I1202 23:00:13.303914 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnczl\" (UniqueName: \"kubernetes.io/projected/dd1b463e-15a8-425d-872e-d1f9683747c2-kube-api-access-jnczl\") pod \"dd1b463e-15a8-425d-872e-d1f9683747c2\" (UID: \"dd1b463e-15a8-425d-872e-d1f9683747c2\") " Dec 02 23:00:13 crc kubenswrapper[4903]: I1202 23:00:13.303971 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dd1b463e-15a8-425d-872e-d1f9683747c2-secret-volume\") pod \"dd1b463e-15a8-425d-872e-d1f9683747c2\" (UID: \"dd1b463e-15a8-425d-872e-d1f9683747c2\") " Dec 02 23:00:13 crc kubenswrapper[4903]: I1202 23:00:13.306048 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd1b463e-15a8-425d-872e-d1f9683747c2-config-volume" (OuterVolumeSpecName: "config-volume") pod "dd1b463e-15a8-425d-872e-d1f9683747c2" (UID: "dd1b463e-15a8-425d-872e-d1f9683747c2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:00:13 crc kubenswrapper[4903]: I1202 23:00:13.319182 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd1b463e-15a8-425d-872e-d1f9683747c2-kube-api-access-jnczl" (OuterVolumeSpecName: "kube-api-access-jnczl") pod "dd1b463e-15a8-425d-872e-d1f9683747c2" (UID: "dd1b463e-15a8-425d-872e-d1f9683747c2"). InnerVolumeSpecName "kube-api-access-jnczl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:00:13 crc kubenswrapper[4903]: I1202 23:00:13.319268 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd1b463e-15a8-425d-872e-d1f9683747c2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "dd1b463e-15a8-425d-872e-d1f9683747c2" (UID: "dd1b463e-15a8-425d-872e-d1f9683747c2"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:00:13 crc kubenswrapper[4903]: I1202 23:00:13.405596 4903 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dd1b463e-15a8-425d-872e-d1f9683747c2-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:13 crc kubenswrapper[4903]: I1202 23:00:13.405630 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnczl\" (UniqueName: \"kubernetes.io/projected/dd1b463e-15a8-425d-872e-d1f9683747c2-kube-api-access-jnczl\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:13 crc kubenswrapper[4903]: I1202 23:00:13.405642 4903 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dd1b463e-15a8-425d-872e-d1f9683747c2-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:13 crc kubenswrapper[4903]: I1202 23:00:13.496693 4903 patch_prober.go:28] interesting pod/router-default-5444994796-pkxw5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 23:00:13 crc kubenswrapper[4903]: [-]has-synced failed: reason withheld Dec 02 23:00:13 crc kubenswrapper[4903]: [+]process-running ok Dec 02 23:00:13 crc kubenswrapper[4903]: healthz check failed Dec 02 23:00:13 crc kubenswrapper[4903]: I1202 23:00:13.496749 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pkxw5" podUID="d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 23:00:13 crc kubenswrapper[4903]: I1202 23:00:13.876500 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 02 23:00:13 crc kubenswrapper[4903]: E1202 23:00:13.876855 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd1b463e-15a8-425d-872e-d1f9683747c2" containerName="collect-profiles" Dec 02 23:00:13 crc kubenswrapper[4903]: I1202 23:00:13.876872 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd1b463e-15a8-425d-872e-d1f9683747c2" containerName="collect-profiles" Dec 02 23:00:13 crc kubenswrapper[4903]: I1202 23:00:13.877014 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd1b463e-15a8-425d-872e-d1f9683747c2" containerName="collect-profiles" Dec 02 23:00:13 crc kubenswrapper[4903]: I1202 23:00:13.877524 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 23:00:13 crc kubenswrapper[4903]: I1202 23:00:13.877922 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 02 23:00:13 crc kubenswrapper[4903]: I1202 23:00:13.919932 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 02 23:00:13 crc kubenswrapper[4903]: I1202 23:00:13.920145 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 02 23:00:13 crc kubenswrapper[4903]: I1202 23:00:13.970248 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411940-vlzpv" Dec 02 23:00:13 crc kubenswrapper[4903]: I1202 23:00:13.970969 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411940-vlzpv" event={"ID":"dd1b463e-15a8-425d-872e-d1f9683747c2","Type":"ContainerDied","Data":"e294fa2b8057e6dc8fa9cb2a6fc6be5ec9241db486727fcf48e26500304adb9e"} Dec 02 23:00:13 crc kubenswrapper[4903]: I1202 23:00:13.971001 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e294fa2b8057e6dc8fa9cb2a6fc6be5ec9241db486727fcf48e26500304adb9e" Dec 02 23:00:14 crc kubenswrapper[4903]: I1202 23:00:14.022714 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2f3b20e4-9763-4c7f-a852-2a96dee4dce2-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"2f3b20e4-9763-4c7f-a852-2a96dee4dce2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 23:00:14 crc kubenswrapper[4903]: I1202 23:00:14.022763 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2f3b20e4-9763-4c7f-a852-2a96dee4dce2-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"2f3b20e4-9763-4c7f-a852-2a96dee4dce2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 23:00:14 crc kubenswrapper[4903]: I1202 23:00:14.123934 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2f3b20e4-9763-4c7f-a852-2a96dee4dce2-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"2f3b20e4-9763-4c7f-a852-2a96dee4dce2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 23:00:14 crc kubenswrapper[4903]: I1202 23:00:14.124362 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2f3b20e4-9763-4c7f-a852-2a96dee4dce2-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"2f3b20e4-9763-4c7f-a852-2a96dee4dce2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 23:00:14 crc kubenswrapper[4903]: I1202 23:00:14.124488 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2f3b20e4-9763-4c7f-a852-2a96dee4dce2-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"2f3b20e4-9763-4c7f-a852-2a96dee4dce2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 23:00:14 crc kubenswrapper[4903]: I1202 23:00:14.141832 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2f3b20e4-9763-4c7f-a852-2a96dee4dce2-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"2f3b20e4-9763-4c7f-a852-2a96dee4dce2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 23:00:14 crc kubenswrapper[4903]: I1202 23:00:14.238811 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 23:00:14 crc kubenswrapper[4903]: I1202 23:00:14.496208 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 23:00:14 crc kubenswrapper[4903]: I1202 23:00:14.496780 4903 patch_prober.go:28] interesting pod/router-default-5444994796-pkxw5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 23:00:14 crc kubenswrapper[4903]: [-]has-synced failed: reason withheld Dec 02 23:00:14 crc kubenswrapper[4903]: [+]process-running ok Dec 02 23:00:14 crc kubenswrapper[4903]: healthz check failed Dec 02 23:00:14 crc kubenswrapper[4903]: I1202 23:00:14.496829 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pkxw5" podUID="d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 23:00:14 crc kubenswrapper[4903]: I1202 23:00:14.543822 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 02 23:00:14 crc kubenswrapper[4903]: W1202 23:00:14.559203 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2f3b20e4_9763_4c7f_a852_2a96dee4dce2.slice/crio-75952b13306209daaa500143b5dda6b54c81b46c8efa160d6196afb5a4372627 WatchSource:0}: Error finding container 75952b13306209daaa500143b5dda6b54c81b46c8efa160d6196afb5a4372627: Status 404 returned error can't find the container with id 75952b13306209daaa500143b5dda6b54c81b46c8efa160d6196afb5a4372627 Dec 02 23:00:14 crc kubenswrapper[4903]: I1202 23:00:14.636592 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a01b7051-a365-4645-a249-43deb5b8de75-kubelet-dir\") pod \"a01b7051-a365-4645-a249-43deb5b8de75\" (UID: \"a01b7051-a365-4645-a249-43deb5b8de75\") " Dec 02 23:00:14 crc kubenswrapper[4903]: I1202 23:00:14.636832 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a01b7051-a365-4645-a249-43deb5b8de75-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a01b7051-a365-4645-a249-43deb5b8de75" (UID: "a01b7051-a365-4645-a249-43deb5b8de75"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:00:14 crc kubenswrapper[4903]: I1202 23:00:14.637239 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a01b7051-a365-4645-a249-43deb5b8de75-kube-api-access\") pod \"a01b7051-a365-4645-a249-43deb5b8de75\" (UID: \"a01b7051-a365-4645-a249-43deb5b8de75\") " Dec 02 23:00:14 crc kubenswrapper[4903]: I1202 23:00:14.637603 4903 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a01b7051-a365-4645-a249-43deb5b8de75-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:14 crc kubenswrapper[4903]: I1202 23:00:14.640829 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a01b7051-a365-4645-a249-43deb5b8de75-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a01b7051-a365-4645-a249-43deb5b8de75" (UID: "a01b7051-a365-4645-a249-43deb5b8de75"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:00:14 crc kubenswrapper[4903]: I1202 23:00:14.739705 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a01b7051-a365-4645-a249-43deb5b8de75-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:14 crc kubenswrapper[4903]: I1202 23:00:14.983881 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 23:00:14 crc kubenswrapper[4903]: I1202 23:00:14.984583 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a01b7051-a365-4645-a249-43deb5b8de75","Type":"ContainerDied","Data":"1cf57ab7acf87037da91fe4f76d0556bc25f88e3c1d69181b3d96657b7283c62"} Dec 02 23:00:14 crc kubenswrapper[4903]: I1202 23:00:14.984638 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cf57ab7acf87037da91fe4f76d0556bc25f88e3c1d69181b3d96657b7283c62" Dec 02 23:00:14 crc kubenswrapper[4903]: I1202 23:00:14.991772 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"2f3b20e4-9763-4c7f-a852-2a96dee4dce2","Type":"ContainerStarted","Data":"75952b13306209daaa500143b5dda6b54c81b46c8efa160d6196afb5a4372627"} Dec 02 23:00:15 crc kubenswrapper[4903]: I1202 23:00:15.195400 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 23:00:15 crc kubenswrapper[4903]: I1202 23:00:15.199452 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-ffb4r" Dec 02 23:00:15 crc kubenswrapper[4903]: I1202 23:00:15.496582 4903 patch_prober.go:28] interesting pod/router-default-5444994796-pkxw5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 23:00:15 crc kubenswrapper[4903]: [-]has-synced failed: reason withheld Dec 02 23:00:15 crc kubenswrapper[4903]: [+]process-running ok Dec 02 23:00:15 crc kubenswrapper[4903]: healthz check failed Dec 02 23:00:15 crc kubenswrapper[4903]: I1202 23:00:15.496637 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pkxw5" podUID="d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 23:00:16 crc kubenswrapper[4903]: I1202 23:00:16.015165 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"2f3b20e4-9763-4c7f-a852-2a96dee4dce2","Type":"ContainerStarted","Data":"b15e672fef2a7ea472024f9d6cc0d090bdd839c26f623bc27b8c2fe8dff83020"} Dec 02 23:00:16 crc kubenswrapper[4903]: I1202 23:00:16.501269 4903 patch_prober.go:28] interesting pod/router-default-5444994796-pkxw5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 23:00:16 crc kubenswrapper[4903]: [-]has-synced failed: reason withheld Dec 02 23:00:16 crc kubenswrapper[4903]: [+]process-running ok Dec 02 23:00:16 crc kubenswrapper[4903]: healthz check failed Dec 02 23:00:16 crc kubenswrapper[4903]: I1202 23:00:16.501324 4903 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-ingress/router-default-5444994796-pkxw5" podUID="d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 23:00:17 crc kubenswrapper[4903]: I1202 23:00:17.021682 4903 generic.go:334] "Generic (PLEG): container finished" podID="2f3b20e4-9763-4c7f-a852-2a96dee4dce2" containerID="b15e672fef2a7ea472024f9d6cc0d090bdd839c26f623bc27b8c2fe8dff83020" exitCode=0 Dec 02 23:00:17 crc kubenswrapper[4903]: I1202 23:00:17.021729 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"2f3b20e4-9763-4c7f-a852-2a96dee4dce2","Type":"ContainerDied","Data":"b15e672fef2a7ea472024f9d6cc0d090bdd839c26f623bc27b8c2fe8dff83020"} Dec 02 23:00:17 crc kubenswrapper[4903]: I1202 23:00:17.101326 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-wwsfb" Dec 02 23:00:17 crc kubenswrapper[4903]: I1202 23:00:17.497417 4903 patch_prober.go:28] interesting pod/router-default-5444994796-pkxw5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 23:00:17 crc kubenswrapper[4903]: [-]has-synced failed: reason withheld Dec 02 23:00:17 crc kubenswrapper[4903]: [+]process-running ok Dec 02 23:00:17 crc kubenswrapper[4903]: healthz check failed Dec 02 23:00:17 crc kubenswrapper[4903]: I1202 23:00:17.497495 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pkxw5" podUID="d6da46c9-ab2f-4e27-9fdd-a1b3ba57e745" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 23:00:18 crc kubenswrapper[4903]: I1202 23:00:18.498447 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-pkxw5" Dec 02 23:00:18 crc kubenswrapper[4903]: I1202 23:00:18.501071 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-pkxw5" Dec 02 23:00:18 crc kubenswrapper[4903]: I1202 23:00:18.784712 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 23:00:18 crc kubenswrapper[4903]: I1202 23:00:18.837250 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-zzfgm" Dec 02 23:00:21 crc kubenswrapper[4903]: I1202 23:00:21.659025 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-tjt6x" Dec 02 23:00:21 crc kubenswrapper[4903]: I1202 23:00:21.664973 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-tjt6x" Dec 02 23:00:22 crc kubenswrapper[4903]: I1202 23:00:22.075751 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"2f3b20e4-9763-4c7f-a852-2a96dee4dce2","Type":"ContainerDied","Data":"75952b13306209daaa500143b5dda6b54c81b46c8efa160d6196afb5a4372627"} Dec 02 23:00:22 crc kubenswrapper[4903]: I1202 23:00:22.075824 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75952b13306209daaa500143b5dda6b54c81b46c8efa160d6196afb5a4372627" Dec 02 23:00:22 crc kubenswrapper[4903]: I1202 23:00:22.134788 4903 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 23:00:22 crc kubenswrapper[4903]: I1202 23:00:22.250895 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2f3b20e4-9763-4c7f-a852-2a96dee4dce2-kubelet-dir\") pod \"2f3b20e4-9763-4c7f-a852-2a96dee4dce2\" (UID: \"2f3b20e4-9763-4c7f-a852-2a96dee4dce2\") " Dec 02 23:00:22 crc kubenswrapper[4903]: I1202 23:00:22.251241 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2f3b20e4-9763-4c7f-a852-2a96dee4dce2-kube-api-access\") pod \"2f3b20e4-9763-4c7f-a852-2a96dee4dce2\" (UID: \"2f3b20e4-9763-4c7f-a852-2a96dee4dce2\") " Dec 02 23:00:22 crc kubenswrapper[4903]: I1202 23:00:22.251008 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f3b20e4-9763-4c7f-a852-2a96dee4dce2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "2f3b20e4-9763-4c7f-a852-2a96dee4dce2" (UID: "2f3b20e4-9763-4c7f-a852-2a96dee4dce2"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:00:22 crc kubenswrapper[4903]: I1202 23:00:22.251637 4903 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2f3b20e4-9763-4c7f-a852-2a96dee4dce2-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:22 crc kubenswrapper[4903]: I1202 23:00:22.260029 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f3b20e4-9763-4c7f-a852-2a96dee4dce2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "2f3b20e4-9763-4c7f-a852-2a96dee4dce2" (UID: "2f3b20e4-9763-4c7f-a852-2a96dee4dce2"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:00:22 crc kubenswrapper[4903]: I1202 23:00:22.353431 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2f3b20e4-9763-4c7f-a852-2a96dee4dce2-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:23 crc kubenswrapper[4903]: I1202 23:00:23.070518 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:00:23 crc kubenswrapper[4903]: I1202 23:00:23.070620 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:00:23 crc kubenswrapper[4903]: I1202 23:00:23.082974 4903 util.go:48] "No ready sandbox for pod can be found. 
Dec 02 23:00:23 crc kubenswrapper[4903]: I1202 23:00:23.165730 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7bdaec4-1392-4f87-ba0b-f53c76e47cf4-metrics-certs\") pod \"network-metrics-daemon-8vx6p\" (UID: \"e7bdaec4-1392-4f87-ba0b-f53c76e47cf4\") " pod="openshift-multus/network-metrics-daemon-8vx6p"
Dec 02 23:00:23 crc kubenswrapper[4903]: I1202 23:00:23.170233 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7bdaec4-1392-4f87-ba0b-f53c76e47cf4-metrics-certs\") pod \"network-metrics-daemon-8vx6p\" (UID: \"e7bdaec4-1392-4f87-ba0b-f53c76e47cf4\") " pod="openshift-multus/network-metrics-daemon-8vx6p"
Dec 02 23:00:23 crc kubenswrapper[4903]: I1202 23:00:23.198025 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vx6p"
Dec 02 23:00:29 crc kubenswrapper[4903]: I1202 23:00:29.237940 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz"
Dec 02 23:00:41 crc kubenswrapper[4903]: E1202 23:00:41.945060 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Dec 02 23:00:41 crc kubenswrapper[4903]: E1202 23:00:41.945530 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lztrf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-t9h5w_openshift-marketplace(0bbe2304-240f-4e4e-8d89-5d40d3017568): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 02 23:00:41 crc kubenswrapper[4903]: E1202 23:00:41.946828 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-t9h5w" podUID="0bbe2304-240f-4e4e-8d89-5d40d3017568"
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-t9h5w" podUID="0bbe2304-240f-4e4e-8d89-5d40d3017568" Dec 02 23:00:41 crc kubenswrapper[4903]: I1202 23:00:41.947404 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zh8xk" Dec 02 23:00:42 crc kubenswrapper[4903]: E1202 23:00:42.618430 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 02 23:00:42 crc kubenswrapper[4903]: E1202 23:00:42.618600 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fr85h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-w4ft5_openshift-marketplace(8ef1517f-01d9-44a7-a73c-9ed3d727a8cf): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 23:00:42 crc kubenswrapper[4903]: E1202 23:00:42.619828 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-w4ft5" podUID="8ef1517f-01d9-44a7-a73c-9ed3d727a8cf" Dec 02 23:00:45 crc kubenswrapper[4903]: E1202 23:00:45.686772 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-w4ft5" 
podUID="8ef1517f-01d9-44a7-a73c-9ed3d727a8cf" Dec 02 23:00:45 crc kubenswrapper[4903]: E1202 23:00:45.686985 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-t9h5w" podUID="0bbe2304-240f-4e4e-8d89-5d40d3017568" Dec 02 23:00:47 crc kubenswrapper[4903]: E1202 23:00:47.585111 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 02 23:00:47 crc kubenswrapper[4903]: E1202 23:00:47.585297 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pknkc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-n8bdg_openshift-marketplace(e0630099-58c0-41ce-8817-013f7bff4749): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 23:00:47 crc kubenswrapper[4903]: E1202 23:00:47.586501 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-n8bdg" podUID="e0630099-58c0-41ce-8817-013f7bff4749" Dec 02 23:00:47 crc kubenswrapper[4903]: I1202 23:00:47.878033 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 02 23:00:47 crc kubenswrapper[4903]: E1202 23:00:47.878341 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a01b7051-a365-4645-a249-43deb5b8de75" containerName="pruner" Dec 02 23:00:47 crc kubenswrapper[4903]: I1202 23:00:47.878354 4903 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a01b7051-a365-4645-a249-43deb5b8de75" containerName="pruner" Dec 02 23:00:47 crc kubenswrapper[4903]: E1202 23:00:47.878364 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f3b20e4-9763-4c7f-a852-2a96dee4dce2" containerName="pruner" Dec 02 23:00:47 crc kubenswrapper[4903]: I1202 23:00:47.878370 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f3b20e4-9763-4c7f-a852-2a96dee4dce2" containerName="pruner" Dec 02 23:00:47 crc kubenswrapper[4903]: I1202 23:00:47.878467 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f3b20e4-9763-4c7f-a852-2a96dee4dce2" containerName="pruner" Dec 02 23:00:47 crc kubenswrapper[4903]: I1202 23:00:47.878482 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="a01b7051-a365-4645-a249-43deb5b8de75" containerName="pruner" Dec 02 23:00:47 crc kubenswrapper[4903]: I1202 23:00:47.879050 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 23:00:47 crc kubenswrapper[4903]: I1202 23:00:47.881030 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 02 23:00:47 crc kubenswrapper[4903]: I1202 23:00:47.881701 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 02 23:00:47 crc kubenswrapper[4903]: I1202 23:00:47.881748 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 02 23:00:48 crc kubenswrapper[4903]: I1202 23:00:48.031178 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e55c6a18-8a14-483a-86d2-3028e43951c4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e55c6a18-8a14-483a-86d2-3028e43951c4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 23:00:48 crc kubenswrapper[4903]: I1202 23:00:48.031443 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e55c6a18-8a14-483a-86d2-3028e43951c4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e55c6a18-8a14-483a-86d2-3028e43951c4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 23:00:48 crc kubenswrapper[4903]: I1202 23:00:48.132807 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e55c6a18-8a14-483a-86d2-3028e43951c4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e55c6a18-8a14-483a-86d2-3028e43951c4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 23:00:48 crc kubenswrapper[4903]: I1202 23:00:48.132907 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e55c6a18-8a14-483a-86d2-3028e43951c4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e55c6a18-8a14-483a-86d2-3028e43951c4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 23:00:48 crc kubenswrapper[4903]: I1202 23:00:48.133208 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e55c6a18-8a14-483a-86d2-3028e43951c4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e55c6a18-8a14-483a-86d2-3028e43951c4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 23:00:48 crc 
Dec 02 23:00:48 crc kubenswrapper[4903]: I1202 23:00:48.153009 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e55c6a18-8a14-483a-86d2-3028e43951c4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e55c6a18-8a14-483a-86d2-3028e43951c4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 02 23:00:48 crc kubenswrapper[4903]: I1202 23:00:48.249055 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 02 23:00:48 crc kubenswrapper[4903]: I1202 23:00:48.850992 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 23:00:49 crc kubenswrapper[4903]: E1202 23:00:49.487260 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-n8bdg" podUID="e0630099-58c0-41ce-8817-013f7bff4749"
Dec 02 23:00:49 crc kubenswrapper[4903]: E1202 23:00:49.700145 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Dec 02 23:00:49 crc kubenswrapper[4903]: E1202 23:00:49.700294 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bw225,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-hhwns_openshift-marketplace(ca1fbb27-3875-43c7-8013-f22c6112be2b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 02 23:00:49 crc kubenswrapper[4903]: E1202 23:00:49.701795 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-hhwns" podUID="ca1fbb27-3875-43c7-8013-f22c6112be2b"
ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-hhwns" podUID="ca1fbb27-3875-43c7-8013-f22c6112be2b" Dec 02 23:00:49 crc kubenswrapper[4903]: E1202 23:00:49.746274 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 02 23:00:49 crc kubenswrapper[4903]: E1202 23:00:49.746443 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8flqn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-tqg9j_openshift-marketplace(c0a3d9b9-b736-4927-a6b2-9e5f25faf956): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 23:00:49 crc kubenswrapper[4903]: E1202 23:00:49.747724 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-tqg9j" podUID="c0a3d9b9-b736-4927-a6b2-9e5f25faf956" Dec 02 23:00:49 crc kubenswrapper[4903]: E1202 23:00:49.757000 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 02 23:00:49 crc kubenswrapper[4903]: E1202 23:00:49.757166 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
Dec 02 23:00:49 crc kubenswrapper[4903]: E1202 23:00:49.758422 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-v4mbf" podUID="80fb141b-48f4-4f70-afd8-78fdb7b3c20c"
Dec 02 23:00:51 crc kubenswrapper[4903]: E1202 23:00:51.054229 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-v4mbf" podUID="80fb141b-48f4-4f70-afd8-78fdb7b3c20c"
Dec 02 23:00:51 crc kubenswrapper[4903]: E1202 23:00:51.054256 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-hhwns" podUID="ca1fbb27-3875-43c7-8013-f22c6112be2b"
Dec 02 23:00:51 crc kubenswrapper[4903]: E1202 23:00:51.061945 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-tqg9j" podUID="c0a3d9b9-b736-4927-a6b2-9e5f25faf956"
Dec 02 23:00:51 crc kubenswrapper[4903]: E1202 23:00:51.136458 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qs6fx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-krzz4_openshift-marketplace(e4aae2be-27f2-43df-a96d-9d2fbc198a6f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 23:00:51 crc kubenswrapper[4903]: E1202 23:00:51.138619 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-krzz4" podUID="e4aae2be-27f2-43df-a96d-9d2fbc198a6f" Dec 02 23:00:51 crc kubenswrapper[4903]: E1202 23:00:51.168684 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 02 23:00:51 crc kubenswrapper[4903]: E1202 23:00:51.168857 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
Dec 02 23:00:51 crc kubenswrapper[4903]: E1202 23:00:51.171721 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-gqwqz" podUID="cb4f1b53-4bbe-4713-852b-066b5d3fd40c"
Dec 02 23:00:51 crc kubenswrapper[4903]: I1202 23:00:51.469390 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Dec 02 23:00:51 crc kubenswrapper[4903]: W1202 23:00:51.481449 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode55c6a18_8a14_483a_86d2_3028e43951c4.slice/crio-af45ddbcee67ef7417803c7c24c886ffbaeae067f7dd506fc1a6effaf2951036 WatchSource:0}: Error finding container af45ddbcee67ef7417803c7c24c886ffbaeae067f7dd506fc1a6effaf2951036: Status 404 returned error can't find the container with id af45ddbcee67ef7417803c7c24c886ffbaeae067f7dd506fc1a6effaf2951036
Dec 02 23:00:51 crc kubenswrapper[4903]: I1202 23:00:51.541927 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8vx6p"]
Dec 02 23:00:51 crc kubenswrapper[4903]: W1202 23:00:51.557218 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7bdaec4_1392_4f87_ba0b_f53c76e47cf4.slice/crio-33e98f7023af0061a5e3fc2d183f25bde6bbbc67bfe8b83925ac6a1d1228d30a WatchSource:0}: Error finding container 33e98f7023af0061a5e3fc2d183f25bde6bbbc67bfe8b83925ac6a1d1228d30a: Status 404 returned error can't find the container with id 33e98f7023af0061a5e3fc2d183f25bde6bbbc67bfe8b83925ac6a1d1228d30a
Dec 02 23:00:51 crc kubenswrapper[4903]: I1202 23:00:51.879446 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e55c6a18-8a14-483a-86d2-3028e43951c4","Type":"ContainerStarted","Data":"bd564bb3832d5adbc972dbd10151a1d614cd858a4fb6eda91b08a9a4bda46c13"}
event={"ID":"e55c6a18-8a14-483a-86d2-3028e43951c4","Type":"ContainerStarted","Data":"bd564bb3832d5adbc972dbd10151a1d614cd858a4fb6eda91b08a9a4bda46c13"} Dec 02 23:00:51 crc kubenswrapper[4903]: I1202 23:00:51.879864 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e55c6a18-8a14-483a-86d2-3028e43951c4","Type":"ContainerStarted","Data":"af45ddbcee67ef7417803c7c24c886ffbaeae067f7dd506fc1a6effaf2951036"} Dec 02 23:00:51 crc kubenswrapper[4903]: I1202 23:00:51.882937 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8vx6p" event={"ID":"e7bdaec4-1392-4f87-ba0b-f53c76e47cf4","Type":"ContainerStarted","Data":"8874164e70271714e11ac27183e25df772842ba60642ae798f448268ec254f97"} Dec 02 23:00:51 crc kubenswrapper[4903]: I1202 23:00:51.884277 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8vx6p" event={"ID":"e7bdaec4-1392-4f87-ba0b-f53c76e47cf4","Type":"ContainerStarted","Data":"33e98f7023af0061a5e3fc2d183f25bde6bbbc67bfe8b83925ac6a1d1228d30a"} Dec 02 23:00:51 crc kubenswrapper[4903]: E1202 23:00:51.907973 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-gqwqz" podUID="cb4f1b53-4bbe-4713-852b-066b5d3fd40c" Dec 02 23:00:51 crc kubenswrapper[4903]: E1202 23:00:51.908345 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-krzz4" podUID="e4aae2be-27f2-43df-a96d-9d2fbc198a6f" Dec 02 23:00:51 crc kubenswrapper[4903]: I1202 23:00:51.911056 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=4.9110406399999995 podStartE2EDuration="4.91104064s" podCreationTimestamp="2025-12-02 23:00:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:51.901049367 +0000 UTC m=+190.609603660" watchObservedRunningTime="2025-12-02 23:00:51.91104064 +0000 UTC m=+190.619594933" Dec 02 23:00:52 crc kubenswrapper[4903]: I1202 23:00:52.892038 4903 generic.go:334] "Generic (PLEG): container finished" podID="e55c6a18-8a14-483a-86d2-3028e43951c4" containerID="bd564bb3832d5adbc972dbd10151a1d614cd858a4fb6eda91b08a9a4bda46c13" exitCode=0 Dec 02 23:00:52 crc kubenswrapper[4903]: I1202 23:00:52.892139 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e55c6a18-8a14-483a-86d2-3028e43951c4","Type":"ContainerDied","Data":"bd564bb3832d5adbc972dbd10151a1d614cd858a4fb6eda91b08a9a4bda46c13"} Dec 02 23:00:52 crc kubenswrapper[4903]: I1202 23:00:52.896865 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8vx6p" event={"ID":"e7bdaec4-1392-4f87-ba0b-f53c76e47cf4","Type":"ContainerStarted","Data":"ab652924800f51ebd588a82f4903b5e98591f9ce116464a807be4e12c42afd25"} Dec 02 23:00:52 crc kubenswrapper[4903]: I1202 23:00:52.947739 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-8vx6p" podStartSLOduration=171.947712073 podStartE2EDuration="2m51.947712073s" podCreationTimestamp="2025-12-02 22:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:52.945946258 +0000 UTC m=+191.654500611" watchObservedRunningTime="2025-12-02 23:00:52.947712073 +0000 UTC m=+191.656266386" Dec 02 23:00:53 crc kubenswrapper[4903]: I1202 23:00:53.070415 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:00:53 crc kubenswrapper[4903]: I1202 23:00:53.070499 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:00:54 crc kubenswrapper[4903]: I1202 23:00:54.311687 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 23:00:54 crc kubenswrapper[4903]: I1202 23:00:54.425488 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e55c6a18-8a14-483a-86d2-3028e43951c4-kubelet-dir\") pod \"e55c6a18-8a14-483a-86d2-3028e43951c4\" (UID: \"e55c6a18-8a14-483a-86d2-3028e43951c4\") " Dec 02 23:00:54 crc kubenswrapper[4903]: I1202 23:00:54.425605 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e55c6a18-8a14-483a-86d2-3028e43951c4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e55c6a18-8a14-483a-86d2-3028e43951c4" (UID: "e55c6a18-8a14-483a-86d2-3028e43951c4"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:00:54 crc kubenswrapper[4903]: I1202 23:00:54.425732 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e55c6a18-8a14-483a-86d2-3028e43951c4-kube-api-access\") pod \"e55c6a18-8a14-483a-86d2-3028e43951c4\" (UID: \"e55c6a18-8a14-483a-86d2-3028e43951c4\") " Dec 02 23:00:54 crc kubenswrapper[4903]: I1202 23:00:54.426128 4903 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e55c6a18-8a14-483a-86d2-3028e43951c4-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:54 crc kubenswrapper[4903]: I1202 23:00:54.431485 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e55c6a18-8a14-483a-86d2-3028e43951c4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e55c6a18-8a14-483a-86d2-3028e43951c4" (UID: "e55c6a18-8a14-483a-86d2-3028e43951c4"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:00:54 crc kubenswrapper[4903]: I1202 23:00:54.467044 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 02 23:00:54 crc kubenswrapper[4903]: E1202 23:00:54.467300 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e55c6a18-8a14-483a-86d2-3028e43951c4" containerName="pruner" Dec 02 23:00:54 crc kubenswrapper[4903]: I1202 23:00:54.467315 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="e55c6a18-8a14-483a-86d2-3028e43951c4" containerName="pruner" Dec 02 23:00:54 crc kubenswrapper[4903]: I1202 23:00:54.467466 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="e55c6a18-8a14-483a-86d2-3028e43951c4" containerName="pruner" Dec 02 23:00:54 crc kubenswrapper[4903]: I1202 23:00:54.468689 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 02 23:00:54 crc kubenswrapper[4903]: I1202 23:00:54.485717 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 02 23:00:54 crc kubenswrapper[4903]: I1202 23:00:54.527339 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7bed574-70f1-4907-be3c-6cd655b3df0a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"a7bed574-70f1-4907-be3c-6cd655b3df0a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 23:00:54 crc kubenswrapper[4903]: I1202 23:00:54.527413 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a7bed574-70f1-4907-be3c-6cd655b3df0a-var-lock\") pod \"installer-9-crc\" (UID: \"a7bed574-70f1-4907-be3c-6cd655b3df0a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 23:00:54 crc kubenswrapper[4903]: I1202 23:00:54.527450 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7bed574-70f1-4907-be3c-6cd655b3df0a-kube-api-access\") pod \"installer-9-crc\" (UID: \"a7bed574-70f1-4907-be3c-6cd655b3df0a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 23:00:54 crc kubenswrapper[4903]: I1202 23:00:54.527515 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e55c6a18-8a14-483a-86d2-3028e43951c4-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 23:00:54 crc kubenswrapper[4903]: I1202 23:00:54.628991 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7bed574-70f1-4907-be3c-6cd655b3df0a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"a7bed574-70f1-4907-be3c-6cd655b3df0a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 23:00:54 crc kubenswrapper[4903]: I1202 23:00:54.629221 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a7bed574-70f1-4907-be3c-6cd655b3df0a-var-lock\") pod \"installer-9-crc\" (UID: \"a7bed574-70f1-4907-be3c-6cd655b3df0a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 23:00:54 crc kubenswrapper[4903]: I1202 23:00:54.629252 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/a7bed574-70f1-4907-be3c-6cd655b3df0a-kube-api-access\") pod \"installer-9-crc\" (UID: \"a7bed574-70f1-4907-be3c-6cd655b3df0a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 23:00:54 crc kubenswrapper[4903]: I1202 23:00:54.629158 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7bed574-70f1-4907-be3c-6cd655b3df0a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"a7bed574-70f1-4907-be3c-6cd655b3df0a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 23:00:54 crc kubenswrapper[4903]: I1202 23:00:54.629390 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a7bed574-70f1-4907-be3c-6cd655b3df0a-var-lock\") pod \"installer-9-crc\" (UID: \"a7bed574-70f1-4907-be3c-6cd655b3df0a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 23:00:54 crc kubenswrapper[4903]: I1202 23:00:54.645750 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7bed574-70f1-4907-be3c-6cd655b3df0a-kube-api-access\") pod \"installer-9-crc\" (UID: \"a7bed574-70f1-4907-be3c-6cd655b3df0a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 23:00:54 crc kubenswrapper[4903]: I1202 23:00:54.831127 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 02 23:00:54 crc kubenswrapper[4903]: I1202 23:00:54.928306 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e55c6a18-8a14-483a-86d2-3028e43951c4","Type":"ContainerDied","Data":"af45ddbcee67ef7417803c7c24c886ffbaeae067f7dd506fc1a6effaf2951036"} Dec 02 23:00:54 crc kubenswrapper[4903]: I1202 23:00:54.928680 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af45ddbcee67ef7417803c7c24c886ffbaeae067f7dd506fc1a6effaf2951036" Dec 02 23:00:54 crc kubenswrapper[4903]: I1202 23:00:54.928473 4903 util.go:48] "No ready sandbox for pod can be found. 
Dec 02 23:00:55 crc kubenswrapper[4903]: I1202 23:00:55.307797 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Dec 02 23:00:55 crc kubenswrapper[4903]: W1202 23:00:55.319401 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda7bed574_70f1_4907_be3c_6cd655b3df0a.slice/crio-1e61ce592e4335d1026e5acbbb14ce8b528df9dbb0a1791e99405e08baa07829 WatchSource:0}: Error finding container 1e61ce592e4335d1026e5acbbb14ce8b528df9dbb0a1791e99405e08baa07829: Status 404 returned error can't find the container with id 1e61ce592e4335d1026e5acbbb14ce8b528df9dbb0a1791e99405e08baa07829
Dec 02 23:00:55 crc kubenswrapper[4903]: I1202 23:00:55.937827 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a7bed574-70f1-4907-be3c-6cd655b3df0a","Type":"ContainerStarted","Data":"3fdce0e356f2da192517d70482023eb0e6c6852d29241c96898bb277c07fb2a4"}
Dec 02 23:00:55 crc kubenswrapper[4903]: I1202 23:00:55.938219 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a7bed574-70f1-4907-be3c-6cd655b3df0a","Type":"ContainerStarted","Data":"1e61ce592e4335d1026e5acbbb14ce8b528df9dbb0a1791e99405e08baa07829"}
Dec 02 23:00:56 crc kubenswrapper[4903]: I1202 23:00:56.640497 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.6404696210000003 podStartE2EDuration="2.640469621s" podCreationTimestamp="2025-12-02 23:00:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:00:55.960998426 +0000 UTC m=+194.669552789" watchObservedRunningTime="2025-12-02 23:00:56.640469621 +0000 UTC m=+195.349023934"
Dec 02 23:00:57 crc kubenswrapper[4903]: I1202 23:00:57.951961 4903 generic.go:334] "Generic (PLEG): container finished" podID="0bbe2304-240f-4e4e-8d89-5d40d3017568" containerID="364b068142605da84ac356f836e9be677ceac01cb878267101dcb31ab8628f79" exitCode=0
Dec 02 23:00:57 crc kubenswrapper[4903]: I1202 23:00:57.952053 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t9h5w" event={"ID":"0bbe2304-240f-4e4e-8d89-5d40d3017568","Type":"ContainerDied","Data":"364b068142605da84ac356f836e9be677ceac01cb878267101dcb31ab8628f79"}
Dec 02 23:00:58 crc kubenswrapper[4903]: I1202 23:00:58.960771 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t9h5w" event={"ID":"0bbe2304-240f-4e4e-8d89-5d40d3017568","Type":"ContainerStarted","Data":"5c6380e6097f544dae72a63f7fb447c5ff490ca5605fb4d7e62b20abfcb6fe31"}
Dec 02 23:00:58 crc kubenswrapper[4903]: I1202 23:00:58.996613 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-t9h5w" podStartSLOduration=3.519451498 podStartE2EDuration="50.996598204s" podCreationTimestamp="2025-12-02 23:00:08 +0000 UTC" firstStartedPulling="2025-12-02 23:00:10.861429836 +0000 UTC m=+149.569984119" lastFinishedPulling="2025-12-02 23:00:58.338576542 +0000 UTC m=+197.047130825" observedRunningTime="2025-12-02 23:00:58.994532623 +0000 UTC m=+197.703086956" watchObservedRunningTime="2025-12-02 23:00:58.996598204 +0000 UTC m=+197.705152487"
Dec 02 23:00:59 crc kubenswrapper[4903]: I1202 23:00:59.094622 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5ltff"]
Dec 02 23:00:59 crc kubenswrapper[4903]: I1202 23:00:59.357888 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-t9h5w"
Dec 02 23:00:59 crc kubenswrapper[4903]: I1202 23:00:59.358166 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-t9h5w"
Dec 02 23:00:59 crc kubenswrapper[4903]: I1202 23:00:59.974516 4903 generic.go:334] "Generic (PLEG): container finished" podID="8ef1517f-01d9-44a7-a73c-9ed3d727a8cf" containerID="0e9822969a856de25458d4a09f420ba15757a2de815033b71bb09c90f998669b" exitCode=0
Dec 02 23:00:59 crc kubenswrapper[4903]: I1202 23:00:59.975162 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w4ft5" event={"ID":"8ef1517f-01d9-44a7-a73c-9ed3d727a8cf","Type":"ContainerDied","Data":"0e9822969a856de25458d4a09f420ba15757a2de815033b71bb09c90f998669b"}
Dec 02 23:01:00 crc kubenswrapper[4903]: I1202 23:01:00.421679 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-t9h5w" podUID="0bbe2304-240f-4e4e-8d89-5d40d3017568" containerName="registry-server" probeResult="failure" output=<
Dec 02 23:01:00 crc kubenswrapper[4903]: timeout: failed to connect service ":50051" within 1s
Dec 02 23:01:00 crc kubenswrapper[4903]: >
Dec 02 23:01:00 crc kubenswrapper[4903]: I1202 23:01:00.984177 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8bdg" event={"ID":"e0630099-58c0-41ce-8817-013f7bff4749","Type":"ContainerStarted","Data":"7256137130cfe7d17552bcd557ba6e174ed9c06028778a5d287d4d4a748cbf4d"}
Dec 02 23:01:01 crc kubenswrapper[4903]: I1202 23:01:01.991698 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w4ft5" event={"ID":"8ef1517f-01d9-44a7-a73c-9ed3d727a8cf","Type":"ContainerStarted","Data":"6d99a9369b95e8e5c6762d82560470348a34b9ccd76d7a4c55045a7cd2fb34cd"}
Dec 02 23:01:01 crc kubenswrapper[4903]: I1202 23:01:01.994121 4903 generic.go:334] "Generic (PLEG): container finished" podID="e0630099-58c0-41ce-8817-013f7bff4749" containerID="7256137130cfe7d17552bcd557ba6e174ed9c06028778a5d287d4d4a748cbf4d" exitCode=0
Dec 02 23:01:01 crc kubenswrapper[4903]: I1202 23:01:01.994183 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8bdg" event={"ID":"e0630099-58c0-41ce-8817-013f7bff4749","Type":"ContainerDied","Data":"7256137130cfe7d17552bcd557ba6e174ed9c06028778a5d287d4d4a748cbf4d"}
Dec 02 23:01:02 crc kubenswrapper[4903]: I1202 23:01:02.010815 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-w4ft5" podStartSLOduration=3.457617 podStartE2EDuration="54.010793204s" podCreationTimestamp="2025-12-02 23:00:08 +0000 UTC" firstStartedPulling="2025-12-02 23:00:09.81788501 +0000 UTC m=+148.526439293" lastFinishedPulling="2025-12-02 23:01:00.371061184 +0000 UTC m=+199.079615497" observedRunningTime="2025-12-02 23:01:02.007582782 +0000 UTC m=+200.716137075" watchObservedRunningTime="2025-12-02 23:01:02.010793204 +0000 UTC m=+200.719347527"
Dec 02 23:01:04 crc kubenswrapper[4903]: I1202 23:01:04.008301 4903 generic.go:334] "Generic (PLEG): container finished" podID="e4aae2be-27f2-43df-a96d-9d2fbc198a6f" containerID="495514f2d64573dcb6ab8007d870a00b94dc16c8ddf597a7af14504e8149031b" exitCode=0
containerID="495514f2d64573dcb6ab8007d870a00b94dc16c8ddf597a7af14504e8149031b" exitCode=0 Dec 02 23:01:04 crc kubenswrapper[4903]: I1202 23:01:04.008451 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krzz4" event={"ID":"e4aae2be-27f2-43df-a96d-9d2fbc198a6f","Type":"ContainerDied","Data":"495514f2d64573dcb6ab8007d870a00b94dc16c8ddf597a7af14504e8149031b"} Dec 02 23:01:04 crc kubenswrapper[4903]: I1202 23:01:04.029816 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8bdg" event={"ID":"e0630099-58c0-41ce-8817-013f7bff4749","Type":"ContainerStarted","Data":"64a4e1096c14f023200f333616062d7e44375a2823d7b9bd32202fe9f928d04c"} Dec 02 23:01:04 crc kubenswrapper[4903]: I1202 23:01:04.068996 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n8bdg" podStartSLOduration=3.459725047 podStartE2EDuration="55.068979041s" podCreationTimestamp="2025-12-02 23:00:09 +0000 UTC" firstStartedPulling="2025-12-02 23:00:10.893854766 +0000 UTC m=+149.602409049" lastFinishedPulling="2025-12-02 23:01:02.50310876 +0000 UTC m=+201.211663043" observedRunningTime="2025-12-02 23:01:04.067663477 +0000 UTC m=+202.776217760" watchObservedRunningTime="2025-12-02 23:01:04.068979041 +0000 UTC m=+202.777533324" Dec 02 23:01:05 crc kubenswrapper[4903]: I1202 23:01:05.082854 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krzz4" event={"ID":"e4aae2be-27f2-43df-a96d-9d2fbc198a6f","Type":"ContainerStarted","Data":"965ec8eeaf96f09d7063fd981687142312ee567aa0b09f9b74a6ea81b72dfa43"} Dec 02 23:01:05 crc kubenswrapper[4903]: I1202 23:01:05.101734 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-krzz4" podStartSLOduration=3.41594298 podStartE2EDuration="59.101715778s" podCreationTimestamp="2025-12-02 23:00:06 +0000 UTC" firstStartedPulling="2025-12-02 23:00:08.766098015 +0000 UTC m=+147.474652288" lastFinishedPulling="2025-12-02 23:01:04.451870803 +0000 UTC m=+203.160425086" observedRunningTime="2025-12-02 23:01:05.09908183 +0000 UTC m=+203.807636123" watchObservedRunningTime="2025-12-02 23:01:05.101715778 +0000 UTC m=+203.810270071" Dec 02 23:01:06 crc kubenswrapper[4903]: I1202 23:01:06.088695 4903 generic.go:334] "Generic (PLEG): container finished" podID="80fb141b-48f4-4f70-afd8-78fdb7b3c20c" containerID="76fac451defcd4e4a289d8a653d647298ed5c69c8ad45928feaaaf0599fd7185" exitCode=0 Dec 02 23:01:06 crc kubenswrapper[4903]: I1202 23:01:06.088798 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v4mbf" event={"ID":"80fb141b-48f4-4f70-afd8-78fdb7b3c20c","Type":"ContainerDied","Data":"76fac451defcd4e4a289d8a653d647298ed5c69c8ad45928feaaaf0599fd7185"} Dec 02 23:01:06 crc kubenswrapper[4903]: I1202 23:01:06.407435 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-krzz4" Dec 02 23:01:06 crc kubenswrapper[4903]: I1202 23:01:06.407774 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-krzz4" Dec 02 23:01:06 crc kubenswrapper[4903]: I1202 23:01:06.947367 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-krzz4" Dec 02 23:01:08 crc kubenswrapper[4903]: I1202 23:01:08.115485 4903 generic.go:334] "Generic 
(PLEG): container finished" podID="ca1fbb27-3875-43c7-8013-f22c6112be2b" containerID="05d9d15e643a8ca27044cf4664efe9878fd0f32b30aaa6f4f92a9974efbbda3b" exitCode=0 Dec 02 23:01:08 crc kubenswrapper[4903]: I1202 23:01:08.115760 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhwns" event={"ID":"ca1fbb27-3875-43c7-8013-f22c6112be2b","Type":"ContainerDied","Data":"05d9d15e643a8ca27044cf4664efe9878fd0f32b30aaa6f4f92a9974efbbda3b"} Dec 02 23:01:08 crc kubenswrapper[4903]: I1202 23:01:08.118865 4903 generic.go:334] "Generic (PLEG): container finished" podID="c0a3d9b9-b736-4927-a6b2-9e5f25faf956" containerID="b1d5bee7e5d0b2883b4b7d3cf61bdc519d64b7d6a5c342f6acc1505a22b2350a" exitCode=0 Dec 02 23:01:08 crc kubenswrapper[4903]: I1202 23:01:08.118919 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tqg9j" event={"ID":"c0a3d9b9-b736-4927-a6b2-9e5f25faf956","Type":"ContainerDied","Data":"b1d5bee7e5d0b2883b4b7d3cf61bdc519d64b7d6a5c342f6acc1505a22b2350a"} Dec 02 23:01:08 crc kubenswrapper[4903]: I1202 23:01:08.122605 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v4mbf" event={"ID":"80fb141b-48f4-4f70-afd8-78fdb7b3c20c","Type":"ContainerStarted","Data":"402263281be29214692118ff4901d364dee26d395a5286bbdf74ebc70a5c2df1"} Dec 02 23:01:08 crc kubenswrapper[4903]: I1202 23:01:08.125468 4903 generic.go:334] "Generic (PLEG): container finished" podID="cb4f1b53-4bbe-4713-852b-066b5d3fd40c" containerID="6323b3f251eb6d6f74e1770bcaefc7650abdeaee439260ce0eddccf2f3504f71" exitCode=0 Dec 02 23:01:08 crc kubenswrapper[4903]: I1202 23:01:08.125501 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqwqz" event={"ID":"cb4f1b53-4bbe-4713-852b-066b5d3fd40c","Type":"ContainerDied","Data":"6323b3f251eb6d6f74e1770bcaefc7650abdeaee439260ce0eddccf2f3504f71"} Dec 02 23:01:08 crc kubenswrapper[4903]: I1202 23:01:08.183999 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-v4mbf" podStartSLOduration=3.9824311740000002 podStartE2EDuration="1m2.183974139s" podCreationTimestamp="2025-12-02 23:00:06 +0000 UTC" firstStartedPulling="2025-12-02 23:00:08.781888244 +0000 UTC m=+147.490442527" lastFinishedPulling="2025-12-02 23:01:06.983431209 +0000 UTC m=+205.691985492" observedRunningTime="2025-12-02 23:01:08.181472495 +0000 UTC m=+206.890026778" watchObservedRunningTime="2025-12-02 23:01:08.183974139 +0000 UTC m=+206.892528442" Dec 02 23:01:08 crc kubenswrapper[4903]: I1202 23:01:08.944858 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-w4ft5" Dec 02 23:01:08 crc kubenswrapper[4903]: I1202 23:01:08.945177 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-w4ft5" Dec 02 23:01:08 crc kubenswrapper[4903]: I1202 23:01:08.981854 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-w4ft5" Dec 02 23:01:09 crc kubenswrapper[4903]: I1202 23:01:09.132853 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhwns" event={"ID":"ca1fbb27-3875-43c7-8013-f22c6112be2b","Type":"ContainerStarted","Data":"044fc400eaf6f8c4eb28800440500fa7cd23294a7660de2f27f52fd20759fb0f"} Dec 02 23:01:09 crc kubenswrapper[4903]: I1202 23:01:09.135532 
4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tqg9j" event={"ID":"c0a3d9b9-b736-4927-a6b2-9e5f25faf956","Type":"ContainerStarted","Data":"17085c7dc723b7052e986be871feea0d8da704e9b61bffb741a4b1f546785c13"} Dec 02 23:01:09 crc kubenswrapper[4903]: I1202 23:01:09.137710 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqwqz" event={"ID":"cb4f1b53-4bbe-4713-852b-066b5d3fd40c","Type":"ContainerStarted","Data":"fd8762c71e1620bf875e218fca6a6a386f627398136b1c8914415946b5b8810f"} Dec 02 23:01:09 crc kubenswrapper[4903]: I1202 23:01:09.164364 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hhwns" podStartSLOduration=3.431178594 podStartE2EDuration="1m3.16434933s" podCreationTimestamp="2025-12-02 23:00:06 +0000 UTC" firstStartedPulling="2025-12-02 23:00:08.758432881 +0000 UTC m=+147.466987164" lastFinishedPulling="2025-12-02 23:01:08.491603607 +0000 UTC m=+207.200157900" observedRunningTime="2025-12-02 23:01:09.162105403 +0000 UTC m=+207.870659686" watchObservedRunningTime="2025-12-02 23:01:09.16434933 +0000 UTC m=+207.872903613" Dec 02 23:01:09 crc kubenswrapper[4903]: I1202 23:01:09.185429 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gqwqz" podStartSLOduration=3.432776128 podStartE2EDuration="1m3.185409811s" podCreationTimestamp="2025-12-02 23:00:06 +0000 UTC" firstStartedPulling="2025-12-02 23:00:08.753304452 +0000 UTC m=+147.461858735" lastFinishedPulling="2025-12-02 23:01:08.505938125 +0000 UTC m=+207.214492418" observedRunningTime="2025-12-02 23:01:09.181860101 +0000 UTC m=+207.890414384" watchObservedRunningTime="2025-12-02 23:01:09.185409811 +0000 UTC m=+207.893964094" Dec 02 23:01:09 crc kubenswrapper[4903]: I1202 23:01:09.187796 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-w4ft5" Dec 02 23:01:09 crc kubenswrapper[4903]: I1202 23:01:09.206316 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tqg9j" podStartSLOduration=2.626048221 podStartE2EDuration="1m0.206296908s" podCreationTimestamp="2025-12-02 23:00:09 +0000 UTC" firstStartedPulling="2025-12-02 23:00:10.929914277 +0000 UTC m=+149.638468550" lastFinishedPulling="2025-12-02 23:01:08.510162954 +0000 UTC m=+207.218717237" observedRunningTime="2025-12-02 23:01:09.203573748 +0000 UTC m=+207.912128031" watchObservedRunningTime="2025-12-02 23:01:09.206296908 +0000 UTC m=+207.914851191" Dec 02 23:01:09 crc kubenswrapper[4903]: I1202 23:01:09.401704 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-t9h5w" Dec 02 23:01:09 crc kubenswrapper[4903]: I1202 23:01:09.434210 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-t9h5w" Dec 02 23:01:09 crc kubenswrapper[4903]: I1202 23:01:09.803394 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n8bdg" Dec 02 23:01:09 crc kubenswrapper[4903]: I1202 23:01:09.803450 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n8bdg" Dec 02 23:01:09 crc kubenswrapper[4903]: I1202 23:01:09.844694 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-operators-n8bdg" Dec 02 23:01:09 crc kubenswrapper[4903]: I1202 23:01:09.993031 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tqg9j" Dec 02 23:01:09 crc kubenswrapper[4903]: I1202 23:01:09.993071 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tqg9j" Dec 02 23:01:10 crc kubenswrapper[4903]: I1202 23:01:10.183914 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n8bdg" Dec 02 23:01:11 crc kubenswrapper[4903]: I1202 23:01:11.033135 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tqg9j" podUID="c0a3d9b9-b736-4927-a6b2-9e5f25faf956" containerName="registry-server" probeResult="failure" output=< Dec 02 23:01:11 crc kubenswrapper[4903]: timeout: failed to connect service ":50051" within 1s Dec 02 23:01:11 crc kubenswrapper[4903]: > Dec 02 23:01:12 crc kubenswrapper[4903]: I1202 23:01:12.385725 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t9h5w"] Dec 02 23:01:12 crc kubenswrapper[4903]: I1202 23:01:12.385939 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-t9h5w" podUID="0bbe2304-240f-4e4e-8d89-5d40d3017568" containerName="registry-server" containerID="cri-o://5c6380e6097f544dae72a63f7fb447c5ff490ca5605fb4d7e62b20abfcb6fe31" gracePeriod=2 Dec 02 23:01:14 crc kubenswrapper[4903]: I1202 23:01:14.174789 4903 generic.go:334] "Generic (PLEG): container finished" podID="0bbe2304-240f-4e4e-8d89-5d40d3017568" containerID="5c6380e6097f544dae72a63f7fb447c5ff490ca5605fb4d7e62b20abfcb6fe31" exitCode=0 Dec 02 23:01:14 crc kubenswrapper[4903]: I1202 23:01:14.174882 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t9h5w" event={"ID":"0bbe2304-240f-4e4e-8d89-5d40d3017568","Type":"ContainerDied","Data":"5c6380e6097f544dae72a63f7fb447c5ff490ca5605fb4d7e62b20abfcb6fe31"} Dec 02 23:01:14 crc kubenswrapper[4903]: I1202 23:01:14.175135 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t9h5w" event={"ID":"0bbe2304-240f-4e4e-8d89-5d40d3017568","Type":"ContainerDied","Data":"c8920c8d985ab5ebbd7b00a33493abf4357e46686dcfc12a59f01fdafcdef5dc"} Dec 02 23:01:14 crc kubenswrapper[4903]: I1202 23:01:14.175154 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8920c8d985ab5ebbd7b00a33493abf4357e46686dcfc12a59f01fdafcdef5dc" Dec 02 23:01:14 crc kubenswrapper[4903]: I1202 23:01:14.189378 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t9h5w" Dec 02 23:01:14 crc kubenswrapper[4903]: I1202 23:01:14.287528 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bbe2304-240f-4e4e-8d89-5d40d3017568-catalog-content\") pod \"0bbe2304-240f-4e4e-8d89-5d40d3017568\" (UID: \"0bbe2304-240f-4e4e-8d89-5d40d3017568\") " Dec 02 23:01:14 crc kubenswrapper[4903]: I1202 23:01:14.287577 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lztrf\" (UniqueName: \"kubernetes.io/projected/0bbe2304-240f-4e4e-8d89-5d40d3017568-kube-api-access-lztrf\") pod \"0bbe2304-240f-4e4e-8d89-5d40d3017568\" (UID: \"0bbe2304-240f-4e4e-8d89-5d40d3017568\") " Dec 02 23:01:14 crc kubenswrapper[4903]: I1202 23:01:14.287636 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bbe2304-240f-4e4e-8d89-5d40d3017568-utilities\") pod \"0bbe2304-240f-4e4e-8d89-5d40d3017568\" (UID: \"0bbe2304-240f-4e4e-8d89-5d40d3017568\") " Dec 02 23:01:14 crc kubenswrapper[4903]: I1202 23:01:14.288825 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bbe2304-240f-4e4e-8d89-5d40d3017568-utilities" (OuterVolumeSpecName: "utilities") pod "0bbe2304-240f-4e4e-8d89-5d40d3017568" (UID: "0bbe2304-240f-4e4e-8d89-5d40d3017568"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:01:14 crc kubenswrapper[4903]: I1202 23:01:14.293720 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bbe2304-240f-4e4e-8d89-5d40d3017568-kube-api-access-lztrf" (OuterVolumeSpecName: "kube-api-access-lztrf") pod "0bbe2304-240f-4e4e-8d89-5d40d3017568" (UID: "0bbe2304-240f-4e4e-8d89-5d40d3017568"). InnerVolumeSpecName "kube-api-access-lztrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:01:14 crc kubenswrapper[4903]: I1202 23:01:14.312956 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bbe2304-240f-4e4e-8d89-5d40d3017568-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0bbe2304-240f-4e4e-8d89-5d40d3017568" (UID: "0bbe2304-240f-4e4e-8d89-5d40d3017568"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:01:14 crc kubenswrapper[4903]: I1202 23:01:14.390120 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bbe2304-240f-4e4e-8d89-5d40d3017568-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:14 crc kubenswrapper[4903]: I1202 23:01:14.390187 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lztrf\" (UniqueName: \"kubernetes.io/projected/0bbe2304-240f-4e4e-8d89-5d40d3017568-kube-api-access-lztrf\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:14 crc kubenswrapper[4903]: I1202 23:01:14.390206 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bbe2304-240f-4e4e-8d89-5d40d3017568-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:15 crc kubenswrapper[4903]: I1202 23:01:15.182726 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t9h5w" Dec 02 23:01:15 crc kubenswrapper[4903]: I1202 23:01:15.214614 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t9h5w"] Dec 02 23:01:15 crc kubenswrapper[4903]: I1202 23:01:15.218912 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-t9h5w"] Dec 02 23:01:15 crc kubenswrapper[4903]: I1202 23:01:15.625549 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bbe2304-240f-4e4e-8d89-5d40d3017568" path="/var/lib/kubelet/pods/0bbe2304-240f-4e4e-8d89-5d40d3017568/volumes" Dec 02 23:01:16 crc kubenswrapper[4903]: I1202 23:01:16.471942 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-krzz4" Dec 02 23:01:16 crc kubenswrapper[4903]: I1202 23:01:16.800532 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gqwqz" Dec 02 23:01:16 crc kubenswrapper[4903]: I1202 23:01:16.801170 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gqwqz" Dec 02 23:01:16 crc kubenswrapper[4903]: I1202 23:01:16.860857 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gqwqz" Dec 02 23:01:16 crc kubenswrapper[4903]: I1202 23:01:16.983983 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-v4mbf" Dec 02 23:01:16 crc kubenswrapper[4903]: I1202 23:01:16.984055 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-v4mbf" Dec 02 23:01:17 crc kubenswrapper[4903]: I1202 23:01:17.031774 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-v4mbf" Dec 02 23:01:17 crc kubenswrapper[4903]: I1202 23:01:17.232911 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hhwns" Dec 02 23:01:17 crc kubenswrapper[4903]: I1202 23:01:17.233496 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hhwns" Dec 02 23:01:17 crc kubenswrapper[4903]: I1202 23:01:17.262131 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-v4mbf" Dec 02 23:01:17 crc kubenswrapper[4903]: I1202 23:01:17.263137 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gqwqz" Dec 02 23:01:17 crc kubenswrapper[4903]: I1202 23:01:17.317186 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hhwns" Dec 02 23:01:18 crc kubenswrapper[4903]: I1202 23:01:18.270607 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hhwns" Dec 02 23:01:18 crc kubenswrapper[4903]: I1202 23:01:18.790397 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gqwqz"] Dec 02 23:01:20 crc kubenswrapper[4903]: I1202 23:01:20.048766 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tqg9j" Dec 02 23:01:20 crc kubenswrapper[4903]: I1202 
23:01:20.115717 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tqg9j" Dec 02 23:01:20 crc kubenswrapper[4903]: I1202 23:01:20.189695 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hhwns"] Dec 02 23:01:20 crc kubenswrapper[4903]: I1202 23:01:20.216074 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gqwqz" podUID="cb4f1b53-4bbe-4713-852b-066b5d3fd40c" containerName="registry-server" containerID="cri-o://fd8762c71e1620bf875e218fca6a6a386f627398136b1c8914415946b5b8810f" gracePeriod=2 Dec 02 23:01:21 crc kubenswrapper[4903]: I1202 23:01:21.223908 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hhwns" podUID="ca1fbb27-3875-43c7-8013-f22c6112be2b" containerName="registry-server" containerID="cri-o://044fc400eaf6f8c4eb28800440500fa7cd23294a7660de2f27f52fd20759fb0f" gracePeriod=2 Dec 02 23:01:23 crc kubenswrapper[4903]: I1202 23:01:23.069913 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:01:23 crc kubenswrapper[4903]: I1202 23:01:23.070008 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:01:23 crc kubenswrapper[4903]: I1202 23:01:23.070074 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" Dec 02 23:01:23 crc kubenswrapper[4903]: I1202 23:01:23.070896 4903 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757"} pod="openshift-machine-config-operator/machine-config-daemon-snl4q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 23:01:23 crc kubenswrapper[4903]: I1202 23:01:23.071098 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" containerID="cri-o://717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757" gracePeriod=600 Dec 02 23:01:23 crc kubenswrapper[4903]: I1202 23:01:23.233372 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tqg9j"] Dec 02 23:01:23 crc kubenswrapper[4903]: I1202 23:01:23.233675 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tqg9j" podUID="c0a3d9b9-b736-4927-a6b2-9e5f25faf956" containerName="registry-server" containerID="cri-o://17085c7dc723b7052e986be871feea0d8da704e9b61bffb741a4b1f546785c13" gracePeriod=2 Dec 02 23:01:23 crc kubenswrapper[4903]: E1202 23:01:23.673291 4903 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0a3d9b9_b736_4927_a6b2_9e5f25faf956.slice/crio-conmon-17085c7dc723b7052e986be871feea0d8da704e9b61bffb741a4b1f546785c13.scope\": RecentStats: unable to find data in memory cache]" Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.130016 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-5ltff" podUID="8c5aaf2f-377b-4d66-b9b6-671f831e0af1" containerName="oauth-openshift" containerID="cri-o://f392956756d3add487eaa6413a6e189916b1799a5222f7d3f771d2c5d96654eb" gracePeriod=15 Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.297277 4903 generic.go:334] "Generic (PLEG): container finished" podID="c0a3d9b9-b736-4927-a6b2-9e5f25faf956" containerID="17085c7dc723b7052e986be871feea0d8da704e9b61bffb741a4b1f546785c13" exitCode=0 Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.297354 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tqg9j" event={"ID":"c0a3d9b9-b736-4927-a6b2-9e5f25faf956","Type":"ContainerDied","Data":"17085c7dc723b7052e986be871feea0d8da704e9b61bffb741a4b1f546785c13"} Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.300229 4903 generic.go:334] "Generic (PLEG): container finished" podID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerID="717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757" exitCode=0 Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.300306 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerDied","Data":"717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757"} Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.302738 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gqwqz_cb4f1b53-4bbe-4713-852b-066b5d3fd40c/registry-server/0.log" Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.304175 4903 generic.go:334] "Generic (PLEG): container finished" podID="cb4f1b53-4bbe-4713-852b-066b5d3fd40c" containerID="fd8762c71e1620bf875e218fca6a6a386f627398136b1c8914415946b5b8810f" exitCode=137 Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.304268 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqwqz" event={"ID":"cb4f1b53-4bbe-4713-852b-066b5d3fd40c","Type":"ContainerDied","Data":"fd8762c71e1620bf875e218fca6a6a386f627398136b1c8914415946b5b8810f"} Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.305778 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hhwns_ca1fbb27-3875-43c7-8013-f22c6112be2b/registry-server/0.log" Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.306275 4903 generic.go:334] "Generic (PLEG): container finished" podID="ca1fbb27-3875-43c7-8013-f22c6112be2b" containerID="044fc400eaf6f8c4eb28800440500fa7cd23294a7660de2f27f52fd20759fb0f" exitCode=137 Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.306311 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhwns" event={"ID":"ca1fbb27-3875-43c7-8013-f22c6112be2b","Type":"ContainerDied","Data":"044fc400eaf6f8c4eb28800440500fa7cd23294a7660de2f27f52fd20759fb0f"} Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.509203 4903 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-hhwns_ca1fbb27-3875-43c7-8013-f22c6112be2b/registry-server/0.log" Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.510881 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hhwns" Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.643521 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gqwqz_cb4f1b53-4bbe-4713-852b-066b5d3fd40c/registry-server/0.log" Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.643753 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca1fbb27-3875-43c7-8013-f22c6112be2b-catalog-content\") pod \"ca1fbb27-3875-43c7-8013-f22c6112be2b\" (UID: \"ca1fbb27-3875-43c7-8013-f22c6112be2b\") " Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.644106 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bw225\" (UniqueName: \"kubernetes.io/projected/ca1fbb27-3875-43c7-8013-f22c6112be2b-kube-api-access-bw225\") pod \"ca1fbb27-3875-43c7-8013-f22c6112be2b\" (UID: \"ca1fbb27-3875-43c7-8013-f22c6112be2b\") " Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.644281 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca1fbb27-3875-43c7-8013-f22c6112be2b-utilities\") pod \"ca1fbb27-3875-43c7-8013-f22c6112be2b\" (UID: \"ca1fbb27-3875-43c7-8013-f22c6112be2b\") " Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.649076 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gqwqz" Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.650022 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca1fbb27-3875-43c7-8013-f22c6112be2b-utilities" (OuterVolumeSpecName: "utilities") pod "ca1fbb27-3875-43c7-8013-f22c6112be2b" (UID: "ca1fbb27-3875-43c7-8013-f22c6112be2b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.654816 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca1fbb27-3875-43c7-8013-f22c6112be2b-kube-api-access-bw225" (OuterVolumeSpecName: "kube-api-access-bw225") pod "ca1fbb27-3875-43c7-8013-f22c6112be2b" (UID: "ca1fbb27-3875-43c7-8013-f22c6112be2b"). InnerVolumeSpecName "kube-api-access-bw225". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.665079 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tqg9j" Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.706557 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca1fbb27-3875-43c7-8013-f22c6112be2b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca1fbb27-3875-43c7-8013-f22c6112be2b" (UID: "ca1fbb27-3875-43c7-8013-f22c6112be2b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.745858 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb4f1b53-4bbe-4713-852b-066b5d3fd40c-utilities\") pod \"cb4f1b53-4bbe-4713-852b-066b5d3fd40c\" (UID: \"cb4f1b53-4bbe-4713-852b-066b5d3fd40c\") " Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.746194 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb4f1b53-4bbe-4713-852b-066b5d3fd40c-catalog-content\") pod \"cb4f1b53-4bbe-4713-852b-066b5d3fd40c\" (UID: \"cb4f1b53-4bbe-4713-852b-066b5d3fd40c\") " Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.746548 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjswr\" (UniqueName: \"kubernetes.io/projected/cb4f1b53-4bbe-4713-852b-066b5d3fd40c-kube-api-access-qjswr\") pod \"cb4f1b53-4bbe-4713-852b-066b5d3fd40c\" (UID: \"cb4f1b53-4bbe-4713-852b-066b5d3fd40c\") " Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.746996 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb4f1b53-4bbe-4713-852b-066b5d3fd40c-utilities" (OuterVolumeSpecName: "utilities") pod "cb4f1b53-4bbe-4713-852b-066b5d3fd40c" (UID: "cb4f1b53-4bbe-4713-852b-066b5d3fd40c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.747230 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb4f1b53-4bbe-4713-852b-066b5d3fd40c-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.747251 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bw225\" (UniqueName: \"kubernetes.io/projected/ca1fbb27-3875-43c7-8013-f22c6112be2b-kube-api-access-bw225\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.747263 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca1fbb27-3875-43c7-8013-f22c6112be2b-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.747273 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca1fbb27-3875-43c7-8013-f22c6112be2b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.750377 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb4f1b53-4bbe-4713-852b-066b5d3fd40c-kube-api-access-qjswr" (OuterVolumeSpecName: "kube-api-access-qjswr") pod "cb4f1b53-4bbe-4713-852b-066b5d3fd40c" (UID: "cb4f1b53-4bbe-4713-852b-066b5d3fd40c"). InnerVolumeSpecName "kube-api-access-qjswr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.806262 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb4f1b53-4bbe-4713-852b-066b5d3fd40c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cb4f1b53-4bbe-4713-852b-066b5d3fd40c" (UID: "cb4f1b53-4bbe-4713-852b-066b5d3fd40c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.848594 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0a3d9b9-b736-4927-a6b2-9e5f25faf956-catalog-content\") pod \"c0a3d9b9-b736-4927-a6b2-9e5f25faf956\" (UID: \"c0a3d9b9-b736-4927-a6b2-9e5f25faf956\") " Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.848725 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0a3d9b9-b736-4927-a6b2-9e5f25faf956-utilities\") pod \"c0a3d9b9-b736-4927-a6b2-9e5f25faf956\" (UID: \"c0a3d9b9-b736-4927-a6b2-9e5f25faf956\") " Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.848808 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8flqn\" (UniqueName: \"kubernetes.io/projected/c0a3d9b9-b736-4927-a6b2-9e5f25faf956-kube-api-access-8flqn\") pod \"c0a3d9b9-b736-4927-a6b2-9e5f25faf956\" (UID: \"c0a3d9b9-b736-4927-a6b2-9e5f25faf956\") " Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.849070 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjswr\" (UniqueName: \"kubernetes.io/projected/cb4f1b53-4bbe-4713-852b-066b5d3fd40c-kube-api-access-qjswr\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.849095 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb4f1b53-4bbe-4713-852b-066b5d3fd40c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.850615 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0a3d9b9-b736-4927-a6b2-9e5f25faf956-utilities" (OuterVolumeSpecName: "utilities") pod "c0a3d9b9-b736-4927-a6b2-9e5f25faf956" (UID: "c0a3d9b9-b736-4927-a6b2-9e5f25faf956"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.853033 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0a3d9b9-b736-4927-a6b2-9e5f25faf956-kube-api-access-8flqn" (OuterVolumeSpecName: "kube-api-access-8flqn") pod "c0a3d9b9-b736-4927-a6b2-9e5f25faf956" (UID: "c0a3d9b9-b736-4927-a6b2-9e5f25faf956"). InnerVolumeSpecName "kube-api-access-8flqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.950470 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0a3d9b9-b736-4927-a6b2-9e5f25faf956-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.950524 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8flqn\" (UniqueName: \"kubernetes.io/projected/c0a3d9b9-b736-4927-a6b2-9e5f25faf956-kube-api-access-8flqn\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:24 crc kubenswrapper[4903]: I1202 23:01:24.977935 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0a3d9b9-b736-4927-a6b2-9e5f25faf956-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c0a3d9b9-b736-4927-a6b2-9e5f25faf956" (UID: "c0a3d9b9-b736-4927-a6b2-9e5f25faf956"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.052018 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0a3d9b9-b736-4927-a6b2-9e5f25faf956-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.314923 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tqg9j" event={"ID":"c0a3d9b9-b736-4927-a6b2-9e5f25faf956","Type":"ContainerDied","Data":"99040a33ced1127e1c817e016baa230493ba36065e27e6c627f4bcab15d90735"} Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.315003 4903 scope.go:117] "RemoveContainer" containerID="17085c7dc723b7052e986be871feea0d8da704e9b61bffb741a4b1f546785c13" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.314931 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tqg9j" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.319268 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerStarted","Data":"a023dfa17e08be67b1e706e7db3c07733727a68ce47e866a8587e419596f2cac"} Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.335983 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gqwqz_cb4f1b53-4bbe-4713-852b-066b5d3fd40c/registry-server/0.log" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.336973 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqwqz" event={"ID":"cb4f1b53-4bbe-4713-852b-066b5d3fd40c","Type":"ContainerDied","Data":"e293a80b570e1a96c355d73a1b5c8fd93ff5555b320059f5be30c403a8e5173d"} Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.337093 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gqwqz" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.341006 4903 scope.go:117] "RemoveContainer" containerID="b1d5bee7e5d0b2883b4b7d3cf61bdc519d64b7d6a5c342f6acc1505a22b2350a" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.344469 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hhwns_ca1fbb27-3875-43c7-8013-f22c6112be2b/registry-server/0.log" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.349975 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhwns" event={"ID":"ca1fbb27-3875-43c7-8013-f22c6112be2b","Type":"ContainerDied","Data":"0d49bb05c93e759460f98a10bd44ce5262426538c276dd13a4f21d2aabb1cbf6"} Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.350167 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hhwns" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.363317 4903 generic.go:334] "Generic (PLEG): container finished" podID="8c5aaf2f-377b-4d66-b9b6-671f831e0af1" containerID="f392956756d3add487eaa6413a6e189916b1799a5222f7d3f771d2c5d96654eb" exitCode=0 Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.363417 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5ltff" event={"ID":"8c5aaf2f-377b-4d66-b9b6-671f831e0af1","Type":"ContainerDied","Data":"f392956756d3add487eaa6413a6e189916b1799a5222f7d3f771d2c5d96654eb"} Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.384781 4903 scope.go:117] "RemoveContainer" containerID="ebe4e22654cf88f759281dd17ed3061ffb0aeda4f43bf865292f6a86b965b83b" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.459724 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tqg9j"] Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.461360 4903 scope.go:117] "RemoveContainer" containerID="fd8762c71e1620bf875e218fca6a6a386f627398136b1c8914415946b5b8810f" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.467042 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tqg9j"] Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.471922 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gqwqz"] Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.488888 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gqwqz"] Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.498965 4903 scope.go:117] "RemoveContainer" containerID="6323b3f251eb6d6f74e1770bcaefc7650abdeaee439260ce0eddccf2f3504f71" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.501527 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hhwns"] Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.519115 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hhwns"] Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.525366 4903 scope.go:117] "RemoveContainer" containerID="7dd0ad852920d6338920518e744a0f754fb57810ac79784f4d5e5c4fc324b8ef" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.542595 4903 scope.go:117] "RemoveContainer" containerID="044fc400eaf6f8c4eb28800440500fa7cd23294a7660de2f27f52fd20759fb0f" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.568690 4903 scope.go:117] "RemoveContainer" containerID="05d9d15e643a8ca27044cf4664efe9878fd0f32b30aaa6f4f92a9974efbbda3b" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.579792 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5ltff" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.585890 4903 scope.go:117] "RemoveContainer" containerID="c15e422100636cd0a6bb6d483fc22d66581a58714ed509d29a242511efb87046" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.620621 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0a3d9b9-b736-4927-a6b2-9e5f25faf956" path="/var/lib/kubelet/pods/c0a3d9b9-b736-4927-a6b2-9e5f25faf956/volumes" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.621496 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca1fbb27-3875-43c7-8013-f22c6112be2b" path="/var/lib/kubelet/pods/ca1fbb27-3875-43c7-8013-f22c6112be2b/volumes" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.622051 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb4f1b53-4bbe-4713-852b-066b5d3fd40c" path="/var/lib/kubelet/pods/cb4f1b53-4bbe-4713-852b-066b5d3fd40c/volumes" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.763213 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-audit-policies\") pod \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.763266 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-cliconfig\") pod \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.763290 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-trusted-ca-bundle\") pod \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.763329 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-ocp-branding-template\") pod \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.763351 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-user-template-provider-selection\") pod \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.763377 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-user-template-error\") pod \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.763421 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-session\") pod \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.763441 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-audit-dir\") pod \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.763473 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-user-idp-0-file-data\") pod \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.763502 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-serving-cert\") pod \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.763541 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-router-certs\") pod \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.763586 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-user-template-login\") pod \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.763608 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5rt5\" (UniqueName: \"kubernetes.io/projected/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-kube-api-access-f5rt5\") pod \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.763642 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-service-ca\") pod \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\" (UID: \"8c5aaf2f-377b-4d66-b9b6-671f831e0af1\") " Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.764184 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "8c5aaf2f-377b-4d66-b9b6-671f831e0af1" (UID: "8c5aaf2f-377b-4d66-b9b6-671f831e0af1"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.764248 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "8c5aaf2f-377b-4d66-b9b6-671f831e0af1" (UID: "8c5aaf2f-377b-4d66-b9b6-671f831e0af1"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.764312 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "8c5aaf2f-377b-4d66-b9b6-671f831e0af1" (UID: "8c5aaf2f-377b-4d66-b9b6-671f831e0af1"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.764336 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "8c5aaf2f-377b-4d66-b9b6-671f831e0af1" (UID: "8c5aaf2f-377b-4d66-b9b6-671f831e0af1"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.764346 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "8c5aaf2f-377b-4d66-b9b6-671f831e0af1" (UID: "8c5aaf2f-377b-4d66-b9b6-671f831e0af1"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.769105 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "8c5aaf2f-377b-4d66-b9b6-671f831e0af1" (UID: "8c5aaf2f-377b-4d66-b9b6-671f831e0af1"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.769427 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "8c5aaf2f-377b-4d66-b9b6-671f831e0af1" (UID: "8c5aaf2f-377b-4d66-b9b6-671f831e0af1"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.769560 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-kube-api-access-f5rt5" (OuterVolumeSpecName: "kube-api-access-f5rt5") pod "8c5aaf2f-377b-4d66-b9b6-671f831e0af1" (UID: "8c5aaf2f-377b-4d66-b9b6-671f831e0af1"). InnerVolumeSpecName "kube-api-access-f5rt5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.769580 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "8c5aaf2f-377b-4d66-b9b6-671f831e0af1" (UID: "8c5aaf2f-377b-4d66-b9b6-671f831e0af1"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.769877 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "8c5aaf2f-377b-4d66-b9b6-671f831e0af1" (UID: "8c5aaf2f-377b-4d66-b9b6-671f831e0af1"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.770420 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "8c5aaf2f-377b-4d66-b9b6-671f831e0af1" (UID: "8c5aaf2f-377b-4d66-b9b6-671f831e0af1"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.774178 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "8c5aaf2f-377b-4d66-b9b6-671f831e0af1" (UID: "8c5aaf2f-377b-4d66-b9b6-671f831e0af1"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.775203 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "8c5aaf2f-377b-4d66-b9b6-671f831e0af1" (UID: "8c5aaf2f-377b-4d66-b9b6-671f831e0af1"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.776003 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "8c5aaf2f-377b-4d66-b9b6-671f831e0af1" (UID: "8c5aaf2f-377b-4d66-b9b6-671f831e0af1"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.865560 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.865621 4903 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.865644 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.865687 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.865708 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.865727 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.865744 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5rt5\" (UniqueName: \"kubernetes.io/projected/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-kube-api-access-f5rt5\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.865765 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.865784 4903 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.865801 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.865822 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.865839 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-system-ocp-branding-template\") on node \"crc\" 
DevicePath \"\"" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.865857 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:25 crc kubenswrapper[4903]: I1202 23:01:25.865877 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8c5aaf2f-377b-4d66-b9b6-671f831e0af1-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:26 crc kubenswrapper[4903]: I1202 23:01:26.376307 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5ltff" event={"ID":"8c5aaf2f-377b-4d66-b9b6-671f831e0af1","Type":"ContainerDied","Data":"af14b9d2f0819e8119454193efda106b0e52add0cbdbb4e93bf86a92c007f4c2"} Dec 02 23:01:26 crc kubenswrapper[4903]: I1202 23:01:26.376795 4903 scope.go:117] "RemoveContainer" containerID="f392956756d3add487eaa6413a6e189916b1799a5222f7d3f771d2c5d96654eb" Dec 02 23:01:26 crc kubenswrapper[4903]: I1202 23:01:26.376889 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5ltff" Dec 02 23:01:26 crc kubenswrapper[4903]: I1202 23:01:26.410136 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5ltff"] Dec 02 23:01:26 crc kubenswrapper[4903]: I1202 23:01:26.416720 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5ltff"] Dec 02 23:01:27 crc kubenswrapper[4903]: I1202 23:01:27.620457 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c5aaf2f-377b-4d66-b9b6-671f831e0af1" path="/var/lib/kubelet/pods/8c5aaf2f-377b-4d66-b9b6-671f831e0af1/volumes" Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.213812 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"] Dec 02 23:01:29 crc kubenswrapper[4903]: E1202 23:01:29.214044 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0a3d9b9-b736-4927-a6b2-9e5f25faf956" containerName="extract-utilities" Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.214057 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0a3d9b9-b736-4927-a6b2-9e5f25faf956" containerName="extract-utilities" Dec 02 23:01:29 crc kubenswrapper[4903]: E1202 23:01:29.214070 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb4f1b53-4bbe-4713-852b-066b5d3fd40c" containerName="extract-utilities" Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.214078 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb4f1b53-4bbe-4713-852b-066b5d3fd40c" containerName="extract-utilities" Dec 02 23:01:29 crc kubenswrapper[4903]: E1202 23:01:29.214089 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bbe2304-240f-4e4e-8d89-5d40d3017568" containerName="extract-content" Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.214097 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bbe2304-240f-4e4e-8d89-5d40d3017568" containerName="extract-content" Dec 02 23:01:29 crc kubenswrapper[4903]: E1202 23:01:29.214113 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca1fbb27-3875-43c7-8013-f22c6112be2b" containerName="registry-server" 
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.213812 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"]
Dec 02 23:01:29 crc kubenswrapper[4903]: E1202 23:01:29.214044 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0a3d9b9-b736-4927-a6b2-9e5f25faf956" containerName="extract-utilities"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.214057 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0a3d9b9-b736-4927-a6b2-9e5f25faf956" containerName="extract-utilities"
Dec 02 23:01:29 crc kubenswrapper[4903]: E1202 23:01:29.214070 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb4f1b53-4bbe-4713-852b-066b5d3fd40c" containerName="extract-utilities"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.214078 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb4f1b53-4bbe-4713-852b-066b5d3fd40c" containerName="extract-utilities"
Dec 02 23:01:29 crc kubenswrapper[4903]: E1202 23:01:29.214089 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bbe2304-240f-4e4e-8d89-5d40d3017568" containerName="extract-content"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.214097 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bbe2304-240f-4e4e-8d89-5d40d3017568" containerName="extract-content"
Dec 02 23:01:29 crc kubenswrapper[4903]: E1202 23:01:29.214113 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca1fbb27-3875-43c7-8013-f22c6112be2b" containerName="registry-server"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.214120 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca1fbb27-3875-43c7-8013-f22c6112be2b" containerName="registry-server"
Dec 02 23:01:29 crc kubenswrapper[4903]: E1202 23:01:29.214131 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb4f1b53-4bbe-4713-852b-066b5d3fd40c" containerName="extract-content"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.214141 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb4f1b53-4bbe-4713-852b-066b5d3fd40c" containerName="extract-content"
Dec 02 23:01:29 crc kubenswrapper[4903]: E1202 23:01:29.214152 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c5aaf2f-377b-4d66-b9b6-671f831e0af1" containerName="oauth-openshift"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.214160 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c5aaf2f-377b-4d66-b9b6-671f831e0af1" containerName="oauth-openshift"
Dec 02 23:01:29 crc kubenswrapper[4903]: E1202 23:01:29.214171 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bbe2304-240f-4e4e-8d89-5d40d3017568" containerName="extract-utilities"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.214179 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bbe2304-240f-4e4e-8d89-5d40d3017568" containerName="extract-utilities"
Dec 02 23:01:29 crc kubenswrapper[4903]: E1202 23:01:29.214189 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca1fbb27-3875-43c7-8013-f22c6112be2b" containerName="extract-content"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.214197 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca1fbb27-3875-43c7-8013-f22c6112be2b" containerName="extract-content"
Dec 02 23:01:29 crc kubenswrapper[4903]: E1202 23:01:29.214209 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca1fbb27-3875-43c7-8013-f22c6112be2b" containerName="extract-utilities"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.214218 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca1fbb27-3875-43c7-8013-f22c6112be2b" containerName="extract-utilities"
Dec 02 23:01:29 crc kubenswrapper[4903]: E1202 23:01:29.214230 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0a3d9b9-b736-4927-a6b2-9e5f25faf956" containerName="registry-server"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.214238 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0a3d9b9-b736-4927-a6b2-9e5f25faf956" containerName="registry-server"
Dec 02 23:01:29 crc kubenswrapper[4903]: E1202 23:01:29.214249 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bbe2304-240f-4e4e-8d89-5d40d3017568" containerName="registry-server"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.214259 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bbe2304-240f-4e4e-8d89-5d40d3017568" containerName="registry-server"
Dec 02 23:01:29 crc kubenswrapper[4903]: E1202 23:01:29.214272 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0a3d9b9-b736-4927-a6b2-9e5f25faf956" containerName="extract-content"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.214279 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0a3d9b9-b736-4927-a6b2-9e5f25faf956" containerName="extract-content"
Dec 02 23:01:29 crc kubenswrapper[4903]: E1202 23:01:29.214291 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb4f1b53-4bbe-4713-852b-066b5d3fd40c" containerName="registry-server"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.214299 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb4f1b53-4bbe-4713-852b-066b5d3fd40c" containerName="registry-server"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.214408 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c5aaf2f-377b-4d66-b9b6-671f831e0af1" containerName="oauth-openshift"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.214424 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bbe2304-240f-4e4e-8d89-5d40d3017568" containerName="registry-server"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.214436 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb4f1b53-4bbe-4713-852b-066b5d3fd40c" containerName="registry-server"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.214447 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca1fbb27-3875-43c7-8013-f22c6112be2b" containerName="registry-server"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.214464 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0a3d9b9-b736-4927-a6b2-9e5f25faf956" containerName="registry-server"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.214917 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.217946 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b94r\" (UniqueName: \"kubernetes.io/projected/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-kube-api-access-2b94r\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.218050 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-v4-0-config-system-session\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.218088 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-v4-0-config-system-service-ca\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.218135 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-v4-0-config-system-serving-cert\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.218196 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-v4-0-config-user-template-login\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.218258 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-audit-policies\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.218295 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.218335 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.218375 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.218409 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-v4-0-config-user-template-error\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.218457 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.218513 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-v4-0-config-system-cliconfig\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.218582 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-audit-dir\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.218630 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-v4-0-config-system-router-certs\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.219199 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.219326 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.219871 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.220028 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.220090 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.220163 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.220031 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.220426 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.220476 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.220908 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.221110 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.221494 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.234555 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.237443 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.238644 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"]
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.248861 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.320008 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-v4-0-config-system-serving-cert\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.320075 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-v4-0-config-user-template-login\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.320124 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-audit-policies\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.320174 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.320216 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.320261 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.320296 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-v4-0-config-user-template-error\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.320339 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.320377 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-v4-0-config-system-cliconfig\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.320438 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-audit-dir\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.320484 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-v4-0-config-system-router-certs\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.320529 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b94r\" (UniqueName: \"kubernetes.io/projected/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-kube-api-access-2b94r\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.320572 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-v4-0-config-system-session\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.320603 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-v4-0-config-system-service-ca\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.321521 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-audit-policies\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.322162 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-v4-0-config-system-cliconfig\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.322266 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-audit-dir\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.322794 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-v4-0-config-system-service-ca\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.324292 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.327700 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.327718 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-v4-0-config-user-template-error\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.328378 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-v4-0-config-system-session\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.329206 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.329537 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.330075 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-v4-0-config-system-serving-cert\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.330097 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-v4-0-config-system-router-certs\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.330426 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-v4-0-config-user-template-login\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.349529 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b94r\" (UniqueName: \"kubernetes.io/projected/ef689aa6-d232-4af0-9305-e6b6b5aa8ebd-kube-api-access-2b94r\") pod \"oauth-openshift-c78d49fb6-nt4zq\" (UID: \"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd\") " pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"
Need to start a new one" pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq" Dec 02 23:01:29 crc kubenswrapper[4903]: I1202 23:01:29.818457 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-c78d49fb6-nt4zq"] Dec 02 23:01:29 crc kubenswrapper[4903]: W1202 23:01:29.831750 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef689aa6_d232_4af0_9305_e6b6b5aa8ebd.slice/crio-88347520d098f572ed0db29447405ca1e581c796329c6a0211f0d0b223adb6f2 WatchSource:0}: Error finding container 88347520d098f572ed0db29447405ca1e581c796329c6a0211f0d0b223adb6f2: Status 404 returned error can't find the container with id 88347520d098f572ed0db29447405ca1e581c796329c6a0211f0d0b223adb6f2 Dec 02 23:01:30 crc kubenswrapper[4903]: I1202 23:01:30.415300 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq" event={"ID":"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd","Type":"ContainerStarted","Data":"4bb13616e329eee55f9f42748fab3781e02090946ba3d02adb3caf7c1cbb566b"} Dec 02 23:01:30 crc kubenswrapper[4903]: I1202 23:01:30.415379 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq" event={"ID":"ef689aa6-d232-4af0-9305-e6b6b5aa8ebd","Type":"ContainerStarted","Data":"88347520d098f572ed0db29447405ca1e581c796329c6a0211f0d0b223adb6f2"} Dec 02 23:01:30 crc kubenswrapper[4903]: I1202 23:01:30.416197 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq" Dec 02 23:01:30 crc kubenswrapper[4903]: I1202 23:01:30.453952 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq" podStartSLOduration=31.45392166 podStartE2EDuration="31.45392166s" podCreationTimestamp="2025-12-02 23:00:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:01:30.446688955 +0000 UTC m=+229.155243278" watchObservedRunningTime="2025-12-02 23:01:30.45392166 +0000 UTC m=+229.162475983" Dec 02 23:01:30 crc kubenswrapper[4903]: I1202 23:01:30.456648 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-c78d49fb6-nt4zq" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.670789 4903 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.672798 4903 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.673041 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.673373 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692" gracePeriod=15 Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.674076 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e" gracePeriod=15 Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.674135 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35" gracePeriod=15 Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.674163 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e" gracePeriod=15 Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.674112 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c" gracePeriod=15 Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.674929 4903 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 02 23:01:33 crc kubenswrapper[4903]: E1202 23:01:33.675294 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.675315 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 23:01:33 crc kubenswrapper[4903]: E1202 23:01:33.675341 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.675353 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 23:01:33 crc kubenswrapper[4903]: E1202 23:01:33.675370 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.676622 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 02 23:01:33 crc kubenswrapper[4903]: E1202 23:01:33.676685 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-regeneration-controller" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.676701 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 02 23:01:33 crc kubenswrapper[4903]: E1202 23:01:33.676743 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.677768 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 02 23:01:33 crc kubenswrapper[4903]: E1202 23:01:33.677844 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.677863 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 02 23:01:33 crc kubenswrapper[4903]: E1202 23:01:33.677918 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.677932 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.678467 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.678496 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.678523 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.678547 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.678583 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.680805 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.812934 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.813223 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.813340 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.814860 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.814949 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.815346 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.815756 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.815826 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 23:01:33 crc kubenswrapper[4903]: E1202 23:01:33.856801 4903 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-poda7bed574_70f1_4907_be3c_6cd655b3df0a.slice/crio-conmon-3fdce0e356f2da192517d70482023eb0e6c6852d29241c96898bb277c07fb2a4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-poda7bed574_70f1_4907_be3c_6cd655b3df0a.slice/crio-3fdce0e356f2da192517d70482023eb0e6c6852d29241c96898bb277c07fb2a4.scope\": RecentStats: unable to find data in memory cache]" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.917250 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.917322 4903 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.917428 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.917431 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.917462 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.917489 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.917521 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.917561 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.917619 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.917716 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.917748 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.917793 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.917793 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.917827 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.917906 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 23:01:33 crc kubenswrapper[4903]: I1202 23:01:33.917908 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 23:01:34 crc kubenswrapper[4903]: I1202 23:01:34.444824 4903 generic.go:334] "Generic (PLEG): container finished" podID="a7bed574-70f1-4907-be3c-6cd655b3df0a" containerID="3fdce0e356f2da192517d70482023eb0e6c6852d29241c96898bb277c07fb2a4" exitCode=0 Dec 02 23:01:34 crc kubenswrapper[4903]: I1202 23:01:34.444939 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a7bed574-70f1-4907-be3c-6cd655b3df0a","Type":"ContainerDied","Data":"3fdce0e356f2da192517d70482023eb0e6c6852d29241c96898bb277c07fb2a4"} Dec 02 23:01:34 crc kubenswrapper[4903]: I1202 23:01:34.446296 4903 status_manager.go:851] "Failed to get status for pod" podUID="a7bed574-70f1-4907-be3c-6cd655b3df0a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Dec 02 23:01:34 crc kubenswrapper[4903]: I1202 23:01:34.447025 4903 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Dec 02 23:01:34 crc kubenswrapper[4903]: I1202 23:01:34.448900 4903 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 02 23:01:34 crc kubenswrapper[4903]: I1202 23:01:34.451306 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 02 23:01:34 crc kubenswrapper[4903]: I1202 23:01:34.452574 4903 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e" exitCode=0 Dec 02 23:01:34 crc kubenswrapper[4903]: I1202 23:01:34.452615 4903 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35" exitCode=0 Dec 02 23:01:34 crc kubenswrapper[4903]: I1202 23:01:34.452630 4903 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e" exitCode=2 Dec 02 23:01:34 crc kubenswrapper[4903]: I1202 23:01:34.452728 4903 scope.go:117] "RemoveContainer" containerID="8b4133a84a1a700532aecc8a8c05fdbdc116dc09e2d76cb36846c3283357d168" Dec 02 23:01:35 crc kubenswrapper[4903]: I1202 23:01:35.467163 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 02 23:01:35 crc kubenswrapper[4903]: E1202 23:01:35.604239 4903 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" Dec 02 23:01:35 crc kubenswrapper[4903]: E1202 23:01:35.604798 4903 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" Dec 02 23:01:35 crc kubenswrapper[4903]: E1202 23:01:35.605317 4903 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" Dec 02 23:01:35 crc kubenswrapper[4903]: E1202 23:01:35.605649 4903 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" Dec 02 23:01:35 crc kubenswrapper[4903]: E1202 23:01:35.606258 4903 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" Dec 02 23:01:35 crc kubenswrapper[4903]: I1202 23:01:35.606336 4903 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 02 23:01:35 crc kubenswrapper[4903]: E1202 23:01:35.607043 4903 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: 
connection refused" interval="200ms" Dec 02 23:01:35 crc kubenswrapper[4903]: E1202 23:01:35.807406 4903 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" interval="400ms" Dec 02 23:01:35 crc kubenswrapper[4903]: I1202 23:01:35.888261 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 02 23:01:35 crc kubenswrapper[4903]: I1202 23:01:35.889177 4903 status_manager.go:851] "Failed to get status for pod" podUID="a7bed574-70f1-4907-be3c-6cd655b3df0a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Dec 02 23:01:36 crc kubenswrapper[4903]: I1202 23:01:36.053403 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a7bed574-70f1-4907-be3c-6cd655b3df0a-var-lock\") pod \"a7bed574-70f1-4907-be3c-6cd655b3df0a\" (UID: \"a7bed574-70f1-4907-be3c-6cd655b3df0a\") " Dec 02 23:01:36 crc kubenswrapper[4903]: I1202 23:01:36.053796 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7bed574-70f1-4907-be3c-6cd655b3df0a-var-lock" (OuterVolumeSpecName: "var-lock") pod "a7bed574-70f1-4907-be3c-6cd655b3df0a" (UID: "a7bed574-70f1-4907-be3c-6cd655b3df0a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:01:36 crc kubenswrapper[4903]: I1202 23:01:36.053870 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7bed574-70f1-4907-be3c-6cd655b3df0a-kube-api-access\") pod \"a7bed574-70f1-4907-be3c-6cd655b3df0a\" (UID: \"a7bed574-70f1-4907-be3c-6cd655b3df0a\") " Dec 02 23:01:36 crc kubenswrapper[4903]: I1202 23:01:36.053930 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7bed574-70f1-4907-be3c-6cd655b3df0a-kubelet-dir\") pod \"a7bed574-70f1-4907-be3c-6cd655b3df0a\" (UID: \"a7bed574-70f1-4907-be3c-6cd655b3df0a\") " Dec 02 23:01:36 crc kubenswrapper[4903]: I1202 23:01:36.054138 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7bed574-70f1-4907-be3c-6cd655b3df0a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a7bed574-70f1-4907-be3c-6cd655b3df0a" (UID: "a7bed574-70f1-4907-be3c-6cd655b3df0a"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:01:36 crc kubenswrapper[4903]: I1202 23:01:36.054409 4903 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7bed574-70f1-4907-be3c-6cd655b3df0a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:36 crc kubenswrapper[4903]: I1202 23:01:36.054446 4903 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a7bed574-70f1-4907-be3c-6cd655b3df0a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:36 crc kubenswrapper[4903]: I1202 23:01:36.062425 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7bed574-70f1-4907-be3c-6cd655b3df0a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a7bed574-70f1-4907-be3c-6cd655b3df0a" (UID: "a7bed574-70f1-4907-be3c-6cd655b3df0a"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:01:36 crc kubenswrapper[4903]: I1202 23:01:36.156120 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7bed574-70f1-4907-be3c-6cd655b3df0a-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:36 crc kubenswrapper[4903]: E1202 23:01:36.209520 4903 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" interval="800ms" Dec 02 23:01:36 crc kubenswrapper[4903]: I1202 23:01:36.481878 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 02 23:01:36 crc kubenswrapper[4903]: I1202 23:01:36.483121 4903 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692" exitCode=0 Dec 02 23:01:36 crc kubenswrapper[4903]: I1202 23:01:36.485413 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a7bed574-70f1-4907-be3c-6cd655b3df0a","Type":"ContainerDied","Data":"1e61ce592e4335d1026e5acbbb14ce8b528df9dbb0a1791e99405e08baa07829"} Dec 02 23:01:36 crc kubenswrapper[4903]: I1202 23:01:36.485473 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 02 23:01:36 crc kubenswrapper[4903]: I1202 23:01:36.485516 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e61ce592e4335d1026e5acbbb14ce8b528df9dbb0a1791e99405e08baa07829" Dec 02 23:01:36 crc kubenswrapper[4903]: I1202 23:01:36.511540 4903 status_manager.go:851] "Failed to get status for pod" podUID="a7bed574-70f1-4907-be3c-6cd655b3df0a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Dec 02 23:01:37 crc kubenswrapper[4903]: E1202 23:01:37.014802 4903 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" interval="1.6s" Dec 02 23:01:38 crc kubenswrapper[4903]: E1202 23:01:38.615445 4903 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" interval="3.2s" Dec 02 23:01:38 crc kubenswrapper[4903]: E1202 23:01:38.739371 4903 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.39:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 23:01:38 crc kubenswrapper[4903]: I1202 23:01:38.740086 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 23:01:38 crc kubenswrapper[4903]: E1202 23:01:38.818156 4903 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.39:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187d884e4949335a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-02 23:01:38.81746313 +0000 UTC m=+237.526017453,LastTimestamp:2025-12-02 23:01:38.81746313 +0000 UTC m=+237.526017453,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 02 23:01:38 crc kubenswrapper[4903]: I1202 23:01:38.918962 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 02 23:01:38 crc kubenswrapper[4903]: I1202 23:01:38.920082 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 23:01:38 crc kubenswrapper[4903]: I1202 23:01:38.920588 4903 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Dec 02 23:01:38 crc kubenswrapper[4903]: I1202 23:01:38.920877 4903 status_manager.go:851] "Failed to get status for pod" podUID="a7bed574-70f1-4907-be3c-6cd655b3df0a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Dec 02 23:01:39 crc kubenswrapper[4903]: I1202 23:01:39.101507 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 02 23:01:39 crc kubenswrapper[4903]: I1202 23:01:39.101841 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 02 23:01:39 crc kubenswrapper[4903]: I1202 23:01:39.101945 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 02 23:01:39 crc kubenswrapper[4903]: I1202 23:01:39.101594 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:01:39 crc kubenswrapper[4903]: I1202 23:01:39.102252 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:01:39 crc kubenswrapper[4903]: I1202 23:01:39.102161 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:01:39 crc kubenswrapper[4903]: I1202 23:01:39.203020 4903 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:39 crc kubenswrapper[4903]: I1202 23:01:39.203243 4903 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:39 crc kubenswrapper[4903]: I1202 23:01:39.203605 4903 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 02 23:01:39 crc kubenswrapper[4903]: I1202 23:01:39.505416 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"c94c8f5268b1653b7f89f1680ee352cb6188d33a8a5ad760cbc6ef2e6f6b4454"} Dec 02 23:01:39 crc kubenswrapper[4903]: I1202 23:01:39.505525 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"38ca3fb953c529239ad854ad2d3e4254f97119043d876d319398b228826c5565"} Dec 02 23:01:39 crc kubenswrapper[4903]: I1202 23:01:39.507047 4903 status_manager.go:851] "Failed to get status for pod" podUID="a7bed574-70f1-4907-be3c-6cd655b3df0a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Dec 02 23:01:39 crc kubenswrapper[4903]: E1202 23:01:39.507104 4903 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.39:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 23:01:39 crc kubenswrapper[4903]: I1202 23:01:39.507589 4903 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Dec 02 23:01:39 crc kubenswrapper[4903]: I1202 23:01:39.510222 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 02 23:01:39 crc kubenswrapper[4903]: I1202 23:01:39.511222 4903 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c" exitCode=0 Dec 02 23:01:39 crc kubenswrapper[4903]: I1202 23:01:39.511262 4903 scope.go:117] "RemoveContainer" containerID="49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e" Dec 02 23:01:39 crc kubenswrapper[4903]: I1202 23:01:39.511456 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 23:01:39 crc kubenswrapper[4903]: I1202 23:01:39.533926 4903 status_manager.go:851] "Failed to get status for pod" podUID="a7bed574-70f1-4907-be3c-6cd655b3df0a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Dec 02 23:01:39 crc kubenswrapper[4903]: I1202 23:01:39.534490 4903 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Dec 02 23:01:39 crc kubenswrapper[4903]: I1202 23:01:39.537250 4903 scope.go:117] "RemoveContainer" containerID="05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35" Dec 02 23:01:39 crc kubenswrapper[4903]: I1202 23:01:39.564101 4903 scope.go:117] "RemoveContainer" containerID="779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c" Dec 02 23:01:39 crc kubenswrapper[4903]: I1202 23:01:39.587011 4903 scope.go:117] "RemoveContainer" containerID="d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e" Dec 02 23:01:39 crc kubenswrapper[4903]: I1202 23:01:39.611901 4903 scope.go:117] "RemoveContainer" containerID="77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692" Dec 02 23:01:39 crc kubenswrapper[4903]: I1202 23:01:39.625211 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 02 23:01:39 crc kubenswrapper[4903]: I1202 23:01:39.633324 4903 scope.go:117] "RemoveContainer" containerID="f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd" Dec 02 23:01:39 crc kubenswrapper[4903]: I1202 23:01:39.665140 4903 scope.go:117] "RemoveContainer" containerID="49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e" Dec 02 23:01:39 crc kubenswrapper[4903]: E1202 23:01:39.665800 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\": container with ID starting with 49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e not found: ID does not exist" containerID="49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e" Dec 02 23:01:39 crc kubenswrapper[4903]: I1202 23:01:39.665851 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e"} err="failed to get container status \"49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\": rpc error: code = NotFound desc = could not find container \"49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e\": container with ID starting with 49ce32f29e66aabce74397198884490f5903af6d72bf6be1b6feca56b234fc1e not found: ID does not exist" Dec 02 23:01:39 crc kubenswrapper[4903]: I1202 23:01:39.665886 4903 scope.go:117] "RemoveContainer" containerID="05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35" Dec 02 23:01:39 crc kubenswrapper[4903]: E1202 23:01:39.666242 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\": container with ID starting with 05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35 not found: ID does not exist" containerID="05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35" Dec 02 23:01:39 crc kubenswrapper[4903]: I1202 23:01:39.666279 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35"} err="failed to get container status \"05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\": rpc error: code = NotFound desc = could not find container \"05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35\": container with ID starting with 05a2822269664fd4891ff6bdf2a529768e61009d20b6f2b55d7c2c250350ba35 not found: ID does not exist" Dec 02 23:01:39 crc kubenswrapper[4903]: I1202 23:01:39.666305 4903 scope.go:117] "RemoveContainer" containerID="779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c" Dec 02 23:01:39 crc kubenswrapper[4903]: E1202 23:01:39.666541 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\": container with ID starting with 779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c not found: ID does not exist" containerID="779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c" Dec 02 23:01:39 crc kubenswrapper[4903]: I1202 23:01:39.666582 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c"} err="failed to get container status \"779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\": rpc error: code = NotFound desc = could not find container \"779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c\": container with ID starting with 779804c277e2c0661b3a0e5b54dba2b922816ec30be86748eb56c5ac2b9da70c not found: ID does not exist" Dec 02 23:01:39 crc kubenswrapper[4903]: I1202 23:01:39.666608 4903 scope.go:117] "RemoveContainer" containerID="d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e" Dec 02 23:01:39 crc kubenswrapper[4903]: E1202 23:01:39.667141 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\": container with ID starting with d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e not found: ID does not exist" containerID="d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e" Dec 02 23:01:39 crc kubenswrapper[4903]: I1202 23:01:39.667206 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e"} err="failed to get container status \"d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\": rpc error: code = NotFound desc = could not find container \"d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e\": container with ID starting with d09b1dde0ea23ea9d4da2baa43d80c2928fdb5d6e21397cdc3095746bff28f5e not found: ID does not exist" Dec 02 23:01:39 crc kubenswrapper[4903]: I1202 23:01:39.667250 4903 scope.go:117] "RemoveContainer" containerID="77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692" 
Dec 02 23:01:39 crc kubenswrapper[4903]: E1202 23:01:39.668563 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\": container with ID starting with 77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692 not found: ID does not exist" containerID="77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692"
Dec 02 23:01:39 crc kubenswrapper[4903]: I1202 23:01:39.668609 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692"} err="failed to get container status \"77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\": rpc error: code = NotFound desc = could not find container \"77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692\": container with ID starting with 77e53d02c9c400eca51d909274d10a03dcfb248b28ac2fad3effe89bd72cb692 not found: ID does not exist"
Dec 02 23:01:39 crc kubenswrapper[4903]: I1202 23:01:39.668641 4903 scope.go:117] "RemoveContainer" containerID="f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd"
Dec 02 23:01:39 crc kubenswrapper[4903]: E1202 23:01:39.669091 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\": container with ID starting with f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd not found: ID does not exist" containerID="f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd"
Dec 02 23:01:39 crc kubenswrapper[4903]: I1202 23:01:39.669168 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd"} err="failed to get container status \"f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\": rpc error: code = NotFound desc = could not find container \"f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd\": container with ID starting with f40b24c0db9baffd09a5a530390ccc831c973ad41d332e6856c6def88c2761fd not found: ID does not exist"
Dec 02 23:01:40 crc kubenswrapper[4903]: E1202 23:01:40.959501 4903 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.39:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187d884e4949335a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-02 23:01:38.81746313 +0000 UTC m=+237.526017453,LastTimestamp:2025-12-02 23:01:38.81746313 +0000 UTC m=+237.526017453,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 02 23:01:41 crc kubenswrapper[4903]: I1202 23:01:41.618403 4903 status_manager.go:851] "Failed to get status for pod" podUID="a7bed574-70f1-4907-be3c-6cd655b3df0a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused"
Dec 02 23:01:41 crc kubenswrapper[4903]: E1202 23:01:41.816983 4903 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" interval="6.4s"
Dec 02 23:01:44 crc kubenswrapper[4903]: I1202 23:01:44.611862 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 23:01:44 crc kubenswrapper[4903]: I1202 23:01:44.613501 4903 status_manager.go:851] "Failed to get status for pod" podUID="a7bed574-70f1-4907-be3c-6cd655b3df0a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused"
Dec 02 23:01:44 crc kubenswrapper[4903]: I1202 23:01:44.636920 4903 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2983aeb0-e38e-4be7-88d4-2a2fc720f014"
Dec 02 23:01:44 crc kubenswrapper[4903]: I1202 23:01:44.636978 4903 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2983aeb0-e38e-4be7-88d4-2a2fc720f014"
Dec 02 23:01:44 crc kubenswrapper[4903]: E1202 23:01:44.637439 4903 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 23:01:44 crc kubenswrapper[4903]: I1202 23:01:44.638125 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 23:01:44 crc kubenswrapper[4903]: W1202 23:01:44.661963 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-e0c7bbf91c3cf258bf14ac98cd0ebe700790aaf986e942a0c30e7b630eab2ac1 WatchSource:0}: Error finding container e0c7bbf91c3cf258bf14ac98cd0ebe700790aaf986e942a0c30e7b630eab2ac1: Status 404 returned error can't find the container with id e0c7bbf91c3cf258bf14ac98cd0ebe700790aaf986e942a0c30e7b630eab2ac1
Dec 02 23:01:45 crc kubenswrapper[4903]: I1202 23:01:45.556949 4903 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="3dcf0385e4d14bd9a2221ae1f3955e7e9e130a1c83030d97973cf5172f519d28" exitCode=0
Dec 02 23:01:45 crc kubenswrapper[4903]: I1202 23:01:45.557009 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"3dcf0385e4d14bd9a2221ae1f3955e7e9e130a1c83030d97973cf5172f519d28"}
Dec 02 23:01:45 crc kubenswrapper[4903]: I1202 23:01:45.557085 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e0c7bbf91c3cf258bf14ac98cd0ebe700790aaf986e942a0c30e7b630eab2ac1"}
Dec 02 23:01:45 crc kubenswrapper[4903]: I1202 23:01:45.557479 4903 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2983aeb0-e38e-4be7-88d4-2a2fc720f014"
Dec 02 23:01:45 crc kubenswrapper[4903]: I1202 23:01:45.557512 4903 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2983aeb0-e38e-4be7-88d4-2a2fc720f014"
Dec 02 23:01:45 crc kubenswrapper[4903]: I1202 23:01:45.558127 4903 status_manager.go:851] "Failed to get status for pod" podUID="a7bed574-70f1-4907-be3c-6cd655b3df0a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused"
Dec 02 23:01:45 crc kubenswrapper[4903]: E1202 23:01:45.558163 4903 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 23:01:46 crc kubenswrapper[4903]: I1202 23:01:46.567123 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Dec 02 23:01:46 crc kubenswrapper[4903]: I1202 23:01:46.567458 4903 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c" exitCode=1
Dec 02 23:01:46 crc kubenswrapper[4903]: I1202 23:01:46.567486 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c"}
Dec 02 23:01:46 crc kubenswrapper[4903]: I1202 23:01:46.568707 4903 scope.go:117] "RemoveContainer" containerID="432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c"
Dec 02 23:01:46 crc kubenswrapper[4903]: I1202 23:01:46.571419 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8a3998ab10f1226e0600db2cce2df95be4ce57599cdcb879f12fee9aa82afb7c"}
Dec 02 23:01:46 crc kubenswrapper[4903]: I1202 23:01:46.571446 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"cb8319853381d2fd83cfc335d9dcbf56152b02a784a6f468894fab696d8c6b4a"}
Dec 02 23:01:46 crc kubenswrapper[4903]: I1202 23:01:46.571460 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"061d8a3c7f57e0d6df38b84059ad9cbaedd875e103f0ca70db42e50489ee5a01"}
Dec 02 23:01:47 crc kubenswrapper[4903]: I1202 23:01:47.579748 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Dec 02 23:01:47 crc kubenswrapper[4903]: I1202 23:01:47.580496 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e890fa7ca48e622bd87ab8049fbfdc438150f431cedcf3f0bdf76e59c028b575"}
Dec 02 23:01:47 crc kubenswrapper[4903]: I1202 23:01:47.583599 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4ab651b2c0bf374ec0ab6b9f27e216753a75bb08b596ffe7537621bcdfa0619f"}
Dec 02 23:01:47 crc kubenswrapper[4903]: I1202 23:01:47.583781 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1512b633fd937112e790cf9947185c69b2d5ac312ec5f650edb445465c6539c6"}
Dec 02 23:01:47 crc kubenswrapper[4903]: I1202 23:01:47.583889 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 23:01:47 crc kubenswrapper[4903]: I1202 23:01:47.583908 4903 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2983aeb0-e38e-4be7-88d4-2a2fc720f014"
Dec 02 23:01:47 crc kubenswrapper[4903]: I1202 23:01:47.584082 4903 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2983aeb0-e38e-4be7-88d4-2a2fc720f014"
Dec 02 23:01:49 crc kubenswrapper[4903]: I1202 23:01:49.639193 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 23:01:49 crc kubenswrapper[4903]: I1202 23:01:49.639812 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 23:01:49 crc kubenswrapper[4903]: I1202 23:01:49.646230 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 23:01:52 crc kubenswrapper[4903]: I1202 23:01:52.591137 4903 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 23:01:53 crc kubenswrapper[4903]: I1202 23:01:53.635586 4903 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2983aeb0-e38e-4be7-88d4-2a2fc720f014"
Dec 02 23:01:53 crc kubenswrapper[4903]: I1202 23:01:53.636022 4903 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2983aeb0-e38e-4be7-88d4-2a2fc720f014"
Dec 02 23:01:53 crc kubenswrapper[4903]: I1202 23:01:53.643689 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 23:01:53 crc kubenswrapper[4903]: I1202 23:01:53.647867 4903 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="dba911d2-2852-4303-8890-3c985bcc786f"
Dec 02 23:01:54 crc kubenswrapper[4903]: I1202 23:01:54.247113 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 02 23:01:54 crc kubenswrapper[4903]: I1202 23:01:54.247831 4903 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Dec 02 23:01:54 crc kubenswrapper[4903]: I1202 23:01:54.247938 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Dec 02 23:01:54 crc kubenswrapper[4903]: I1202 23:01:54.302608 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 02 23:01:54 crc kubenswrapper[4903]: I1202 23:01:54.642827 4903 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2983aeb0-e38e-4be7-88d4-2a2fc720f014"
Dec 02 23:01:54 crc kubenswrapper[4903]: I1202 23:01:54.642858 4903 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2983aeb0-e38e-4be7-88d4-2a2fc720f014"
Dec 02 23:01:58 crc kubenswrapper[4903]: I1202 23:01:58.985049 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Dec 02 23:01:59 crc kubenswrapper[4903]: I1202 23:01:59.693700 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Dec 02 23:02:00 crc kubenswrapper[4903]: I1202 23:02:00.078906 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Dec 02 23:02:01 crc kubenswrapper[4903]: I1202 23:02:01.631447 4903 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="dba911d2-2852-4303-8890-3c985bcc786f"
Dec 02 23:02:02 crc kubenswrapper[4903]: I1202 23:02:02.299813 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Dec 02 23:02:02 crc kubenswrapper[4903]: I1202 23:02:02.391366 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Dec 02 23:02:02 crc kubenswrapper[4903]: I1202 23:02:02.756096 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Dec 02 23:02:03 crc kubenswrapper[4903]: I1202 23:02:03.550591 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Dec 02 23:02:03 crc kubenswrapper[4903]: I1202 23:02:03.932389 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Dec 02 23:02:03 crc kubenswrapper[4903]: I1202 23:02:03.962889 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Dec 02 23:02:04 crc kubenswrapper[4903]: I1202 23:02:04.059549 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Dec 02 23:02:04 crc kubenswrapper[4903]: I1202 23:02:04.247573 4903 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Dec 02 23:02:04 crc kubenswrapper[4903]: I1202 23:02:04.247684 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Dec 02 23:02:04 crc kubenswrapper[4903]: I1202 23:02:04.608336 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Dec 02 23:02:04 crc kubenswrapper[4903]: I1202 23:02:04.761695 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Dec 02 23:02:05 crc kubenswrapper[4903]: I1202 23:02:05.139729 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Dec 02 23:02:05 crc kubenswrapper[4903]: I1202 23:02:05.313695 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Dec 02 23:02:05 crc kubenswrapper[4903]: I1202 23:02:05.396689 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Dec 02 23:02:05 crc kubenswrapper[4903]: I1202 23:02:05.771594 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Dec 02 23:02:05 crc kubenswrapper[4903]: I1202 23:02:05.875620 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Dec 02 23:02:05 crc kubenswrapper[4903]: I1202 23:02:05.974962 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Dec 02 23:02:06 crc kubenswrapper[4903]: I1202 23:02:06.110532 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Dec 02 23:02:06 crc kubenswrapper[4903]: I1202 23:02:06.325693 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Dec 02 23:02:06 crc kubenswrapper[4903]: I1202 23:02:06.612411 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Dec 02 23:02:06 crc kubenswrapper[4903]: I1202 23:02:06.620216 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Dec 02 23:02:06 crc kubenswrapper[4903]: I1202 23:02:06.648036 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Dec 02 23:02:06 crc kubenswrapper[4903]: I1202 23:02:06.875227 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Dec 02 23:02:06 crc kubenswrapper[4903]: I1202 23:02:06.970794 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Dec 02 23:02:07 crc kubenswrapper[4903]: I1202 23:02:07.046168 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Dec 02 23:02:07 crc kubenswrapper[4903]: I1202 23:02:07.062489 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Dec 02 23:02:07 crc kubenswrapper[4903]: I1202 23:02:07.128750 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Dec 02 23:02:07 crc kubenswrapper[4903]: I1202 23:02:07.326124 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Dec 02 23:02:07 crc kubenswrapper[4903]: I1202 23:02:07.368742 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Dec 02 23:02:07 crc kubenswrapper[4903]: I1202 23:02:07.583601 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Dec 02 23:02:07 crc kubenswrapper[4903]: I1202 23:02:07.591826 4903 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Dec 02 23:02:07 crc kubenswrapper[4903]: I1202 23:02:07.792546 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Dec 02 23:02:07 crc kubenswrapper[4903]: I1202 23:02:07.980293 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Dec 02 23:02:08 crc kubenswrapper[4903]: I1202 23:02:08.002027 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Dec 02 23:02:08 crc kubenswrapper[4903]: I1202 23:02:08.035939 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Dec 02 23:02:08 crc kubenswrapper[4903]: I1202 23:02:08.069403 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Dec 02 23:02:08 crc kubenswrapper[4903]: I1202 23:02:08.089647 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Dec 02 23:02:08 crc kubenswrapper[4903]: I1202 23:02:08.192298 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Dec 02 23:02:08 crc kubenswrapper[4903]: I1202 23:02:08.227892 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Dec 02 23:02:08 crc kubenswrapper[4903]: I1202 23:02:08.240629 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Dec 02 23:02:08 crc kubenswrapper[4903]: I1202 23:02:08.281188 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Dec 02 23:02:08 crc kubenswrapper[4903]: I1202 23:02:08.381618 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Dec 02 23:02:08 crc kubenswrapper[4903]: I1202 23:02:08.394031 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Dec 02 23:02:08 crc kubenswrapper[4903]: I1202 23:02:08.432774 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Dec 02 23:02:08 crc kubenswrapper[4903]: I1202 23:02:08.466761 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Dec 02 23:02:08 crc kubenswrapper[4903]: I1202 23:02:08.522735 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Dec 02 23:02:08 crc kubenswrapper[4903]: I1202 23:02:08.529769 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Dec 02 23:02:08 crc kubenswrapper[4903]: I1202 23:02:08.560949 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Dec 02 23:02:08 crc kubenswrapper[4903]: I1202 23:02:08.572882 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Dec 02 23:02:08 crc kubenswrapper[4903]: I1202 23:02:08.614733 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Dec 02 23:02:08 crc kubenswrapper[4903]: I1202 23:02:08.652865 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Dec 02 23:02:08 crc kubenswrapper[4903]: I1202 23:02:08.721055 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Dec 02 23:02:08 crc kubenswrapper[4903]: I1202 23:02:08.778863 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Dec 02 23:02:08 crc kubenswrapper[4903]: I1202 23:02:08.804373 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Dec 02 23:02:08 crc kubenswrapper[4903]: I1202 23:02:08.862598 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Dec 02 23:02:08 crc kubenswrapper[4903]: I1202 23:02:08.863251 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Dec 02 23:02:09 crc kubenswrapper[4903]: I1202 23:02:09.033885 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Dec 02 23:02:09 crc kubenswrapper[4903]: I1202 23:02:09.187128 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Dec 02 23:02:09 crc kubenswrapper[4903]: I1202 23:02:09.214675 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Dec 02 23:02:09 crc kubenswrapper[4903]: I1202 23:02:09.285183 4903 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Dec 02 23:02:09 crc kubenswrapper[4903]: I1202 23:02:09.311196 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Dec 02 23:02:09 crc kubenswrapper[4903]: I1202 23:02:09.437511 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Dec 02 23:02:09 crc kubenswrapper[4903]: I1202 23:02:09.456048 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Dec 02 23:02:09 crc kubenswrapper[4903]: I1202 23:02:09.519089 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Dec 02 23:02:09 crc kubenswrapper[4903]: I1202 23:02:09.573000 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Dec 02 23:02:09 crc kubenswrapper[4903]: I1202 23:02:09.591284 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Dec 02 23:02:09 crc kubenswrapper[4903]: I1202 23:02:09.648857 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Dec 02 23:02:09 crc kubenswrapper[4903]: I1202 23:02:09.655480 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Dec 02 23:02:09 crc kubenswrapper[4903]: I1202 23:02:09.670376 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Dec 02 23:02:09 crc kubenswrapper[4903]: I1202 23:02:09.751926 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Dec 02 23:02:09 crc kubenswrapper[4903]: I1202 23:02:09.755924 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Dec 02 23:02:09 crc kubenswrapper[4903]: I1202 23:02:09.819642 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Dec 02 23:02:09 crc kubenswrapper[4903]: I1202 23:02:09.855270 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Dec 02 23:02:09 crc kubenswrapper[4903]: I1202 23:02:09.898310 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Dec 02 23:02:09 crc kubenswrapper[4903]: I1202 23:02:09.978924 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 02 23:02:09 crc kubenswrapper[4903]: I1202 23:02:09.990925 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 02 23:02:10 crc kubenswrapper[4903]: I1202 23:02:10.004835 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 02 23:02:10 crc kubenswrapper[4903]: I1202 23:02:10.058212 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 02 23:02:10 crc kubenswrapper[4903]: I1202 23:02:10.072051 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 02 23:02:10 crc kubenswrapper[4903]: I1202 23:02:10.079451 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 02 23:02:10 crc kubenswrapper[4903]: I1202 23:02:10.093635 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 02 23:02:10 crc kubenswrapper[4903]: I1202 23:02:10.095464 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 02 23:02:10 crc kubenswrapper[4903]: I1202 23:02:10.160267 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 02 23:02:10 crc kubenswrapper[4903]: I1202 23:02:10.293550 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 02 23:02:10 crc kubenswrapper[4903]: I1202 23:02:10.426115 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 02 23:02:10 crc kubenswrapper[4903]: I1202 23:02:10.474775 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 02 23:02:10 crc kubenswrapper[4903]: I1202 23:02:10.485747 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 02 23:02:10 crc kubenswrapper[4903]: I1202 23:02:10.501923 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 02 23:02:10 crc kubenswrapper[4903]: I1202 23:02:10.544912 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 02 23:02:10 crc kubenswrapper[4903]: I1202 23:02:10.549089 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 02 23:02:10 crc kubenswrapper[4903]: I1202 23:02:10.584006 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 02 23:02:10 crc kubenswrapper[4903]: I1202 23:02:10.607679 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 02 23:02:10 crc kubenswrapper[4903]: I1202 23:02:10.635672 4903 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-network-operator"/"metrics-tls" Dec 02 23:02:10 crc kubenswrapper[4903]: I1202 23:02:10.640635 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 02 23:02:10 crc kubenswrapper[4903]: I1202 23:02:10.642565 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 02 23:02:10 crc kubenswrapper[4903]: I1202 23:02:10.673332 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 02 23:02:10 crc kubenswrapper[4903]: I1202 23:02:10.736479 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 02 23:02:10 crc kubenswrapper[4903]: I1202 23:02:10.785819 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 02 23:02:10 crc kubenswrapper[4903]: I1202 23:02:10.825181 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 02 23:02:10 crc kubenswrapper[4903]: I1202 23:02:10.832371 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 02 23:02:10 crc kubenswrapper[4903]: I1202 23:02:10.832990 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 02 23:02:10 crc kubenswrapper[4903]: I1202 23:02:10.867571 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 02 23:02:10 crc kubenswrapper[4903]: I1202 23:02:10.983458 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 02 23:02:11 crc kubenswrapper[4903]: I1202 23:02:11.023629 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 02 23:02:11 crc kubenswrapper[4903]: I1202 23:02:11.039747 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 02 23:02:11 crc kubenswrapper[4903]: I1202 23:02:11.059376 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 02 23:02:11 crc kubenswrapper[4903]: I1202 23:02:11.222996 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 02 23:02:11 crc kubenswrapper[4903]: I1202 23:02:11.232762 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 02 23:02:11 crc kubenswrapper[4903]: I1202 23:02:11.307589 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 02 23:02:11 crc kubenswrapper[4903]: I1202 23:02:11.322964 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 02 23:02:11 crc kubenswrapper[4903]: I1202 23:02:11.428866 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 02 23:02:11 crc kubenswrapper[4903]: I1202 23:02:11.439793 4903 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"config" Dec 02 23:02:11 crc kubenswrapper[4903]: I1202 23:02:11.450739 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 02 23:02:11 crc kubenswrapper[4903]: I1202 23:02:11.547439 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 02 23:02:11 crc kubenswrapper[4903]: I1202 23:02:11.574775 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 02 23:02:11 crc kubenswrapper[4903]: I1202 23:02:11.631368 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 02 23:02:11 crc kubenswrapper[4903]: I1202 23:02:11.714317 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 02 23:02:11 crc kubenswrapper[4903]: I1202 23:02:11.731524 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 02 23:02:11 crc kubenswrapper[4903]: I1202 23:02:11.877664 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 02 23:02:11 crc kubenswrapper[4903]: I1202 23:02:11.910729 4903 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 02 23:02:11 crc kubenswrapper[4903]: I1202 23:02:11.924405 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 02 23:02:12 crc kubenswrapper[4903]: I1202 23:02:12.008212 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 02 23:02:12 crc kubenswrapper[4903]: I1202 23:02:12.025146 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 02 23:02:12 crc kubenswrapper[4903]: I1202 23:02:12.052915 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 02 23:02:12 crc kubenswrapper[4903]: I1202 23:02:12.065243 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 02 23:02:12 crc kubenswrapper[4903]: I1202 23:02:12.074400 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 02 23:02:12 crc kubenswrapper[4903]: I1202 23:02:12.101877 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 02 23:02:12 crc kubenswrapper[4903]: I1202 23:02:12.121145 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 02 23:02:12 crc kubenswrapper[4903]: I1202 23:02:12.124121 4903 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 02 23:02:12 crc kubenswrapper[4903]: I1202 23:02:12.127568 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 02 23:02:12 crc kubenswrapper[4903]: I1202 23:02:12.127615 4903 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 02 23:02:12 crc kubenswrapper[4903]: I1202 23:02:12.131367 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 23:02:12 crc kubenswrapper[4903]: I1202 23:02:12.151470 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 02 23:02:12 crc kubenswrapper[4903]: I1202 23:02:12.152974 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=20.15293913 podStartE2EDuration="20.15293913s" podCreationTimestamp="2025-12-02 23:01:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:02:12.14520388 +0000 UTC m=+270.853758163" watchObservedRunningTime="2025-12-02 23:02:12.15293913 +0000 UTC m=+270.861493443" Dec 02 23:02:12 crc kubenswrapper[4903]: I1202 23:02:12.190260 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 02 23:02:12 crc kubenswrapper[4903]: I1202 23:02:12.221856 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 02 23:02:12 crc kubenswrapper[4903]: I1202 23:02:12.233848 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 02 23:02:12 crc kubenswrapper[4903]: I1202 23:02:12.267639 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 02 23:02:12 crc kubenswrapper[4903]: I1202 23:02:12.294774 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 02 23:02:12 crc kubenswrapper[4903]: I1202 23:02:12.410182 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 02 23:02:12 crc kubenswrapper[4903]: I1202 23:02:12.456725 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 02 23:02:12 crc kubenswrapper[4903]: I1202 23:02:12.473180 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 02 23:02:12 crc kubenswrapper[4903]: I1202 23:02:12.528985 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 02 23:02:12 crc kubenswrapper[4903]: I1202 23:02:12.561059 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 02 23:02:12 crc kubenswrapper[4903]: I1202 23:02:12.747376 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 02 23:02:12 crc kubenswrapper[4903]: I1202 23:02:12.753226 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 02 23:02:12 crc kubenswrapper[4903]: I1202 23:02:12.865955 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 02 23:02:12 crc kubenswrapper[4903]: I1202 23:02:12.912148 4903 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 02 23:02:12 crc kubenswrapper[4903]: I1202 23:02:12.939788 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 02 23:02:12 crc kubenswrapper[4903]: I1202 23:02:12.941817 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 02 23:02:13 crc kubenswrapper[4903]: I1202 23:02:13.038874 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 02 23:02:13 crc kubenswrapper[4903]: I1202 23:02:13.114243 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 02 23:02:13 crc kubenswrapper[4903]: I1202 23:02:13.159810 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 02 23:02:13 crc kubenswrapper[4903]: I1202 23:02:13.321631 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 02 23:02:13 crc kubenswrapper[4903]: I1202 23:02:13.334630 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 02 23:02:13 crc kubenswrapper[4903]: I1202 23:02:13.400492 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 02 23:02:13 crc kubenswrapper[4903]: I1202 23:02:13.468407 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 02 23:02:13 crc kubenswrapper[4903]: I1202 23:02:13.472434 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 02 23:02:13 crc kubenswrapper[4903]: I1202 23:02:13.511298 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 02 23:02:13 crc kubenswrapper[4903]: I1202 23:02:13.552161 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 02 23:02:13 crc kubenswrapper[4903]: I1202 23:02:13.560274 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 02 23:02:13 crc kubenswrapper[4903]: I1202 23:02:13.594346 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 02 23:02:13 crc kubenswrapper[4903]: I1202 23:02:13.679147 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 02 23:02:13 crc kubenswrapper[4903]: I1202 23:02:13.836402 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 02 23:02:13 crc kubenswrapper[4903]: I1202 23:02:13.847435 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 02 23:02:13 crc kubenswrapper[4903]: I1202 23:02:13.886819 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 02 23:02:13 crc kubenswrapper[4903]: I1202 23:02:13.920170 4903 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 02 23:02:13 crc kubenswrapper[4903]: I1202 23:02:13.957447 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 02 23:02:13 crc kubenswrapper[4903]: I1202 23:02:13.972110 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 02 23:02:14 crc kubenswrapper[4903]: I1202 23:02:14.007799 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 02 23:02:14 crc kubenswrapper[4903]: I1202 23:02:14.013754 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 02 23:02:14 crc kubenswrapper[4903]: I1202 23:02:14.206864 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 02 23:02:14 crc kubenswrapper[4903]: I1202 23:02:14.248402 4903 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 02 23:02:14 crc kubenswrapper[4903]: I1202 23:02:14.248479 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 02 23:02:14 crc kubenswrapper[4903]: I1202 23:02:14.248536 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 23:02:14 crc kubenswrapper[4903]: I1202 23:02:14.249239 4903 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"e890fa7ca48e622bd87ab8049fbfdc438150f431cedcf3f0bdf76e59c028b575"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Dec 02 23:02:14 crc kubenswrapper[4903]: I1202 23:02:14.249383 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://e890fa7ca48e622bd87ab8049fbfdc438150f431cedcf3f0bdf76e59c028b575" gracePeriod=30 Dec 02 23:02:14 crc kubenswrapper[4903]: I1202 23:02:14.300809 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 02 23:02:14 crc kubenswrapper[4903]: I1202 23:02:14.380967 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 02 23:02:14 crc kubenswrapper[4903]: I1202 23:02:14.496069 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 02 23:02:14 crc kubenswrapper[4903]: I1202 23:02:14.503513 4903 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 
Dec 02 23:02:14 crc kubenswrapper[4903]: I1202 23:02:14.523270 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Dec 02 23:02:14 crc kubenswrapper[4903]: I1202 23:02:14.524258 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Dec 02 23:02:14 crc kubenswrapper[4903]: I1202 23:02:14.525159 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Dec 02 23:02:14 crc kubenswrapper[4903]: I1202 23:02:14.535425 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Dec 02 23:02:14 crc kubenswrapper[4903]: I1202 23:02:14.572980 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Dec 02 23:02:14 crc kubenswrapper[4903]: I1202 23:02:14.599887 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Dec 02 23:02:14 crc kubenswrapper[4903]: I1202 23:02:14.706865 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Dec 02 23:02:14 crc kubenswrapper[4903]: I1202 23:02:14.901485 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Dec 02 23:02:14 crc kubenswrapper[4903]: I1202 23:02:14.924321 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Dec 02 23:02:14 crc kubenswrapper[4903]: I1202 23:02:14.928704 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Dec 02 23:02:14 crc kubenswrapper[4903]: I1202 23:02:14.957535 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Dec 02 23:02:15 crc kubenswrapper[4903]: I1202 23:02:15.056628 4903 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Dec 02 23:02:15 crc kubenswrapper[4903]: I1202 23:02:15.058582 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://c94c8f5268b1653b7f89f1680ee352cb6188d33a8a5ad760cbc6ef2e6f6b4454" gracePeriod=5
Dec 02 23:02:15 crc kubenswrapper[4903]: I1202 23:02:15.245035 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Dec 02 23:02:15 crc kubenswrapper[4903]: I1202 23:02:15.458952 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Dec 02 23:02:15 crc kubenswrapper[4903]: I1202 23:02:15.519624 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Dec 02 23:02:15 crc kubenswrapper[4903]: I1202 23:02:15.547200 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Dec 02 23:02:15 crc kubenswrapper[4903]: I1202 23:02:15.678535 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Dec 02 23:02:15 crc kubenswrapper[4903]: I1202 23:02:15.764232 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Dec 02 23:02:15 crc kubenswrapper[4903]: I1202 23:02:15.782593 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Dec 02 23:02:15 crc kubenswrapper[4903]: I1202 23:02:15.796218 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Dec 02 23:02:15 crc kubenswrapper[4903]: I1202 23:02:15.806163 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Dec 02 23:02:16 crc kubenswrapper[4903]: I1202 23:02:16.025979 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Dec 02 23:02:16 crc kubenswrapper[4903]: I1202 23:02:16.032901 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Dec 02 23:02:16 crc kubenswrapper[4903]: I1202 23:02:16.053661 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Dec 02 23:02:16 crc kubenswrapper[4903]: I1202 23:02:16.152781 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Dec 02 23:02:16 crc kubenswrapper[4903]: I1202 23:02:16.181844 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Dec 02 23:02:16 crc kubenswrapper[4903]: I1202 23:02:16.198399 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Dec 02 23:02:16 crc kubenswrapper[4903]: I1202 23:02:16.241296 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Dec 02 23:02:16 crc kubenswrapper[4903]: I1202 23:02:16.322508 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Dec 02 23:02:16 crc kubenswrapper[4903]: I1202 23:02:16.491409 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Dec 02 23:02:16 crc kubenswrapper[4903]: I1202 23:02:16.575702 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Dec 02 23:02:16 crc kubenswrapper[4903]: I1202 23:02:16.622765 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Dec 02 23:02:16 crc kubenswrapper[4903]: I1202 23:02:16.690239 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Dec 02 23:02:16 crc kubenswrapper[4903]: I1202 23:02:16.704821 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Dec 02 23:02:16 crc kubenswrapper[4903]: I1202 23:02:16.877375 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Dec 02 23:02:16 crc kubenswrapper[4903]: I1202 23:02:16.934385 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Dec 02 23:02:17 crc kubenswrapper[4903]: I1202 23:02:17.090953 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Dec 02 23:02:17 crc kubenswrapper[4903]: I1202 23:02:17.125877 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Dec 02 23:02:17 crc kubenswrapper[4903]: I1202 23:02:17.177207 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Dec 02 23:02:17 crc kubenswrapper[4903]: I1202 23:02:17.287556 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Dec 02 23:02:17 crc kubenswrapper[4903]: I1202 23:02:17.348129 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Dec 02 23:02:17 crc kubenswrapper[4903]: I1202 23:02:17.376069 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Dec 02 23:02:17 crc kubenswrapper[4903]: I1202 23:02:17.562033 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Dec 02 23:02:17 crc kubenswrapper[4903]: I1202 23:02:17.649769 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Dec 02 23:02:17 crc kubenswrapper[4903]: I1202 23:02:17.702846 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Dec 02 23:02:17 crc kubenswrapper[4903]: I1202 23:02:17.827854 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Dec 02 23:02:17 crc kubenswrapper[4903]: I1202 23:02:17.954804 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Dec 02 23:02:17 crc kubenswrapper[4903]: I1202 23:02:17.983358 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Dec 02 23:02:18 crc kubenswrapper[4903]: I1202 23:02:18.089056 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Dec 02 23:02:18 crc kubenswrapper[4903]: I1202 23:02:18.119222 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Dec 02 23:02:18 crc kubenswrapper[4903]: I1202 23:02:18.229522 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Dec 02 23:02:18 crc kubenswrapper[4903]: I1202 23:02:18.423642 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Dec 02 23:02:18 crc kubenswrapper[4903]: I1202 23:02:18.566888 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Dec 02 23:02:18 crc kubenswrapper[4903]: I1202 23:02:18.592927 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Dec 02 23:02:18 crc kubenswrapper[4903]: I1202 23:02:18.632960 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Dec 02 23:02:18 crc kubenswrapper[4903]: I1202 23:02:18.723529 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Dec 02 23:02:18 crc kubenswrapper[4903]: I1202 23:02:18.804924 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Dec 02 23:02:18 crc kubenswrapper[4903]: I1202 23:02:18.807160 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Dec 02 23:02:18 crc kubenswrapper[4903]: I1202 23:02:18.870249 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Dec 02 23:02:19 crc kubenswrapper[4903]: I1202 23:02:19.066144 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Dec 02 23:02:19 crc kubenswrapper[4903]: I1202 23:02:19.089865 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 02 23:02:19 crc kubenswrapper[4903]: I1202 23:02:19.113215 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Dec 02 23:02:19 crc kubenswrapper[4903]: I1202 23:02:19.204398 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Dec 02 23:02:19 crc kubenswrapper[4903]: I1202 23:02:19.218884 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Dec 02 23:02:19 crc kubenswrapper[4903]: I1202 23:02:19.240978 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Dec 02 23:02:19 crc kubenswrapper[4903]: I1202 23:02:19.259947 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Dec 02 23:02:19 crc kubenswrapper[4903]: I1202 23:02:19.325435 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Dec 02 23:02:19 crc kubenswrapper[4903]: I1202 23:02:19.376927 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Dec 02 23:02:19 crc kubenswrapper[4903]: I1202 23:02:19.513723 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Dec 02 23:02:19 crc kubenswrapper[4903]: I1202 23:02:19.705188 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Dec 02 23:02:19 crc kubenswrapper[4903]: I1202 23:02:19.842885 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Dec 02 23:02:19 crc kubenswrapper[4903]: I1202 23:02:19.885533 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Dec 02 23:02:20 crc kubenswrapper[4903]: I1202 23:02:20.008495 4903 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Dec 02 23:02:20 crc kubenswrapper[4903]: I1202 23:02:20.022149 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Dec 02 23:02:20 crc kubenswrapper[4903]: I1202 23:02:20.194987 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Dec 02 23:02:20 crc kubenswrapper[4903]: I1202 23:02:20.613483 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Dec 02 23:02:20 crc kubenswrapper[4903]: I1202 23:02:20.628508 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Dec 02 23:02:20 crc kubenswrapper[4903]: I1202 23:02:20.628587 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 02 23:02:20 crc kubenswrapper[4903]: I1202 23:02:20.694772 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Dec 02 23:02:20 crc kubenswrapper[4903]: I1202 23:02:20.804679 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 02 23:02:20 crc kubenswrapper[4903]: I1202 23:02:20.804776 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 02 23:02:20 crc kubenswrapper[4903]: I1202 23:02:20.804803 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 02 23:02:20 crc kubenswrapper[4903]: I1202 23:02:20.804835 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 02 23:02:20 crc kubenswrapper[4903]: I1202 23:02:20.804869 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 02 23:02:20 crc kubenswrapper[4903]: I1202 23:02:20.805468 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 02 23:02:20 crc kubenswrapper[4903]: I1202 23:02:20.805765 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:02:20 crc kubenswrapper[4903]: I1202 23:02:20.806392 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:02:20 crc kubenswrapper[4903]: I1202 23:02:20.806444 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:02:20 crc kubenswrapper[4903]: I1202 23:02:20.811794 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 02 23:02:20 crc kubenswrapper[4903]: I1202 23:02:20.811850 4903 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="c94c8f5268b1653b7f89f1680ee352cb6188d33a8a5ad760cbc6ef2e6f6b4454" exitCode=137 Dec 02 23:02:20 crc kubenswrapper[4903]: I1202 23:02:20.811903 4903 scope.go:117] "RemoveContainer" containerID="c94c8f5268b1653b7f89f1680ee352cb6188d33a8a5ad760cbc6ef2e6f6b4454" Dec 02 23:02:20 crc kubenswrapper[4903]: I1202 23:02:20.812076 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 23:02:20 crc kubenswrapper[4903]: I1202 23:02:20.814311 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 02 23:02:20 crc kubenswrapper[4903]: I1202 23:02:20.815795 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:02:20 crc kubenswrapper[4903]: I1202 23:02:20.851145 4903 scope.go:117] "RemoveContainer" containerID="c94c8f5268b1653b7f89f1680ee352cb6188d33a8a5ad760cbc6ef2e6f6b4454" Dec 02 23:02:20 crc kubenswrapper[4903]: E1202 23:02:20.852352 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c94c8f5268b1653b7f89f1680ee352cb6188d33a8a5ad760cbc6ef2e6f6b4454\": container with ID starting with c94c8f5268b1653b7f89f1680ee352cb6188d33a8a5ad760cbc6ef2e6f6b4454 not found: ID does not exist" containerID="c94c8f5268b1653b7f89f1680ee352cb6188d33a8a5ad760cbc6ef2e6f6b4454" Dec 02 23:02:20 crc kubenswrapper[4903]: I1202 23:02:20.852393 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c94c8f5268b1653b7f89f1680ee352cb6188d33a8a5ad760cbc6ef2e6f6b4454"} err="failed to get container status \"c94c8f5268b1653b7f89f1680ee352cb6188d33a8a5ad760cbc6ef2e6f6b4454\": rpc error: code = NotFound desc = could not find container \"c94c8f5268b1653b7f89f1680ee352cb6188d33a8a5ad760cbc6ef2e6f6b4454\": container with ID starting with c94c8f5268b1653b7f89f1680ee352cb6188d33a8a5ad760cbc6ef2e6f6b4454 not found: ID does not exist" Dec 02 23:02:20 crc kubenswrapper[4903]: I1202 23:02:20.857160 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 02 23:02:20 crc kubenswrapper[4903]: I1202 23:02:20.906401 4903 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:20 crc kubenswrapper[4903]: I1202 23:02:20.906745 4903 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:20 crc kubenswrapper[4903]: I1202 23:02:20.906760 4903 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:20 crc kubenswrapper[4903]: I1202 23:02:20.906772 4903 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:20 crc kubenswrapper[4903]: I1202 23:02:20.906782 4903 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 02 23:02:21 crc kubenswrapper[4903]: I1202 23:02:21.129132 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 02 23:02:21 crc kubenswrapper[4903]: I1202 23:02:21.629575 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 02 23:02:41 crc kubenswrapper[4903]: I1202 23:02:41.459611 4903 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Dec 02 23:02:44 crc kubenswrapper[4903]: I1202 23:02:44.970088 4903 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 02 23:02:44 crc kubenswrapper[4903]: I1202 23:02:44.972035 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 02 23:02:44 crc kubenswrapper[4903]: I1202 23:02:44.972091 4903 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="e890fa7ca48e622bd87ab8049fbfdc438150f431cedcf3f0bdf76e59c028b575" exitCode=137 Dec 02 23:02:44 crc kubenswrapper[4903]: I1202 23:02:44.972119 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"e890fa7ca48e622bd87ab8049fbfdc438150f431cedcf3f0bdf76e59c028b575"} Dec 02 23:02:44 crc kubenswrapper[4903]: I1202 23:02:44.972150 4903 scope.go:117] "RemoveContainer" containerID="432f64f2f72aa0929037f132af615f06e35c6133e851c404ad3407d6d63b941c" Dec 02 23:02:45 crc kubenswrapper[4903]: I1202 23:02:45.980182 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 02 23:02:45 crc kubenswrapper[4903]: I1202 23:02:45.982691 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a37aba8e1aa480eafbd4c59a7bff7c74a45df2388561aebad56850b0c0b14fec"} Dec 02 23:02:54 crc kubenswrapper[4903]: I1202 23:02:54.247528 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 23:02:54 crc kubenswrapper[4903]: I1202 23:02:54.255153 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 23:02:54 crc kubenswrapper[4903]: I1202 23:02:54.303108 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 23:02:55 crc kubenswrapper[4903]: I1202 23:02:55.036434 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 23:03:00 crc kubenswrapper[4903]: I1202 23:03:00.697095 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-krzz4"] Dec 02 23:03:00 crc kubenswrapper[4903]: I1202 23:03:00.697591 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-krzz4" podUID="e4aae2be-27f2-43df-a96d-9d2fbc198a6f" containerName="registry-server" containerID="cri-o://965ec8eeaf96f09d7063fd981687142312ee567aa0b09f9b74a6ea81b72dfa43" gracePeriod=30 Dec 02 23:03:00 crc kubenswrapper[4903]: I1202 23:03:00.702103 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v4mbf"] Dec 02 23:03:00 crc kubenswrapper[4903]: I1202 23:03:00.702342 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-v4mbf" podUID="80fb141b-48f4-4f70-afd8-78fdb7b3c20c" containerName="registry-server" 
containerID="cri-o://402263281be29214692118ff4901d364dee26d395a5286bbdf74ebc70a5c2df1" gracePeriod=30 Dec 02 23:03:00 crc kubenswrapper[4903]: I1202 23:03:00.712498 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vvxf5"] Dec 02 23:03:00 crc kubenswrapper[4903]: I1202 23:03:00.712732 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-vvxf5" podUID="4b25a40b-8bba-423f-b3fa-5b58e3d18423" containerName="marketplace-operator" containerID="cri-o://425252b3d54d88b331a1195e844b19a8be7315f6698fd26451d22b7e604fb1c4" gracePeriod=30 Dec 02 23:03:00 crc kubenswrapper[4903]: I1202 23:03:00.720664 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w4ft5"] Dec 02 23:03:00 crc kubenswrapper[4903]: I1202 23:03:00.720897 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-w4ft5" podUID="8ef1517f-01d9-44a7-a73c-9ed3d727a8cf" containerName="registry-server" containerID="cri-o://6d99a9369b95e8e5c6762d82560470348a34b9ccd76d7a4c55045a7cd2fb34cd" gracePeriod=30 Dec 02 23:03:00 crc kubenswrapper[4903]: I1202 23:03:00.733562 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n8bdg"] Dec 02 23:03:00 crc kubenswrapper[4903]: I1202 23:03:00.733827 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n8bdg" podUID="e0630099-58c0-41ce-8817-013f7bff4749" containerName="registry-server" containerID="cri-o://64a4e1096c14f023200f333616062d7e44375a2823d7b9bd32202fe9f928d04c" gracePeriod=30 Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.564700 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v4mbf" Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.629862 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80fb141b-48f4-4f70-afd8-78fdb7b3c20c-utilities\") pod \"80fb141b-48f4-4f70-afd8-78fdb7b3c20c\" (UID: \"80fb141b-48f4-4f70-afd8-78fdb7b3c20c\") " Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.629928 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kflkk\" (UniqueName: \"kubernetes.io/projected/80fb141b-48f4-4f70-afd8-78fdb7b3c20c-kube-api-access-kflkk\") pod \"80fb141b-48f4-4f70-afd8-78fdb7b3c20c\" (UID: \"80fb141b-48f4-4f70-afd8-78fdb7b3c20c\") " Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.630070 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80fb141b-48f4-4f70-afd8-78fdb7b3c20c-catalog-content\") pod \"80fb141b-48f4-4f70-afd8-78fdb7b3c20c\" (UID: \"80fb141b-48f4-4f70-afd8-78fdb7b3c20c\") " Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.632907 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80fb141b-48f4-4f70-afd8-78fdb7b3c20c-utilities" (OuterVolumeSpecName: "utilities") pod "80fb141b-48f4-4f70-afd8-78fdb7b3c20c" (UID: "80fb141b-48f4-4f70-afd8-78fdb7b3c20c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.657133 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80fb141b-48f4-4f70-afd8-78fdb7b3c20c-kube-api-access-kflkk" (OuterVolumeSpecName: "kube-api-access-kflkk") pod "80fb141b-48f4-4f70-afd8-78fdb7b3c20c" (UID: "80fb141b-48f4-4f70-afd8-78fdb7b3c20c"). InnerVolumeSpecName "kube-api-access-kflkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.698737 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vvxf5" Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.703974 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-krzz4" Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.718342 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80fb141b-48f4-4f70-afd8-78fdb7b3c20c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "80fb141b-48f4-4f70-afd8-78fdb7b3c20c" (UID: "80fb141b-48f4-4f70-afd8-78fdb7b3c20c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.731528 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kflkk\" (UniqueName: \"kubernetes.io/projected/80fb141b-48f4-4f70-afd8-78fdb7b3c20c-kube-api-access-kflkk\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.731573 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80fb141b-48f4-4f70-afd8-78fdb7b3c20c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.731587 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80fb141b-48f4-4f70-afd8-78fdb7b3c20c-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.739821 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w4ft5" Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.833160 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4aae2be-27f2-43df-a96d-9d2fbc198a6f-catalog-content\") pod \"e4aae2be-27f2-43df-a96d-9d2fbc198a6f\" (UID: \"e4aae2be-27f2-43df-a96d-9d2fbc198a6f\") " Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.833246 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4b25a40b-8bba-423f-b3fa-5b58e3d18423-marketplace-operator-metrics\") pod \"4b25a40b-8bba-423f-b3fa-5b58e3d18423\" (UID: \"4b25a40b-8bba-423f-b3fa-5b58e3d18423\") " Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.833279 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b25a40b-8bba-423f-b3fa-5b58e3d18423-marketplace-trusted-ca\") pod \"4b25a40b-8bba-423f-b3fa-5b58e3d18423\" (UID: \"4b25a40b-8bba-423f-b3fa-5b58e3d18423\") " Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.833300 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs6fx\" (UniqueName: \"kubernetes.io/projected/e4aae2be-27f2-43df-a96d-9d2fbc198a6f-kube-api-access-qs6fx\") pod \"e4aae2be-27f2-43df-a96d-9d2fbc198a6f\" (UID: \"e4aae2be-27f2-43df-a96d-9d2fbc198a6f\") " Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.833327 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ef1517f-01d9-44a7-a73c-9ed3d727a8cf-catalog-content\") pod \"8ef1517f-01d9-44a7-a73c-9ed3d727a8cf\" (UID: \"8ef1517f-01d9-44a7-a73c-9ed3d727a8cf\") " Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.833346 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4aae2be-27f2-43df-a96d-9d2fbc198a6f-utilities\") pod \"e4aae2be-27f2-43df-a96d-9d2fbc198a6f\" (UID: \"e4aae2be-27f2-43df-a96d-9d2fbc198a6f\") " Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.833401 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fr85h\" (UniqueName: \"kubernetes.io/projected/8ef1517f-01d9-44a7-a73c-9ed3d727a8cf-kube-api-access-fr85h\") pod \"8ef1517f-01d9-44a7-a73c-9ed3d727a8cf\" (UID: \"8ef1517f-01d9-44a7-a73c-9ed3d727a8cf\") " Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.833423 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bsk8\" (UniqueName: \"kubernetes.io/projected/4b25a40b-8bba-423f-b3fa-5b58e3d18423-kube-api-access-8bsk8\") pod \"4b25a40b-8bba-423f-b3fa-5b58e3d18423\" (UID: \"4b25a40b-8bba-423f-b3fa-5b58e3d18423\") " Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.833470 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ef1517f-01d9-44a7-a73c-9ed3d727a8cf-utilities\") pod \"8ef1517f-01d9-44a7-a73c-9ed3d727a8cf\" (UID: \"8ef1517f-01d9-44a7-a73c-9ed3d727a8cf\") " Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.834356 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ef1517f-01d9-44a7-a73c-9ed3d727a8cf-utilities" 
(OuterVolumeSpecName: "utilities") pod "8ef1517f-01d9-44a7-a73c-9ed3d727a8cf" (UID: "8ef1517f-01d9-44a7-a73c-9ed3d727a8cf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.835524 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b25a40b-8bba-423f-b3fa-5b58e3d18423-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "4b25a40b-8bba-423f-b3fa-5b58e3d18423" (UID: "4b25a40b-8bba-423f-b3fa-5b58e3d18423"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.837699 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4aae2be-27f2-43df-a96d-9d2fbc198a6f-utilities" (OuterVolumeSpecName: "utilities") pod "e4aae2be-27f2-43df-a96d-9d2fbc198a6f" (UID: "e4aae2be-27f2-43df-a96d-9d2fbc198a6f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.840350 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ef1517f-01d9-44a7-a73c-9ed3d727a8cf-kube-api-access-fr85h" (OuterVolumeSpecName: "kube-api-access-fr85h") pod "8ef1517f-01d9-44a7-a73c-9ed3d727a8cf" (UID: "8ef1517f-01d9-44a7-a73c-9ed3d727a8cf"). InnerVolumeSpecName "kube-api-access-fr85h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.840578 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b25a40b-8bba-423f-b3fa-5b58e3d18423-kube-api-access-8bsk8" (OuterVolumeSpecName: "kube-api-access-8bsk8") pod "4b25a40b-8bba-423f-b3fa-5b58e3d18423" (UID: "4b25a40b-8bba-423f-b3fa-5b58e3d18423"). InnerVolumeSpecName "kube-api-access-8bsk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.840822 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4aae2be-27f2-43df-a96d-9d2fbc198a6f-kube-api-access-qs6fx" (OuterVolumeSpecName: "kube-api-access-qs6fx") pod "e4aae2be-27f2-43df-a96d-9d2fbc198a6f" (UID: "e4aae2be-27f2-43df-a96d-9d2fbc198a6f"). InnerVolumeSpecName "kube-api-access-qs6fx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.843050 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b25a40b-8bba-423f-b3fa-5b58e3d18423-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "4b25a40b-8bba-423f-b3fa-5b58e3d18423" (UID: "4b25a40b-8bba-423f-b3fa-5b58e3d18423"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.860139 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ef1517f-01d9-44a7-a73c-9ed3d727a8cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8ef1517f-01d9-44a7-a73c-9ed3d727a8cf" (UID: "8ef1517f-01d9-44a7-a73c-9ed3d727a8cf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.899872 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4aae2be-27f2-43df-a96d-9d2fbc198a6f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e4aae2be-27f2-43df-a96d-9d2fbc198a6f" (UID: "e4aae2be-27f2-43df-a96d-9d2fbc198a6f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.935018 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ef1517f-01d9-44a7-a73c-9ed3d727a8cf-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.935073 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4aae2be-27f2-43df-a96d-9d2fbc198a6f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.935089 4903 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4b25a40b-8bba-423f-b3fa-5b58e3d18423-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.935101 4903 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b25a40b-8bba-423f-b3fa-5b58e3d18423-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.935110 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs6fx\" (UniqueName: \"kubernetes.io/projected/e4aae2be-27f2-43df-a96d-9d2fbc198a6f-kube-api-access-qs6fx\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.935121 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ef1517f-01d9-44a7-a73c-9ed3d727a8cf-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.935129 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4aae2be-27f2-43df-a96d-9d2fbc198a6f-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.935138 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fr85h\" (UniqueName: \"kubernetes.io/projected/8ef1517f-01d9-44a7-a73c-9ed3d727a8cf-kube-api-access-fr85h\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:01 crc kubenswrapper[4903]: I1202 23:03:01.935171 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bsk8\" (UniqueName: \"kubernetes.io/projected/4b25a40b-8bba-423f-b3fa-5b58e3d18423-kube-api-access-8bsk8\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.072599 4903 generic.go:334] "Generic (PLEG): container finished" podID="80fb141b-48f4-4f70-afd8-78fdb7b3c20c" containerID="402263281be29214692118ff4901d364dee26d395a5286bbdf74ebc70a5c2df1" exitCode=0 Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.072741 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v4mbf" 
event={"ID":"80fb141b-48f4-4f70-afd8-78fdb7b3c20c","Type":"ContainerDied","Data":"402263281be29214692118ff4901d364dee26d395a5286bbdf74ebc70a5c2df1"} Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.072778 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v4mbf" event={"ID":"80fb141b-48f4-4f70-afd8-78fdb7b3c20c","Type":"ContainerDied","Data":"1c593e2ccc4503ae0edd1569e3d5f4af87d725b730ef9671fd516b5ed6a1250e"} Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.072802 4903 scope.go:117] "RemoveContainer" containerID="402263281be29214692118ff4901d364dee26d395a5286bbdf74ebc70a5c2df1" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.072997 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v4mbf" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.078281 4903 generic.go:334] "Generic (PLEG): container finished" podID="8ef1517f-01d9-44a7-a73c-9ed3d727a8cf" containerID="6d99a9369b95e8e5c6762d82560470348a34b9ccd76d7a4c55045a7cd2fb34cd" exitCode=0 Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.078376 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w4ft5" event={"ID":"8ef1517f-01d9-44a7-a73c-9ed3d727a8cf","Type":"ContainerDied","Data":"6d99a9369b95e8e5c6762d82560470348a34b9ccd76d7a4c55045a7cd2fb34cd"} Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.078398 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w4ft5" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.078428 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w4ft5" event={"ID":"8ef1517f-01d9-44a7-a73c-9ed3d727a8cf","Type":"ContainerDied","Data":"14c02f36c3b801109d1919df469ac3176bf476d22e9435b9cf85f69fadc6d218"} Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.085074 4903 generic.go:334] "Generic (PLEG): container finished" podID="e0630099-58c0-41ce-8817-013f7bff4749" containerID="64a4e1096c14f023200f333616062d7e44375a2823d7b9bd32202fe9f928d04c" exitCode=0 Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.085154 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8bdg" event={"ID":"e0630099-58c0-41ce-8817-013f7bff4749","Type":"ContainerDied","Data":"64a4e1096c14f023200f333616062d7e44375a2823d7b9bd32202fe9f928d04c"} Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.096520 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vvxf5" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.095735 4903 generic.go:334] "Generic (PLEG): container finished" podID="4b25a40b-8bba-423f-b3fa-5b58e3d18423" containerID="425252b3d54d88b331a1195e844b19a8be7315f6698fd26451d22b7e604fb1c4" exitCode=0 Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.096581 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vvxf5" event={"ID":"4b25a40b-8bba-423f-b3fa-5b58e3d18423","Type":"ContainerDied","Data":"425252b3d54d88b331a1195e844b19a8be7315f6698fd26451d22b7e604fb1c4"} Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.097843 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vvxf5" event={"ID":"4b25a40b-8bba-423f-b3fa-5b58e3d18423","Type":"ContainerDied","Data":"2e94ce108d3d9702a07135e7120271834496b55dae4f58b23fe8159975faf6e3"} Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.103697 4903 scope.go:117] "RemoveContainer" containerID="76fac451defcd4e4a289d8a653d647298ed5c69c8ad45928feaaaf0599fd7185" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.108857 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v4mbf"] Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.111235 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-v4mbf"] Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.117366 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krzz4" event={"ID":"e4aae2be-27f2-43df-a96d-9d2fbc198a6f","Type":"ContainerDied","Data":"965ec8eeaf96f09d7063fd981687142312ee567aa0b09f9b74a6ea81b72dfa43"} Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.117419 4903 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/certified-operators-krzz4" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.131308 4903 generic.go:334] "Generic (PLEG): container finished" podID="e4aae2be-27f2-43df-a96d-9d2fbc198a6f" containerID="965ec8eeaf96f09d7063fd981687142312ee567aa0b09f9b74a6ea81b72dfa43" exitCode=0 Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.141576 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krzz4" event={"ID":"e4aae2be-27f2-43df-a96d-9d2fbc198a6f","Type":"ContainerDied","Data":"475931a9757dbb63a10d586dd974c6f6197c54d82c62298aede2a168768cf6fc"} Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.141601 4903 scope.go:117] "RemoveContainer" containerID="d4ad12c577a52b44781468ea48e010b594aa53d53214509bb5da681cd31da604" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.148968 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w4ft5"] Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.158883 4903 scope.go:117] "RemoveContainer" containerID="402263281be29214692118ff4901d364dee26d395a5286bbdf74ebc70a5c2df1" Dec 02 23:03:02 crc kubenswrapper[4903]: E1202 23:03:02.159417 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"402263281be29214692118ff4901d364dee26d395a5286bbdf74ebc70a5c2df1\": container with ID starting with 402263281be29214692118ff4901d364dee26d395a5286bbdf74ebc70a5c2df1 not found: ID does not exist" containerID="402263281be29214692118ff4901d364dee26d395a5286bbdf74ebc70a5c2df1" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.159472 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"402263281be29214692118ff4901d364dee26d395a5286bbdf74ebc70a5c2df1"} err="failed to get container status \"402263281be29214692118ff4901d364dee26d395a5286bbdf74ebc70a5c2df1\": rpc error: code = NotFound desc = could not find container \"402263281be29214692118ff4901d364dee26d395a5286bbdf74ebc70a5c2df1\": container with ID starting with 402263281be29214692118ff4901d364dee26d395a5286bbdf74ebc70a5c2df1 not found: ID does not exist" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.159502 4903 scope.go:117] "RemoveContainer" containerID="76fac451defcd4e4a289d8a653d647298ed5c69c8ad45928feaaaf0599fd7185" Dec 02 23:03:02 crc kubenswrapper[4903]: E1202 23:03:02.160279 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76fac451defcd4e4a289d8a653d647298ed5c69c8ad45928feaaaf0599fd7185\": container with ID starting with 76fac451defcd4e4a289d8a653d647298ed5c69c8ad45928feaaaf0599fd7185 not found: ID does not exist" containerID="76fac451defcd4e4a289d8a653d647298ed5c69c8ad45928feaaaf0599fd7185" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.160310 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76fac451defcd4e4a289d8a653d647298ed5c69c8ad45928feaaaf0599fd7185"} err="failed to get container status \"76fac451defcd4e4a289d8a653d647298ed5c69c8ad45928feaaaf0599fd7185\": rpc error: code = NotFound desc = could not find container \"76fac451defcd4e4a289d8a653d647298ed5c69c8ad45928feaaaf0599fd7185\": container with ID starting with 76fac451defcd4e4a289d8a653d647298ed5c69c8ad45928feaaaf0599fd7185 not found: ID does not exist" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.160337 4903 
scope.go:117] "RemoveContainer" containerID="d4ad12c577a52b44781468ea48e010b594aa53d53214509bb5da681cd31da604" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.160402 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-w4ft5"] Dec 02 23:03:02 crc kubenswrapper[4903]: E1202 23:03:02.160588 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4ad12c577a52b44781468ea48e010b594aa53d53214509bb5da681cd31da604\": container with ID starting with d4ad12c577a52b44781468ea48e010b594aa53d53214509bb5da681cd31da604 not found: ID does not exist" containerID="d4ad12c577a52b44781468ea48e010b594aa53d53214509bb5da681cd31da604" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.160620 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4ad12c577a52b44781468ea48e010b594aa53d53214509bb5da681cd31da604"} err="failed to get container status \"d4ad12c577a52b44781468ea48e010b594aa53d53214509bb5da681cd31da604\": rpc error: code = NotFound desc = could not find container \"d4ad12c577a52b44781468ea48e010b594aa53d53214509bb5da681cd31da604\": container with ID starting with d4ad12c577a52b44781468ea48e010b594aa53d53214509bb5da681cd31da604 not found: ID does not exist" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.160643 4903 scope.go:117] "RemoveContainer" containerID="6d99a9369b95e8e5c6762d82560470348a34b9ccd76d7a4c55045a7cd2fb34cd" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.167569 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vvxf5"] Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.173996 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vvxf5"] Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.190449 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-krzz4"] Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.193986 4903 scope.go:117] "RemoveContainer" containerID="0e9822969a856de25458d4a09f420ba15757a2de815033b71bb09c90f998669b" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.194790 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-krzz4"] Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.220749 4903 scope.go:117] "RemoveContainer" containerID="a73d6ef0b4490f99b9537684986e7909640b75ed5fa0fc7d41880f3d2764bafd" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.235114 4903 scope.go:117] "RemoveContainer" containerID="6d99a9369b95e8e5c6762d82560470348a34b9ccd76d7a4c55045a7cd2fb34cd" Dec 02 23:03:02 crc kubenswrapper[4903]: E1202 23:03:02.235519 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d99a9369b95e8e5c6762d82560470348a34b9ccd76d7a4c55045a7cd2fb34cd\": container with ID starting with 6d99a9369b95e8e5c6762d82560470348a34b9ccd76d7a4c55045a7cd2fb34cd not found: ID does not exist" containerID="6d99a9369b95e8e5c6762d82560470348a34b9ccd76d7a4c55045a7cd2fb34cd" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.235552 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d99a9369b95e8e5c6762d82560470348a34b9ccd76d7a4c55045a7cd2fb34cd"} err="failed to get container status 
\"6d99a9369b95e8e5c6762d82560470348a34b9ccd76d7a4c55045a7cd2fb34cd\": rpc error: code = NotFound desc = could not find container \"6d99a9369b95e8e5c6762d82560470348a34b9ccd76d7a4c55045a7cd2fb34cd\": container with ID starting with 6d99a9369b95e8e5c6762d82560470348a34b9ccd76d7a4c55045a7cd2fb34cd not found: ID does not exist" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.235573 4903 scope.go:117] "RemoveContainer" containerID="0e9822969a856de25458d4a09f420ba15757a2de815033b71bb09c90f998669b" Dec 02 23:03:02 crc kubenswrapper[4903]: E1202 23:03:02.236053 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e9822969a856de25458d4a09f420ba15757a2de815033b71bb09c90f998669b\": container with ID starting with 0e9822969a856de25458d4a09f420ba15757a2de815033b71bb09c90f998669b not found: ID does not exist" containerID="0e9822969a856de25458d4a09f420ba15757a2de815033b71bb09c90f998669b" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.236086 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e9822969a856de25458d4a09f420ba15757a2de815033b71bb09c90f998669b"} err="failed to get container status \"0e9822969a856de25458d4a09f420ba15757a2de815033b71bb09c90f998669b\": rpc error: code = NotFound desc = could not find container \"0e9822969a856de25458d4a09f420ba15757a2de815033b71bb09c90f998669b\": container with ID starting with 0e9822969a856de25458d4a09f420ba15757a2de815033b71bb09c90f998669b not found: ID does not exist" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.236101 4903 scope.go:117] "RemoveContainer" containerID="a73d6ef0b4490f99b9537684986e7909640b75ed5fa0fc7d41880f3d2764bafd" Dec 02 23:03:02 crc kubenswrapper[4903]: E1202 23:03:02.236953 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a73d6ef0b4490f99b9537684986e7909640b75ed5fa0fc7d41880f3d2764bafd\": container with ID starting with a73d6ef0b4490f99b9537684986e7909640b75ed5fa0fc7d41880f3d2764bafd not found: ID does not exist" containerID="a73d6ef0b4490f99b9537684986e7909640b75ed5fa0fc7d41880f3d2764bafd" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.237037 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a73d6ef0b4490f99b9537684986e7909640b75ed5fa0fc7d41880f3d2764bafd"} err="failed to get container status \"a73d6ef0b4490f99b9537684986e7909640b75ed5fa0fc7d41880f3d2764bafd\": rpc error: code = NotFound desc = could not find container \"a73d6ef0b4490f99b9537684986e7909640b75ed5fa0fc7d41880f3d2764bafd\": container with ID starting with a73d6ef0b4490f99b9537684986e7909640b75ed5fa0fc7d41880f3d2764bafd not found: ID does not exist" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.237127 4903 scope.go:117] "RemoveContainer" containerID="425252b3d54d88b331a1195e844b19a8be7315f6698fd26451d22b7e604fb1c4" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.252245 4903 scope.go:117] "RemoveContainer" containerID="425252b3d54d88b331a1195e844b19a8be7315f6698fd26451d22b7e604fb1c4" Dec 02 23:03:02 crc kubenswrapper[4903]: E1202 23:03:02.252853 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"425252b3d54d88b331a1195e844b19a8be7315f6698fd26451d22b7e604fb1c4\": container with ID starting with 425252b3d54d88b331a1195e844b19a8be7315f6698fd26451d22b7e604fb1c4 not found: ID does not exist" 
containerID="425252b3d54d88b331a1195e844b19a8be7315f6698fd26451d22b7e604fb1c4" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.252894 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"425252b3d54d88b331a1195e844b19a8be7315f6698fd26451d22b7e604fb1c4"} err="failed to get container status \"425252b3d54d88b331a1195e844b19a8be7315f6698fd26451d22b7e604fb1c4\": rpc error: code = NotFound desc = could not find container \"425252b3d54d88b331a1195e844b19a8be7315f6698fd26451d22b7e604fb1c4\": container with ID starting with 425252b3d54d88b331a1195e844b19a8be7315f6698fd26451d22b7e604fb1c4 not found: ID does not exist" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.252924 4903 scope.go:117] "RemoveContainer" containerID="965ec8eeaf96f09d7063fd981687142312ee567aa0b09f9b74a6ea81b72dfa43" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.269259 4903 scope.go:117] "RemoveContainer" containerID="495514f2d64573dcb6ab8007d870a00b94dc16c8ddf597a7af14504e8149031b" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.289206 4903 scope.go:117] "RemoveContainer" containerID="09ad82bc3b64914469668271709378b4ed594273dcd03f21a7d93590c57dcb65" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.312249 4903 scope.go:117] "RemoveContainer" containerID="965ec8eeaf96f09d7063fd981687142312ee567aa0b09f9b74a6ea81b72dfa43" Dec 02 23:03:02 crc kubenswrapper[4903]: E1202 23:03:02.312856 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"965ec8eeaf96f09d7063fd981687142312ee567aa0b09f9b74a6ea81b72dfa43\": container with ID starting with 965ec8eeaf96f09d7063fd981687142312ee567aa0b09f9b74a6ea81b72dfa43 not found: ID does not exist" containerID="965ec8eeaf96f09d7063fd981687142312ee567aa0b09f9b74a6ea81b72dfa43" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.312888 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"965ec8eeaf96f09d7063fd981687142312ee567aa0b09f9b74a6ea81b72dfa43"} err="failed to get container status \"965ec8eeaf96f09d7063fd981687142312ee567aa0b09f9b74a6ea81b72dfa43\": rpc error: code = NotFound desc = could not find container \"965ec8eeaf96f09d7063fd981687142312ee567aa0b09f9b74a6ea81b72dfa43\": container with ID starting with 965ec8eeaf96f09d7063fd981687142312ee567aa0b09f9b74a6ea81b72dfa43 not found: ID does not exist" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.312910 4903 scope.go:117] "RemoveContainer" containerID="495514f2d64573dcb6ab8007d870a00b94dc16c8ddf597a7af14504e8149031b" Dec 02 23:03:02 crc kubenswrapper[4903]: E1202 23:03:02.313463 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"495514f2d64573dcb6ab8007d870a00b94dc16c8ddf597a7af14504e8149031b\": container with ID starting with 495514f2d64573dcb6ab8007d870a00b94dc16c8ddf597a7af14504e8149031b not found: ID does not exist" containerID="495514f2d64573dcb6ab8007d870a00b94dc16c8ddf597a7af14504e8149031b" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.313497 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"495514f2d64573dcb6ab8007d870a00b94dc16c8ddf597a7af14504e8149031b"} err="failed to get container status \"495514f2d64573dcb6ab8007d870a00b94dc16c8ddf597a7af14504e8149031b\": rpc error: code = NotFound desc = could not find container 
\"495514f2d64573dcb6ab8007d870a00b94dc16c8ddf597a7af14504e8149031b\": container with ID starting with 495514f2d64573dcb6ab8007d870a00b94dc16c8ddf597a7af14504e8149031b not found: ID does not exist" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.313519 4903 scope.go:117] "RemoveContainer" containerID="09ad82bc3b64914469668271709378b4ed594273dcd03f21a7d93590c57dcb65" Dec 02 23:03:02 crc kubenswrapper[4903]: E1202 23:03:02.314186 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09ad82bc3b64914469668271709378b4ed594273dcd03f21a7d93590c57dcb65\": container with ID starting with 09ad82bc3b64914469668271709378b4ed594273dcd03f21a7d93590c57dcb65 not found: ID does not exist" containerID="09ad82bc3b64914469668271709378b4ed594273dcd03f21a7d93590c57dcb65" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.314217 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09ad82bc3b64914469668271709378b4ed594273dcd03f21a7d93590c57dcb65"} err="failed to get container status \"09ad82bc3b64914469668271709378b4ed594273dcd03f21a7d93590c57dcb65\": rpc error: code = NotFound desc = could not find container \"09ad82bc3b64914469668271709378b4ed594273dcd03f21a7d93590c57dcb65\": container with ID starting with 09ad82bc3b64914469668271709378b4ed594273dcd03f21a7d93590c57dcb65 not found: ID does not exist" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.520262 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n8bdg" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.644646 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0630099-58c0-41ce-8817-013f7bff4749-catalog-content\") pod \"e0630099-58c0-41ce-8817-013f7bff4749\" (UID: \"e0630099-58c0-41ce-8817-013f7bff4749\") " Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.644718 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0630099-58c0-41ce-8817-013f7bff4749-utilities\") pod \"e0630099-58c0-41ce-8817-013f7bff4749\" (UID: \"e0630099-58c0-41ce-8817-013f7bff4749\") " Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.644751 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pknkc\" (UniqueName: \"kubernetes.io/projected/e0630099-58c0-41ce-8817-013f7bff4749-kube-api-access-pknkc\") pod \"e0630099-58c0-41ce-8817-013f7bff4749\" (UID: \"e0630099-58c0-41ce-8817-013f7bff4749\") " Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.645580 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0630099-58c0-41ce-8817-013f7bff4749-utilities" (OuterVolumeSpecName: "utilities") pod "e0630099-58c0-41ce-8817-013f7bff4749" (UID: "e0630099-58c0-41ce-8817-013f7bff4749"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.650800 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0630099-58c0-41ce-8817-013f7bff4749-kube-api-access-pknkc" (OuterVolumeSpecName: "kube-api-access-pknkc") pod "e0630099-58c0-41ce-8817-013f7bff4749" (UID: "e0630099-58c0-41ce-8817-013f7bff4749"). InnerVolumeSpecName "kube-api-access-pknkc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.691235 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bxtgc"] Dec 02 23:03:02 crc kubenswrapper[4903]: E1202 23:03:02.691495 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ef1517f-01d9-44a7-a73c-9ed3d727a8cf" containerName="registry-server" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.691512 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ef1517f-01d9-44a7-a73c-9ed3d727a8cf" containerName="registry-server" Dec 02 23:03:02 crc kubenswrapper[4903]: E1202 23:03:02.691522 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ef1517f-01d9-44a7-a73c-9ed3d727a8cf" containerName="extract-content" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.691530 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ef1517f-01d9-44a7-a73c-9ed3d727a8cf" containerName="extract-content" Dec 02 23:03:02 crc kubenswrapper[4903]: E1202 23:03:02.691543 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0630099-58c0-41ce-8817-013f7bff4749" containerName="extract-content" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.691552 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0630099-58c0-41ce-8817-013f7bff4749" containerName="extract-content" Dec 02 23:03:02 crc kubenswrapper[4903]: E1202 23:03:02.691563 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80fb141b-48f4-4f70-afd8-78fdb7b3c20c" containerName="extract-content" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.691571 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="80fb141b-48f4-4f70-afd8-78fdb7b3c20c" containerName="extract-content" Dec 02 23:03:02 crc kubenswrapper[4903]: E1202 23:03:02.691594 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.691603 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 02 23:03:02 crc kubenswrapper[4903]: E1202 23:03:02.691618 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80fb141b-48f4-4f70-afd8-78fdb7b3c20c" containerName="extract-utilities" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.691626 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="80fb141b-48f4-4f70-afd8-78fdb7b3c20c" containerName="extract-utilities" Dec 02 23:03:02 crc kubenswrapper[4903]: E1202 23:03:02.691637 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4aae2be-27f2-43df-a96d-9d2fbc198a6f" containerName="registry-server" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.691644 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4aae2be-27f2-43df-a96d-9d2fbc198a6f" containerName="registry-server" Dec 02 23:03:02 crc kubenswrapper[4903]: E1202 23:03:02.693973 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b25a40b-8bba-423f-b3fa-5b58e3d18423" containerName="marketplace-operator" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.693998 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b25a40b-8bba-423f-b3fa-5b58e3d18423" containerName="marketplace-operator" Dec 02 23:03:02 crc kubenswrapper[4903]: E1202 23:03:02.694016 4903 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e0630099-58c0-41ce-8817-013f7bff4749" containerName="extract-utilities" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.694025 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0630099-58c0-41ce-8817-013f7bff4749" containerName="extract-utilities" Dec 02 23:03:02 crc kubenswrapper[4903]: E1202 23:03:02.694056 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ef1517f-01d9-44a7-a73c-9ed3d727a8cf" containerName="extract-utilities" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.694063 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ef1517f-01d9-44a7-a73c-9ed3d727a8cf" containerName="extract-utilities" Dec 02 23:03:02 crc kubenswrapper[4903]: E1202 23:03:02.694073 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7bed574-70f1-4907-be3c-6cd655b3df0a" containerName="installer" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.694080 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7bed574-70f1-4907-be3c-6cd655b3df0a" containerName="installer" Dec 02 23:03:02 crc kubenswrapper[4903]: E1202 23:03:02.694090 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80fb141b-48f4-4f70-afd8-78fdb7b3c20c" containerName="registry-server" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.694099 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="80fb141b-48f4-4f70-afd8-78fdb7b3c20c" containerName="registry-server" Dec 02 23:03:02 crc kubenswrapper[4903]: E1202 23:03:02.694108 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4aae2be-27f2-43df-a96d-9d2fbc198a6f" containerName="extract-utilities" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.694115 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4aae2be-27f2-43df-a96d-9d2fbc198a6f" containerName="extract-utilities" Dec 02 23:03:02 crc kubenswrapper[4903]: E1202 23:03:02.694123 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0630099-58c0-41ce-8817-013f7bff4749" containerName="registry-server" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.694129 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0630099-58c0-41ce-8817-013f7bff4749" containerName="registry-server" Dec 02 23:03:02 crc kubenswrapper[4903]: E1202 23:03:02.694140 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4aae2be-27f2-43df-a96d-9d2fbc198a6f" containerName="extract-content" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.694149 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4aae2be-27f2-43df-a96d-9d2fbc198a6f" containerName="extract-content" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.694292 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0630099-58c0-41ce-8817-013f7bff4749" containerName="registry-server" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.694302 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="80fb141b-48f4-4f70-afd8-78fdb7b3c20c" containerName="registry-server" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.694312 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7bed574-70f1-4907-be3c-6cd655b3df0a" containerName="installer" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.694319 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4aae2be-27f2-43df-a96d-9d2fbc198a6f" containerName="registry-server" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.694330 4903 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.694338 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ef1517f-01d9-44a7-a73c-9ed3d727a8cf" containerName="registry-server" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.694349 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b25a40b-8bba-423f-b3fa-5b58e3d18423" containerName="marketplace-operator" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.695273 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bxtgc" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.699587 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.715882 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cp8sg"] Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.716512 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cp8sg" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.718764 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bxtgc"] Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.741975 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.746301 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa-catalog-content\") pod \"redhat-marketplace-bxtgc\" (UID: \"68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa\") " pod="openshift-marketplace/redhat-marketplace-bxtgc" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.746363 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grjfq\" (UniqueName: \"kubernetes.io/projected/68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa-kube-api-access-grjfq\") pod \"redhat-marketplace-bxtgc\" (UID: \"68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa\") " pod="openshift-marketplace/redhat-marketplace-bxtgc" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.746383 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa-utilities\") pod \"redhat-marketplace-bxtgc\" (UID: \"68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa\") " pod="openshift-marketplace/redhat-marketplace-bxtgc" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.746437 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0630099-58c0-41ce-8817-013f7bff4749-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.746448 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pknkc\" (UniqueName: \"kubernetes.io/projected/e0630099-58c0-41ce-8817-013f7bff4749-kube-api-access-pknkc\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.749763 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-cp8sg"] Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.750309 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.759613 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.817693 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pnzr7"] Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.817883 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pnzr7" podUID="896cc150-3871-46e2-b1f5-c31c25c54014" containerName="route-controller-manager" containerID="cri-o://6fe2a4ab72f9039606d0642d08e6686b3942fdb7a81766cb7b30b3f3532dbea4" gracePeriod=30 Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.823913 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0630099-58c0-41ce-8817-013f7bff4749-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e0630099-58c0-41ce-8817-013f7bff4749" (UID: "e0630099-58c0-41ce-8817-013f7bff4749"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.847540 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c62tn\" (UniqueName: \"kubernetes.io/projected/b6ef141a-9183-423b-85e6-e7a02cc32267-kube-api-access-c62tn\") pod \"marketplace-operator-79b997595-cp8sg\" (UID: \"b6ef141a-9183-423b-85e6-e7a02cc32267\") " pod="openshift-marketplace/marketplace-operator-79b997595-cp8sg" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.847619 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa-catalog-content\") pod \"redhat-marketplace-bxtgc\" (UID: \"68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa\") " pod="openshift-marketplace/redhat-marketplace-bxtgc" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.847666 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6ef141a-9183-423b-85e6-e7a02cc32267-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cp8sg\" (UID: \"b6ef141a-9183-423b-85e6-e7a02cc32267\") " pod="openshift-marketplace/marketplace-operator-79b997595-cp8sg" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.847723 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grjfq\" (UniqueName: \"kubernetes.io/projected/68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa-kube-api-access-grjfq\") pod \"redhat-marketplace-bxtgc\" (UID: \"68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa\") " pod="openshift-marketplace/redhat-marketplace-bxtgc" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.847745 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa-utilities\") pod \"redhat-marketplace-bxtgc\" (UID: \"68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa\") " 
pod="openshift-marketplace/redhat-marketplace-bxtgc" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.847778 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6ef141a-9183-423b-85e6-e7a02cc32267-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cp8sg\" (UID: \"b6ef141a-9183-423b-85e6-e7a02cc32267\") " pod="openshift-marketplace/marketplace-operator-79b997595-cp8sg" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.847827 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0630099-58c0-41ce-8817-013f7bff4749-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.848360 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa-catalog-content\") pod \"redhat-marketplace-bxtgc\" (UID: \"68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa\") " pod="openshift-marketplace/redhat-marketplace-bxtgc" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.848949 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa-utilities\") pod \"redhat-marketplace-bxtgc\" (UID: \"68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa\") " pod="openshift-marketplace/redhat-marketplace-bxtgc" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.875684 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9mlb7"] Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.875889 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-9mlb7" podUID="2c5439fc-734f-4efa-838f-68900d9453ec" containerName="controller-manager" containerID="cri-o://99882b84c258285cb6dbb70a2017e661d6b30f59ae5662a7db17d88fa193b212" gracePeriod=30 Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.908174 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-gqq5v"] Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.908818 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-gqq5v" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.913549 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grjfq\" (UniqueName: \"kubernetes.io/projected/68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa-kube-api-access-grjfq\") pod \"redhat-marketplace-bxtgc\" (UID: \"68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa\") " pod="openshift-marketplace/redhat-marketplace-bxtgc" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.927122 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-gqq5v"] Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.949315 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6ef141a-9183-423b-85e6-e7a02cc32267-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cp8sg\" (UID: \"b6ef141a-9183-423b-85e6-e7a02cc32267\") " pod="openshift-marketplace/marketplace-operator-79b997595-cp8sg" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.949399 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6ef141a-9183-423b-85e6-e7a02cc32267-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cp8sg\" (UID: \"b6ef141a-9183-423b-85e6-e7a02cc32267\") " pod="openshift-marketplace/marketplace-operator-79b997595-cp8sg" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.949440 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c62tn\" (UniqueName: \"kubernetes.io/projected/b6ef141a-9183-423b-85e6-e7a02cc32267-kube-api-access-c62tn\") pod \"marketplace-operator-79b997595-cp8sg\" (UID: \"b6ef141a-9183-423b-85e6-e7a02cc32267\") " pod="openshift-marketplace/marketplace-operator-79b997595-cp8sg" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.950576 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6ef141a-9183-423b-85e6-e7a02cc32267-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cp8sg\" (UID: \"b6ef141a-9183-423b-85e6-e7a02cc32267\") " pod="openshift-marketplace/marketplace-operator-79b997595-cp8sg" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.952926 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6ef141a-9183-423b-85e6-e7a02cc32267-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cp8sg\" (UID: \"b6ef141a-9183-423b-85e6-e7a02cc32267\") " pod="openshift-marketplace/marketplace-operator-79b997595-cp8sg" Dec 02 23:03:02 crc kubenswrapper[4903]: I1202 23:03:02.973925 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c62tn\" (UniqueName: \"kubernetes.io/projected/b6ef141a-9183-423b-85e6-e7a02cc32267-kube-api-access-c62tn\") pod \"marketplace-operator-79b997595-cp8sg\" (UID: \"b6ef141a-9183-423b-85e6-e7a02cc32267\") " pod="openshift-marketplace/marketplace-operator-79b997595-cp8sg" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.010272 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bxtgc" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.045793 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cp8sg" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.052831 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b0a86c29-9c44-43ee-ac72-c0c38ae11f5e-trusted-ca\") pod \"image-registry-66df7c8f76-gqq5v\" (UID: \"b0a86c29-9c44-43ee-ac72-c0c38ae11f5e\") " pod="openshift-image-registry/image-registry-66df7c8f76-gqq5v" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.052894 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-gqq5v\" (UID: \"b0a86c29-9c44-43ee-ac72-c0c38ae11f5e\") " pod="openshift-image-registry/image-registry-66df7c8f76-gqq5v" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.052934 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b0a86c29-9c44-43ee-ac72-c0c38ae11f5e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-gqq5v\" (UID: \"b0a86c29-9c44-43ee-ac72-c0c38ae11f5e\") " pod="openshift-image-registry/image-registry-66df7c8f76-gqq5v" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.052964 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b0a86c29-9c44-43ee-ac72-c0c38ae11f5e-registry-tls\") pod \"image-registry-66df7c8f76-gqq5v\" (UID: \"b0a86c29-9c44-43ee-ac72-c0c38ae11f5e\") " pod="openshift-image-registry/image-registry-66df7c8f76-gqq5v" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.052991 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b0a86c29-9c44-43ee-ac72-c0c38ae11f5e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-gqq5v\" (UID: \"b0a86c29-9c44-43ee-ac72-c0c38ae11f5e\") " pod="openshift-image-registry/image-registry-66df7c8f76-gqq5v" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.053019 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wm5d\" (UniqueName: \"kubernetes.io/projected/b0a86c29-9c44-43ee-ac72-c0c38ae11f5e-kube-api-access-4wm5d\") pod \"image-registry-66df7c8f76-gqq5v\" (UID: \"b0a86c29-9c44-43ee-ac72-c0c38ae11f5e\") " pod="openshift-image-registry/image-registry-66df7c8f76-gqq5v" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.053042 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b0a86c29-9c44-43ee-ac72-c0c38ae11f5e-bound-sa-token\") pod \"image-registry-66df7c8f76-gqq5v\" (UID: \"b0a86c29-9c44-43ee-ac72-c0c38ae11f5e\") " pod="openshift-image-registry/image-registry-66df7c8f76-gqq5v" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.053072 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/b0a86c29-9c44-43ee-ac72-c0c38ae11f5e-registry-certificates\") pod \"image-registry-66df7c8f76-gqq5v\" (UID: \"b0a86c29-9c44-43ee-ac72-c0c38ae11f5e\") " pod="openshift-image-registry/image-registry-66df7c8f76-gqq5v" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.124406 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-gqq5v\" (UID: \"b0a86c29-9c44-43ee-ac72-c0c38ae11f5e\") " pod="openshift-image-registry/image-registry-66df7c8f76-gqq5v" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.153967 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b0a86c29-9c44-43ee-ac72-c0c38ae11f5e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-gqq5v\" (UID: \"b0a86c29-9c44-43ee-ac72-c0c38ae11f5e\") " pod="openshift-image-registry/image-registry-66df7c8f76-gqq5v" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.154026 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wm5d\" (UniqueName: \"kubernetes.io/projected/b0a86c29-9c44-43ee-ac72-c0c38ae11f5e-kube-api-access-4wm5d\") pod \"image-registry-66df7c8f76-gqq5v\" (UID: \"b0a86c29-9c44-43ee-ac72-c0c38ae11f5e\") " pod="openshift-image-registry/image-registry-66df7c8f76-gqq5v" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.154055 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b0a86c29-9c44-43ee-ac72-c0c38ae11f5e-bound-sa-token\") pod \"image-registry-66df7c8f76-gqq5v\" (UID: \"b0a86c29-9c44-43ee-ac72-c0c38ae11f5e\") " pod="openshift-image-registry/image-registry-66df7c8f76-gqq5v" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.154075 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b0a86c29-9c44-43ee-ac72-c0c38ae11f5e-registry-certificates\") pod \"image-registry-66df7c8f76-gqq5v\" (UID: \"b0a86c29-9c44-43ee-ac72-c0c38ae11f5e\") " pod="openshift-image-registry/image-registry-66df7c8f76-gqq5v" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.154114 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b0a86c29-9c44-43ee-ac72-c0c38ae11f5e-trusted-ca\") pod \"image-registry-66df7c8f76-gqq5v\" (UID: \"b0a86c29-9c44-43ee-ac72-c0c38ae11f5e\") " pod="openshift-image-registry/image-registry-66df7c8f76-gqq5v" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.154148 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b0a86c29-9c44-43ee-ac72-c0c38ae11f5e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-gqq5v\" (UID: \"b0a86c29-9c44-43ee-ac72-c0c38ae11f5e\") " pod="openshift-image-registry/image-registry-66df7c8f76-gqq5v" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.154174 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b0a86c29-9c44-43ee-ac72-c0c38ae11f5e-registry-tls\") pod \"image-registry-66df7c8f76-gqq5v\" (UID: \"b0a86c29-9c44-43ee-ac72-c0c38ae11f5e\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-gqq5v" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.154464 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b0a86c29-9c44-43ee-ac72-c0c38ae11f5e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-gqq5v\" (UID: \"b0a86c29-9c44-43ee-ac72-c0c38ae11f5e\") " pod="openshift-image-registry/image-registry-66df7c8f76-gqq5v" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.156260 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b0a86c29-9c44-43ee-ac72-c0c38ae11f5e-trusted-ca\") pod \"image-registry-66df7c8f76-gqq5v\" (UID: \"b0a86c29-9c44-43ee-ac72-c0c38ae11f5e\") " pod="openshift-image-registry/image-registry-66df7c8f76-gqq5v" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.157766 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b0a86c29-9c44-43ee-ac72-c0c38ae11f5e-registry-certificates\") pod \"image-registry-66df7c8f76-gqq5v\" (UID: \"b0a86c29-9c44-43ee-ac72-c0c38ae11f5e\") " pod="openshift-image-registry/image-registry-66df7c8f76-gqq5v" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.159691 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b0a86c29-9c44-43ee-ac72-c0c38ae11f5e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-gqq5v\" (UID: \"b0a86c29-9c44-43ee-ac72-c0c38ae11f5e\") " pod="openshift-image-registry/image-registry-66df7c8f76-gqq5v" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.165633 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b0a86c29-9c44-43ee-ac72-c0c38ae11f5e-registry-tls\") pod \"image-registry-66df7c8f76-gqq5v\" (UID: \"b0a86c29-9c44-43ee-ac72-c0c38ae11f5e\") " pod="openshift-image-registry/image-registry-66df7c8f76-gqq5v" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.172381 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b0a86c29-9c44-43ee-ac72-c0c38ae11f5e-bound-sa-token\") pod \"image-registry-66df7c8f76-gqq5v\" (UID: \"b0a86c29-9c44-43ee-ac72-c0c38ae11f5e\") " pod="openshift-image-registry/image-registry-66df7c8f76-gqq5v" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.176167 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8bdg" event={"ID":"e0630099-58c0-41ce-8817-013f7bff4749","Type":"ContainerDied","Data":"1a2c0437c8de03ea66052d0720bc8b8497e556955fb1436924522024c1bef57a"} Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.176218 4903 scope.go:117] "RemoveContainer" containerID="64a4e1096c14f023200f333616062d7e44375a2823d7b9bd32202fe9f928d04c" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.176380 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n8bdg" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.179011 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wm5d\" (UniqueName: \"kubernetes.io/projected/b0a86c29-9c44-43ee-ac72-c0c38ae11f5e-kube-api-access-4wm5d\") pod \"image-registry-66df7c8f76-gqq5v\" (UID: \"b0a86c29-9c44-43ee-ac72-c0c38ae11f5e\") " pod="openshift-image-registry/image-registry-66df7c8f76-gqq5v" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.196742 4903 generic.go:334] "Generic (PLEG): container finished" podID="896cc150-3871-46e2-b1f5-c31c25c54014" containerID="6fe2a4ab72f9039606d0642d08e6686b3942fdb7a81766cb7b30b3f3532dbea4" exitCode=0 Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.196873 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pnzr7" event={"ID":"896cc150-3871-46e2-b1f5-c31c25c54014","Type":"ContainerDied","Data":"6fe2a4ab72f9039606d0642d08e6686b3942fdb7a81766cb7b30b3f3532dbea4"} Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.198517 4903 generic.go:334] "Generic (PLEG): container finished" podID="2c5439fc-734f-4efa-838f-68900d9453ec" containerID="99882b84c258285cb6dbb70a2017e661d6b30f59ae5662a7db17d88fa193b212" exitCode=0 Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.198543 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9mlb7" event={"ID":"2c5439fc-734f-4efa-838f-68900d9453ec","Type":"ContainerDied","Data":"99882b84c258285cb6dbb70a2017e661d6b30f59ae5662a7db17d88fa193b212"} Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.232612 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n8bdg"] Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.242397 4903 scope.go:117] "RemoveContainer" containerID="7256137130cfe7d17552bcd557ba6e174ed9c06028778a5d287d4d4a748cbf4d" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.246155 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n8bdg"] Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.259691 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pnzr7" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.267881 4903 scope.go:117] "RemoveContainer" containerID="c4ab90ab9fbe1c838cab684df58365cbbb7a9fbdc9349ad39af842c2c4e166ab" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.318517 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-gqq5v" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.321769 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9mlb7" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.356294 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tkpl\" (UniqueName: \"kubernetes.io/projected/2c5439fc-734f-4efa-838f-68900d9453ec-kube-api-access-7tkpl\") pod \"2c5439fc-734f-4efa-838f-68900d9453ec\" (UID: \"2c5439fc-734f-4efa-838f-68900d9453ec\") " Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.356360 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/896cc150-3871-46e2-b1f5-c31c25c54014-client-ca\") pod \"896cc150-3871-46e2-b1f5-c31c25c54014\" (UID: \"896cc150-3871-46e2-b1f5-c31c25c54014\") " Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.356414 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2c5439fc-734f-4efa-838f-68900d9453ec-proxy-ca-bundles\") pod \"2c5439fc-734f-4efa-838f-68900d9453ec\" (UID: \"2c5439fc-734f-4efa-838f-68900d9453ec\") " Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.356439 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c5439fc-734f-4efa-838f-68900d9453ec-client-ca\") pod \"2c5439fc-734f-4efa-838f-68900d9453ec\" (UID: \"2c5439fc-734f-4efa-838f-68900d9453ec\") " Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.356458 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/896cc150-3871-46e2-b1f5-c31c25c54014-serving-cert\") pod \"896cc150-3871-46e2-b1f5-c31c25c54014\" (UID: \"896cc150-3871-46e2-b1f5-c31c25c54014\") " Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.356475 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/896cc150-3871-46e2-b1f5-c31c25c54014-config\") pod \"896cc150-3871-46e2-b1f5-c31c25c54014\" (UID: \"896cc150-3871-46e2-b1f5-c31c25c54014\") " Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.356491 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c5439fc-734f-4efa-838f-68900d9453ec-serving-cert\") pod \"2c5439fc-734f-4efa-838f-68900d9453ec\" (UID: \"2c5439fc-734f-4efa-838f-68900d9453ec\") " Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.356507 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c5439fc-734f-4efa-838f-68900d9453ec-config\") pod \"2c5439fc-734f-4efa-838f-68900d9453ec\" (UID: \"2c5439fc-734f-4efa-838f-68900d9453ec\") " Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.356563 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skx5p\" (UniqueName: \"kubernetes.io/projected/896cc150-3871-46e2-b1f5-c31c25c54014-kube-api-access-skx5p\") pod \"896cc150-3871-46e2-b1f5-c31c25c54014\" (UID: \"896cc150-3871-46e2-b1f5-c31c25c54014\") " Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.357237 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c5439fc-734f-4efa-838f-68900d9453ec-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2c5439fc-734f-4efa-838f-68900d9453ec" 
(UID: "2c5439fc-734f-4efa-838f-68900d9453ec"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.357625 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/896cc150-3871-46e2-b1f5-c31c25c54014-config" (OuterVolumeSpecName: "config") pod "896cc150-3871-46e2-b1f5-c31c25c54014" (UID: "896cc150-3871-46e2-b1f5-c31c25c54014"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.358249 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c5439fc-734f-4efa-838f-68900d9453ec-client-ca" (OuterVolumeSpecName: "client-ca") pod "2c5439fc-734f-4efa-838f-68900d9453ec" (UID: "2c5439fc-734f-4efa-838f-68900d9453ec"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.358575 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/896cc150-3871-46e2-b1f5-c31c25c54014-client-ca" (OuterVolumeSpecName: "client-ca") pod "896cc150-3871-46e2-b1f5-c31c25c54014" (UID: "896cc150-3871-46e2-b1f5-c31c25c54014"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.358689 4903 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2c5439fc-734f-4efa-838f-68900d9453ec-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.358705 4903 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c5439fc-734f-4efa-838f-68900d9453ec-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.358716 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/896cc150-3871-46e2-b1f5-c31c25c54014-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.359817 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c5439fc-734f-4efa-838f-68900d9453ec-config" (OuterVolumeSpecName: "config") pod "2c5439fc-734f-4efa-838f-68900d9453ec" (UID: "2c5439fc-734f-4efa-838f-68900d9453ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.362466 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/896cc150-3871-46e2-b1f5-c31c25c54014-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "896cc150-3871-46e2-b1f5-c31c25c54014" (UID: "896cc150-3871-46e2-b1f5-c31c25c54014"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.362670 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/896cc150-3871-46e2-b1f5-c31c25c54014-kube-api-access-skx5p" (OuterVolumeSpecName: "kube-api-access-skx5p") pod "896cc150-3871-46e2-b1f5-c31c25c54014" (UID: "896cc150-3871-46e2-b1f5-c31c25c54014"). InnerVolumeSpecName "kube-api-access-skx5p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.362835 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c5439fc-734f-4efa-838f-68900d9453ec-kube-api-access-7tkpl" (OuterVolumeSpecName: "kube-api-access-7tkpl") pod "2c5439fc-734f-4efa-838f-68900d9453ec" (UID: "2c5439fc-734f-4efa-838f-68900d9453ec"). InnerVolumeSpecName "kube-api-access-7tkpl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.363199 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c5439fc-734f-4efa-838f-68900d9453ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2c5439fc-734f-4efa-838f-68900d9453ec" (UID: "2c5439fc-734f-4efa-838f-68900d9453ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.401612 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cp8sg"] Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.460068 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tkpl\" (UniqueName: \"kubernetes.io/projected/2c5439fc-734f-4efa-838f-68900d9453ec-kube-api-access-7tkpl\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.460096 4903 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/896cc150-3871-46e2-b1f5-c31c25c54014-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.460105 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/896cc150-3871-46e2-b1f5-c31c25c54014-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.460114 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c5439fc-734f-4efa-838f-68900d9453ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.460122 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c5439fc-734f-4efa-838f-68900d9453ec-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.460131 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skx5p\" (UniqueName: \"kubernetes.io/projected/896cc150-3871-46e2-b1f5-c31c25c54014-kube-api-access-skx5p\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.532254 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bxtgc"] Dec 02 23:03:03 crc kubenswrapper[4903]: W1202 23:03:03.544821 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68d3d00d_e9e1_4fd4_a290_2c49ec9aebaa.slice/crio-de9366a536154e2323f1f0faf6bd3e5a85332bc3cb50897dbee180931e2f540b WatchSource:0}: Error finding container de9366a536154e2323f1f0faf6bd3e5a85332bc3cb50897dbee180931e2f540b: Status 404 returned error can't find the container with id de9366a536154e2323f1f0faf6bd3e5a85332bc3cb50897dbee180931e2f540b Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.619821 4903 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="4b25a40b-8bba-423f-b3fa-5b58e3d18423" path="/var/lib/kubelet/pods/4b25a40b-8bba-423f-b3fa-5b58e3d18423/volumes" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.620526 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80fb141b-48f4-4f70-afd8-78fdb7b3c20c" path="/var/lib/kubelet/pods/80fb141b-48f4-4f70-afd8-78fdb7b3c20c/volumes" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.621105 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ef1517f-01d9-44a7-a73c-9ed3d727a8cf" path="/var/lib/kubelet/pods/8ef1517f-01d9-44a7-a73c-9ed3d727a8cf/volumes" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.622121 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0630099-58c0-41ce-8817-013f7bff4749" path="/var/lib/kubelet/pods/e0630099-58c0-41ce-8817-013f7bff4749/volumes" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.622889 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4aae2be-27f2-43df-a96d-9d2fbc198a6f" path="/var/lib/kubelet/pods/e4aae2be-27f2-43df-a96d-9d2fbc198a6f/volumes" Dec 02 23:03:03 crc kubenswrapper[4903]: I1202 23:03:03.663402 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-gqq5v"] Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.206032 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cp8sg" event={"ID":"b6ef141a-9183-423b-85e6-e7a02cc32267","Type":"ContainerStarted","Data":"900aef8ab31c7cbc95c54a2373f67471cebe7726472990f680942c9d95d5d4fb"} Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.206084 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cp8sg" event={"ID":"b6ef141a-9183-423b-85e6-e7a02cc32267","Type":"ContainerStarted","Data":"67e9dde2c83dea45da347fa0cbd462c12236271c09cdaca5c3eb9ff05a06fec7"} Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.207473 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-cp8sg" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.210449 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-gqq5v" event={"ID":"b0a86c29-9c44-43ee-ac72-c0c38ae11f5e","Type":"ContainerStarted","Data":"d7218bb1f4f4a4eb51d92c8e64ffb493e2675fe97d74d3aedd66931c7e4c8eda"} Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.210537 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-cp8sg" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.210551 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-gqq5v" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.210560 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-gqq5v" event={"ID":"b0a86c29-9c44-43ee-ac72-c0c38ae11f5e","Type":"ContainerStarted","Data":"40644d7adcca0e5c2f373cdefbaa7c745a872e827f8afd01315b0825270ee4a1"} Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.212098 4903 generic.go:334] "Generic (PLEG): container finished" podID="68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa" containerID="161cde3cd5f20f8ac77ed00043221706732a5a3e1a16f3281635d488c1a74c5e" exitCode=0 Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 
23:03:04.212180 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bxtgc" event={"ID":"68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa","Type":"ContainerDied","Data":"161cde3cd5f20f8ac77ed00043221706732a5a3e1a16f3281635d488c1a74c5e"} Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.212209 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bxtgc" event={"ID":"68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa","Type":"ContainerStarted","Data":"de9366a536154e2323f1f0faf6bd3e5a85332bc3cb50897dbee180931e2f540b"} Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.214015 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pnzr7" event={"ID":"896cc150-3871-46e2-b1f5-c31c25c54014","Type":"ContainerDied","Data":"91af0c717c1c25bb2e045696d2cb66627ca939c648aee25c3fc54bb502a39644"} Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.214022 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pnzr7" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.214056 4903 scope.go:117] "RemoveContainer" containerID="6fe2a4ab72f9039606d0642d08e6686b3942fdb7a81766cb7b30b3f3532dbea4" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.217306 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9mlb7" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.217422 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9mlb7" event={"ID":"2c5439fc-734f-4efa-838f-68900d9453ec","Type":"ContainerDied","Data":"d1590f5d08d5021f7c4653ff6d53411599ec2a40aa5f6db4e0019af55c730e6e"} Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.235236 4903 scope.go:117] "RemoveContainer" containerID="99882b84c258285cb6dbb70a2017e661d6b30f59ae5662a7db17d88fa193b212" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.238205 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-cp8sg" podStartSLOduration=2.238189369 podStartE2EDuration="2.238189369s" podCreationTimestamp="2025-12-02 23:03:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:03:04.227959474 +0000 UTC m=+322.936513757" watchObservedRunningTime="2025-12-02 23:03:04.238189369 +0000 UTC m=+322.946743652" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.238839 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pnzr7"] Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.242359 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pnzr7"] Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.270111 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-gqq5v" podStartSLOduration=2.270093487 podStartE2EDuration="2.270093487s" podCreationTimestamp="2025-12-02 23:03:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:03:04.268498496 
+0000 UTC m=+322.977052779" watchObservedRunningTime="2025-12-02 23:03:04.270093487 +0000 UTC m=+322.978647760" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.295048 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-67bbd76cc4-l265m"] Dec 02 23:03:04 crc kubenswrapper[4903]: E1202 23:03:04.295229 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c5439fc-734f-4efa-838f-68900d9453ec" containerName="controller-manager" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.295241 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c5439fc-734f-4efa-838f-68900d9453ec" containerName="controller-manager" Dec 02 23:03:04 crc kubenswrapper[4903]: E1202 23:03:04.295252 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="896cc150-3871-46e2-b1f5-c31c25c54014" containerName="route-controller-manager" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.295258 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="896cc150-3871-46e2-b1f5-c31c25c54014" containerName="route-controller-manager" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.298103 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c5439fc-734f-4efa-838f-68900d9453ec" containerName="controller-manager" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.298136 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="896cc150-3871-46e2-b1f5-c31c25c54014" containerName="route-controller-manager" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.298471 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67bbd76cc4-l265m" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.302629 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.302799 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.302900 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.305269 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.307446 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.308592 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.308948 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.321954 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d7874b474-b7c2j"] Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.325529 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d7874b474-b7c2j" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.338956 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.339751 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.339954 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.340141 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.340350 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.341013 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.341325 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67bbd76cc4-l265m"] Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.352188 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d7874b474-b7c2j"] Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.367601 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9mlb7"] Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.370373 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/689d819e-a4c6-48d1-8e9c-efdf9d74504e-serving-cert\") pod \"controller-manager-67bbd76cc4-l265m\" (UID: \"689d819e-a4c6-48d1-8e9c-efdf9d74504e\") " pod="openshift-controller-manager/controller-manager-67bbd76cc4-l265m" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.370502 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/689d819e-a4c6-48d1-8e9c-efdf9d74504e-config\") pod \"controller-manager-67bbd76cc4-l265m\" (UID: \"689d819e-a4c6-48d1-8e9c-efdf9d74504e\") " pod="openshift-controller-manager/controller-manager-67bbd76cc4-l265m" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.370583 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/689d819e-a4c6-48d1-8e9c-efdf9d74504e-proxy-ca-bundles\") pod \"controller-manager-67bbd76cc4-l265m\" (UID: \"689d819e-a4c6-48d1-8e9c-efdf9d74504e\") " pod="openshift-controller-manager/controller-manager-67bbd76cc4-l265m" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.370673 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkszl\" (UniqueName: \"kubernetes.io/projected/689d819e-a4c6-48d1-8e9c-efdf9d74504e-kube-api-access-hkszl\") pod \"controller-manager-67bbd76cc4-l265m\" (UID: \"689d819e-a4c6-48d1-8e9c-efdf9d74504e\") " 
pod="openshift-controller-manager/controller-manager-67bbd76cc4-l265m" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.370751 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/689d819e-a4c6-48d1-8e9c-efdf9d74504e-client-ca\") pod \"controller-manager-67bbd76cc4-l265m\" (UID: \"689d819e-a4c6-48d1-8e9c-efdf9d74504e\") " pod="openshift-controller-manager/controller-manager-67bbd76cc4-l265m" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.370918 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af1919c7-cb91-4c84-9bf7-587aa1363fdf-client-ca\") pod \"route-controller-manager-5d7874b474-b7c2j\" (UID: \"af1919c7-cb91-4c84-9bf7-587aa1363fdf\") " pod="openshift-route-controller-manager/route-controller-manager-5d7874b474-b7c2j" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.371009 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af1919c7-cb91-4c84-9bf7-587aa1363fdf-serving-cert\") pod \"route-controller-manager-5d7874b474-b7c2j\" (UID: \"af1919c7-cb91-4c84-9bf7-587aa1363fdf\") " pod="openshift-route-controller-manager/route-controller-manager-5d7874b474-b7c2j" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.371079 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6gvl\" (UniqueName: \"kubernetes.io/projected/af1919c7-cb91-4c84-9bf7-587aa1363fdf-kube-api-access-x6gvl\") pod \"route-controller-manager-5d7874b474-b7c2j\" (UID: \"af1919c7-cb91-4c84-9bf7-587aa1363fdf\") " pod="openshift-route-controller-manager/route-controller-manager-5d7874b474-b7c2j" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.371159 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af1919c7-cb91-4c84-9bf7-587aa1363fdf-config\") pod \"route-controller-manager-5d7874b474-b7c2j\" (UID: \"af1919c7-cb91-4c84-9bf7-587aa1363fdf\") " pod="openshift-route-controller-manager/route-controller-manager-5d7874b474-b7c2j" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.371317 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9mlb7"] Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.472439 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/689d819e-a4c6-48d1-8e9c-efdf9d74504e-serving-cert\") pod \"controller-manager-67bbd76cc4-l265m\" (UID: \"689d819e-a4c6-48d1-8e9c-efdf9d74504e\") " pod="openshift-controller-manager/controller-manager-67bbd76cc4-l265m" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.472484 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/689d819e-a4c6-48d1-8e9c-efdf9d74504e-config\") pod \"controller-manager-67bbd76cc4-l265m\" (UID: \"689d819e-a4c6-48d1-8e9c-efdf9d74504e\") " pod="openshift-controller-manager/controller-manager-67bbd76cc4-l265m" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.472508 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/689d819e-a4c6-48d1-8e9c-efdf9d74504e-proxy-ca-bundles\") pod \"controller-manager-67bbd76cc4-l265m\" (UID: \"689d819e-a4c6-48d1-8e9c-efdf9d74504e\") " pod="openshift-controller-manager/controller-manager-67bbd76cc4-l265m" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.472530 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkszl\" (UniqueName: \"kubernetes.io/projected/689d819e-a4c6-48d1-8e9c-efdf9d74504e-kube-api-access-hkszl\") pod \"controller-manager-67bbd76cc4-l265m\" (UID: \"689d819e-a4c6-48d1-8e9c-efdf9d74504e\") " pod="openshift-controller-manager/controller-manager-67bbd76cc4-l265m" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.472548 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/689d819e-a4c6-48d1-8e9c-efdf9d74504e-client-ca\") pod \"controller-manager-67bbd76cc4-l265m\" (UID: \"689d819e-a4c6-48d1-8e9c-efdf9d74504e\") " pod="openshift-controller-manager/controller-manager-67bbd76cc4-l265m" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.472572 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af1919c7-cb91-4c84-9bf7-587aa1363fdf-client-ca\") pod \"route-controller-manager-5d7874b474-b7c2j\" (UID: \"af1919c7-cb91-4c84-9bf7-587aa1363fdf\") " pod="openshift-route-controller-manager/route-controller-manager-5d7874b474-b7c2j" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.472602 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af1919c7-cb91-4c84-9bf7-587aa1363fdf-serving-cert\") pod \"route-controller-manager-5d7874b474-b7c2j\" (UID: \"af1919c7-cb91-4c84-9bf7-587aa1363fdf\") " pod="openshift-route-controller-manager/route-controller-manager-5d7874b474-b7c2j" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.472617 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6gvl\" (UniqueName: \"kubernetes.io/projected/af1919c7-cb91-4c84-9bf7-587aa1363fdf-kube-api-access-x6gvl\") pod \"route-controller-manager-5d7874b474-b7c2j\" (UID: \"af1919c7-cb91-4c84-9bf7-587aa1363fdf\") " pod="openshift-route-controller-manager/route-controller-manager-5d7874b474-b7c2j" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.472637 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af1919c7-cb91-4c84-9bf7-587aa1363fdf-config\") pod \"route-controller-manager-5d7874b474-b7c2j\" (UID: \"af1919c7-cb91-4c84-9bf7-587aa1363fdf\") " pod="openshift-route-controller-manager/route-controller-manager-5d7874b474-b7c2j" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.473700 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af1919c7-cb91-4c84-9bf7-587aa1363fdf-config\") pod \"route-controller-manager-5d7874b474-b7c2j\" (UID: \"af1919c7-cb91-4c84-9bf7-587aa1363fdf\") " pod="openshift-route-controller-manager/route-controller-manager-5d7874b474-b7c2j" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.474276 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/689d819e-a4c6-48d1-8e9c-efdf9d74504e-client-ca\") pod \"controller-manager-67bbd76cc4-l265m\" (UID: 
\"689d819e-a4c6-48d1-8e9c-efdf9d74504e\") " pod="openshift-controller-manager/controller-manager-67bbd76cc4-l265m" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.474899 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af1919c7-cb91-4c84-9bf7-587aa1363fdf-client-ca\") pod \"route-controller-manager-5d7874b474-b7c2j\" (UID: \"af1919c7-cb91-4c84-9bf7-587aa1363fdf\") " pod="openshift-route-controller-manager/route-controller-manager-5d7874b474-b7c2j" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.475381 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/689d819e-a4c6-48d1-8e9c-efdf9d74504e-config\") pod \"controller-manager-67bbd76cc4-l265m\" (UID: \"689d819e-a4c6-48d1-8e9c-efdf9d74504e\") " pod="openshift-controller-manager/controller-manager-67bbd76cc4-l265m" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.476328 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/689d819e-a4c6-48d1-8e9c-efdf9d74504e-proxy-ca-bundles\") pod \"controller-manager-67bbd76cc4-l265m\" (UID: \"689d819e-a4c6-48d1-8e9c-efdf9d74504e\") " pod="openshift-controller-manager/controller-manager-67bbd76cc4-l265m" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.478862 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/689d819e-a4c6-48d1-8e9c-efdf9d74504e-serving-cert\") pod \"controller-manager-67bbd76cc4-l265m\" (UID: \"689d819e-a4c6-48d1-8e9c-efdf9d74504e\") " pod="openshift-controller-manager/controller-manager-67bbd76cc4-l265m" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.488373 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkszl\" (UniqueName: \"kubernetes.io/projected/689d819e-a4c6-48d1-8e9c-efdf9d74504e-kube-api-access-hkszl\") pod \"controller-manager-67bbd76cc4-l265m\" (UID: \"689d819e-a4c6-48d1-8e9c-efdf9d74504e\") " pod="openshift-controller-manager/controller-manager-67bbd76cc4-l265m" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.490937 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6gvl\" (UniqueName: \"kubernetes.io/projected/af1919c7-cb91-4c84-9bf7-587aa1363fdf-kube-api-access-x6gvl\") pod \"route-controller-manager-5d7874b474-b7c2j\" (UID: \"af1919c7-cb91-4c84-9bf7-587aa1363fdf\") " pod="openshift-route-controller-manager/route-controller-manager-5d7874b474-b7c2j" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.492945 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af1919c7-cb91-4c84-9bf7-587aa1363fdf-serving-cert\") pod \"route-controller-manager-5d7874b474-b7c2j\" (UID: \"af1919c7-cb91-4c84-9bf7-587aa1363fdf\") " pod="openshift-route-controller-manager/route-controller-manager-5d7874b474-b7c2j" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.623329 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67bbd76cc4-l265m" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.660245 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d7874b474-b7c2j" Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.900367 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67bbd76cc4-l265m"] Dec 02 23:03:04 crc kubenswrapper[4903]: I1202 23:03:04.935624 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d7874b474-b7c2j"] Dec 02 23:03:04 crc kubenswrapper[4903]: W1202 23:03:04.952219 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf1919c7_cb91_4c84_9bf7_587aa1363fdf.slice/crio-03afe8fcf7efea9ce4e886a1681f116dab81e06a89df7a1221edc31cbecbcf2d WatchSource:0}: Error finding container 03afe8fcf7efea9ce4e886a1681f116dab81e06a89df7a1221edc31cbecbcf2d: Status 404 returned error can't find the container with id 03afe8fcf7efea9ce4e886a1681f116dab81e06a89df7a1221edc31cbecbcf2d Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.077194 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dtsgk"] Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.078344 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dtsgk" Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.080458 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.090293 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dtsgk"] Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.185536 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/716d2470-b915-4dc5-b728-8b5c047e4df6-utilities\") pod \"certified-operators-dtsgk\" (UID: \"716d2470-b915-4dc5-b728-8b5c047e4df6\") " pod="openshift-marketplace/certified-operators-dtsgk" Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.185623 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/716d2470-b915-4dc5-b728-8b5c047e4df6-catalog-content\") pod \"certified-operators-dtsgk\" (UID: \"716d2470-b915-4dc5-b728-8b5c047e4df6\") " pod="openshift-marketplace/certified-operators-dtsgk" Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.185673 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx8bh\" (UniqueName: \"kubernetes.io/projected/716d2470-b915-4dc5-b728-8b5c047e4df6-kube-api-access-xx8bh\") pod \"certified-operators-dtsgk\" (UID: \"716d2470-b915-4dc5-b728-8b5c047e4df6\") " pod="openshift-marketplace/certified-operators-dtsgk" Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.236840 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bxtgc" event={"ID":"68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa","Type":"ContainerStarted","Data":"107c31927e7173e12a5dec7834297a4fb8a34f7b134557c89f8b9db8a836114f"} Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.242057 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67bbd76cc4-l265m" 
event={"ID":"689d819e-a4c6-48d1-8e9c-efdf9d74504e","Type":"ContainerStarted","Data":"bfff3fdc90e942d36ca9043eff87e1a7fb2f33e7b0b57146fb16ec99dbadd4ac"} Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.242094 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67bbd76cc4-l265m" event={"ID":"689d819e-a4c6-48d1-8e9c-efdf9d74504e","Type":"ContainerStarted","Data":"c177fd1fc71c07bfaf446b46b0780ab3a174a0bfbf91aff8c0e0c5b837dbcb9c"} Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.242766 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-67bbd76cc4-l265m" Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.245794 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d7874b474-b7c2j" event={"ID":"af1919c7-cb91-4c84-9bf7-587aa1363fdf","Type":"ContainerStarted","Data":"ef5b80369ba88ca4b41570d6b6a56e680b1702124a6869a785b4b48c7207fae9"} Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.245829 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d7874b474-b7c2j" event={"ID":"af1919c7-cb91-4c84-9bf7-587aa1363fdf","Type":"ContainerStarted","Data":"03afe8fcf7efea9ce4e886a1681f116dab81e06a89df7a1221edc31cbecbcf2d"} Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.250468 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-67bbd76cc4-l265m" Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.286270 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/716d2470-b915-4dc5-b728-8b5c047e4df6-catalog-content\") pod \"certified-operators-dtsgk\" (UID: \"716d2470-b915-4dc5-b728-8b5c047e4df6\") " pod="openshift-marketplace/certified-operators-dtsgk" Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.286376 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx8bh\" (UniqueName: \"kubernetes.io/projected/716d2470-b915-4dc5-b728-8b5c047e4df6-kube-api-access-xx8bh\") pod \"certified-operators-dtsgk\" (UID: \"716d2470-b915-4dc5-b728-8b5c047e4df6\") " pod="openshift-marketplace/certified-operators-dtsgk" Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.286416 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/716d2470-b915-4dc5-b728-8b5c047e4df6-utilities\") pod \"certified-operators-dtsgk\" (UID: \"716d2470-b915-4dc5-b728-8b5c047e4df6\") " pod="openshift-marketplace/certified-operators-dtsgk" Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.287212 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cz9pq"] Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.287858 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/716d2470-b915-4dc5-b728-8b5c047e4df6-catalog-content\") pod \"certified-operators-dtsgk\" (UID: \"716d2470-b915-4dc5-b728-8b5c047e4df6\") " pod="openshift-marketplace/certified-operators-dtsgk" Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.289728 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/716d2470-b915-4dc5-b728-8b5c047e4df6-utilities\") pod \"certified-operators-dtsgk\" (UID: \"716d2470-b915-4dc5-b728-8b5c047e4df6\") " pod="openshift-marketplace/certified-operators-dtsgk" Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.298007 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cz9pq" Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.303078 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.320855 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cz9pq"] Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.320899 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-67bbd76cc4-l265m" podStartSLOduration=3.3208875669999998 podStartE2EDuration="3.320887567s" podCreationTimestamp="2025-12-02 23:03:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:03:05.305116647 +0000 UTC m=+324.013670930" watchObservedRunningTime="2025-12-02 23:03:05.320887567 +0000 UTC m=+324.029441850" Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.342047 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx8bh\" (UniqueName: \"kubernetes.io/projected/716d2470-b915-4dc5-b728-8b5c047e4df6-kube-api-access-xx8bh\") pod \"certified-operators-dtsgk\" (UID: \"716d2470-b915-4dc5-b728-8b5c047e4df6\") " pod="openshift-marketplace/certified-operators-dtsgk" Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.385540 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5d7874b474-b7c2j" podStartSLOduration=3.385524954 podStartE2EDuration="3.385524954s" podCreationTimestamp="2025-12-02 23:03:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:03:05.346967683 +0000 UTC m=+324.055521966" watchObservedRunningTime="2025-12-02 23:03:05.385524954 +0000 UTC m=+324.094079237" Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.404276 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dtsgk" Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.488069 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4232cb32-9d6c-400d-9deb-8fb5a18f20f8-catalog-content\") pod \"community-operators-cz9pq\" (UID: \"4232cb32-9d6c-400d-9deb-8fb5a18f20f8\") " pod="openshift-marketplace/community-operators-cz9pq" Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.488137 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4232cb32-9d6c-400d-9deb-8fb5a18f20f8-utilities\") pod \"community-operators-cz9pq\" (UID: \"4232cb32-9d6c-400d-9deb-8fb5a18f20f8\") " pod="openshift-marketplace/community-operators-cz9pq" Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.488175 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2rfn\" (UniqueName: \"kubernetes.io/projected/4232cb32-9d6c-400d-9deb-8fb5a18f20f8-kube-api-access-k2rfn\") pod \"community-operators-cz9pq\" (UID: \"4232cb32-9d6c-400d-9deb-8fb5a18f20f8\") " pod="openshift-marketplace/community-operators-cz9pq" Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.589341 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4232cb32-9d6c-400d-9deb-8fb5a18f20f8-catalog-content\") pod \"community-operators-cz9pq\" (UID: \"4232cb32-9d6c-400d-9deb-8fb5a18f20f8\") " pod="openshift-marketplace/community-operators-cz9pq" Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.589418 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4232cb32-9d6c-400d-9deb-8fb5a18f20f8-utilities\") pod \"community-operators-cz9pq\" (UID: \"4232cb32-9d6c-400d-9deb-8fb5a18f20f8\") " pod="openshift-marketplace/community-operators-cz9pq" Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.589466 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2rfn\" (UniqueName: \"kubernetes.io/projected/4232cb32-9d6c-400d-9deb-8fb5a18f20f8-kube-api-access-k2rfn\") pod \"community-operators-cz9pq\" (UID: \"4232cb32-9d6c-400d-9deb-8fb5a18f20f8\") " pod="openshift-marketplace/community-operators-cz9pq" Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.589850 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4232cb32-9d6c-400d-9deb-8fb5a18f20f8-utilities\") pod \"community-operators-cz9pq\" (UID: \"4232cb32-9d6c-400d-9deb-8fb5a18f20f8\") " pod="openshift-marketplace/community-operators-cz9pq" Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.589859 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4232cb32-9d6c-400d-9deb-8fb5a18f20f8-catalog-content\") pod \"community-operators-cz9pq\" (UID: \"4232cb32-9d6c-400d-9deb-8fb5a18f20f8\") " pod="openshift-marketplace/community-operators-cz9pq" Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.606955 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2rfn\" (UniqueName: \"kubernetes.io/projected/4232cb32-9d6c-400d-9deb-8fb5a18f20f8-kube-api-access-k2rfn\") pod 
\"community-operators-cz9pq\" (UID: \"4232cb32-9d6c-400d-9deb-8fb5a18f20f8\") " pod="openshift-marketplace/community-operators-cz9pq" Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.618058 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c5439fc-734f-4efa-838f-68900d9453ec" path="/var/lib/kubelet/pods/2c5439fc-734f-4efa-838f-68900d9453ec/volumes" Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.618725 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="896cc150-3871-46e2-b1f5-c31c25c54014" path="/var/lib/kubelet/pods/896cc150-3871-46e2-b1f5-c31c25c54014/volumes" Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.619240 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cz9pq" Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.885288 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cz9pq"] Dec 02 23:03:05 crc kubenswrapper[4903]: W1202 23:03:05.898429 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4232cb32_9d6c_400d_9deb_8fb5a18f20f8.slice/crio-88dbca7c17254db048e12a908e0b8e6ce81849a9a1aa783454dca493d44b55f6 WatchSource:0}: Error finding container 88dbca7c17254db048e12a908e0b8e6ce81849a9a1aa783454dca493d44b55f6: Status 404 returned error can't find the container with id 88dbca7c17254db048e12a908e0b8e6ce81849a9a1aa783454dca493d44b55f6 Dec 02 23:03:05 crc kubenswrapper[4903]: I1202 23:03:05.999506 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dtsgk"] Dec 02 23:03:06 crc kubenswrapper[4903]: W1202 23:03:06.005833 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod716d2470_b915_4dc5_b728_8b5c047e4df6.slice/crio-729cd5de07efc26963c3ae98e31f9ad8625075bd1ae3f661a587771e918fd5b2 WatchSource:0}: Error finding container 729cd5de07efc26963c3ae98e31f9ad8625075bd1ae3f661a587771e918fd5b2: Status 404 returned error can't find the container with id 729cd5de07efc26963c3ae98e31f9ad8625075bd1ae3f661a587771e918fd5b2 Dec 02 23:03:06 crc kubenswrapper[4903]: I1202 23:03:06.252737 4903 generic.go:334] "Generic (PLEG): container finished" podID="4232cb32-9d6c-400d-9deb-8fb5a18f20f8" containerID="7b9a0cb9ac7940551b472a4b3fb9bfc3c0e28a31c9b846dcd352dc2541ecb6d0" exitCode=0 Dec 02 23:03:06 crc kubenswrapper[4903]: I1202 23:03:06.252827 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cz9pq" event={"ID":"4232cb32-9d6c-400d-9deb-8fb5a18f20f8","Type":"ContainerDied","Data":"7b9a0cb9ac7940551b472a4b3fb9bfc3c0e28a31c9b846dcd352dc2541ecb6d0"} Dec 02 23:03:06 crc kubenswrapper[4903]: I1202 23:03:06.252862 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cz9pq" event={"ID":"4232cb32-9d6c-400d-9deb-8fb5a18f20f8","Type":"ContainerStarted","Data":"88dbca7c17254db048e12a908e0b8e6ce81849a9a1aa783454dca493d44b55f6"} Dec 02 23:03:06 crc kubenswrapper[4903]: I1202 23:03:06.256580 4903 generic.go:334] "Generic (PLEG): container finished" podID="716d2470-b915-4dc5-b728-8b5c047e4df6" containerID="62a35388d5e7f21afa531cafe58f2099faccee70c3d758ab119ea6b20199fc26" exitCode=0 Dec 02 23:03:06 crc kubenswrapper[4903]: I1202 23:03:06.256644 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-dtsgk" event={"ID":"716d2470-b915-4dc5-b728-8b5c047e4df6","Type":"ContainerDied","Data":"62a35388d5e7f21afa531cafe58f2099faccee70c3d758ab119ea6b20199fc26"} Dec 02 23:03:06 crc kubenswrapper[4903]: I1202 23:03:06.256688 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtsgk" event={"ID":"716d2470-b915-4dc5-b728-8b5c047e4df6","Type":"ContainerStarted","Data":"729cd5de07efc26963c3ae98e31f9ad8625075bd1ae3f661a587771e918fd5b2"} Dec 02 23:03:06 crc kubenswrapper[4903]: I1202 23:03:06.260299 4903 generic.go:334] "Generic (PLEG): container finished" podID="68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa" containerID="107c31927e7173e12a5dec7834297a4fb8a34f7b134557c89f8b9db8a836114f" exitCode=0 Dec 02 23:03:06 crc kubenswrapper[4903]: I1202 23:03:06.260540 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bxtgc" event={"ID":"68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa","Type":"ContainerDied","Data":"107c31927e7173e12a5dec7834297a4fb8a34f7b134557c89f8b9db8a836114f"} Dec 02 23:03:06 crc kubenswrapper[4903]: I1202 23:03:06.261372 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5d7874b474-b7c2j" Dec 02 23:03:06 crc kubenswrapper[4903]: I1202 23:03:06.266574 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5d7874b474-b7c2j" Dec 02 23:03:06 crc kubenswrapper[4903]: I1202 23:03:06.877304 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v87tp"] Dec 02 23:03:06 crc kubenswrapper[4903]: I1202 23:03:06.878660 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v87tp" Dec 02 23:03:06 crc kubenswrapper[4903]: I1202 23:03:06.881000 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 02 23:03:06 crc kubenswrapper[4903]: I1202 23:03:06.892570 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v87tp"] Dec 02 23:03:07 crc kubenswrapper[4903]: I1202 23:03:07.006867 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd665\" (UniqueName: \"kubernetes.io/projected/2708b032-25bc-4098-9b51-71a186f0ac30-kube-api-access-nd665\") pod \"redhat-operators-v87tp\" (UID: \"2708b032-25bc-4098-9b51-71a186f0ac30\") " pod="openshift-marketplace/redhat-operators-v87tp" Dec 02 23:03:07 crc kubenswrapper[4903]: I1202 23:03:07.006964 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2708b032-25bc-4098-9b51-71a186f0ac30-catalog-content\") pod \"redhat-operators-v87tp\" (UID: \"2708b032-25bc-4098-9b51-71a186f0ac30\") " pod="openshift-marketplace/redhat-operators-v87tp" Dec 02 23:03:07 crc kubenswrapper[4903]: I1202 23:03:07.007015 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2708b032-25bc-4098-9b51-71a186f0ac30-utilities\") pod \"redhat-operators-v87tp\" (UID: \"2708b032-25bc-4098-9b51-71a186f0ac30\") " pod="openshift-marketplace/redhat-operators-v87tp" Dec 02 23:03:07 crc kubenswrapper[4903]: I1202 23:03:07.108054 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2708b032-25bc-4098-9b51-71a186f0ac30-catalog-content\") pod \"redhat-operators-v87tp\" (UID: \"2708b032-25bc-4098-9b51-71a186f0ac30\") " pod="openshift-marketplace/redhat-operators-v87tp" Dec 02 23:03:07 crc kubenswrapper[4903]: I1202 23:03:07.108120 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2708b032-25bc-4098-9b51-71a186f0ac30-utilities\") pod \"redhat-operators-v87tp\" (UID: \"2708b032-25bc-4098-9b51-71a186f0ac30\") " pod="openshift-marketplace/redhat-operators-v87tp" Dec 02 23:03:07 crc kubenswrapper[4903]: I1202 23:03:07.108166 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd665\" (UniqueName: \"kubernetes.io/projected/2708b032-25bc-4098-9b51-71a186f0ac30-kube-api-access-nd665\") pod \"redhat-operators-v87tp\" (UID: \"2708b032-25bc-4098-9b51-71a186f0ac30\") " pod="openshift-marketplace/redhat-operators-v87tp" Dec 02 23:03:07 crc kubenswrapper[4903]: I1202 23:03:07.108636 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2708b032-25bc-4098-9b51-71a186f0ac30-utilities\") pod \"redhat-operators-v87tp\" (UID: \"2708b032-25bc-4098-9b51-71a186f0ac30\") " pod="openshift-marketplace/redhat-operators-v87tp" Dec 02 23:03:07 crc kubenswrapper[4903]: I1202 23:03:07.108635 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2708b032-25bc-4098-9b51-71a186f0ac30-catalog-content\") pod \"redhat-operators-v87tp\" (UID: \"2708b032-25bc-4098-9b51-71a186f0ac30\") " 
pod="openshift-marketplace/redhat-operators-v87tp" Dec 02 23:03:07 crc kubenswrapper[4903]: I1202 23:03:07.128839 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd665\" (UniqueName: \"kubernetes.io/projected/2708b032-25bc-4098-9b51-71a186f0ac30-kube-api-access-nd665\") pod \"redhat-operators-v87tp\" (UID: \"2708b032-25bc-4098-9b51-71a186f0ac30\") " pod="openshift-marketplace/redhat-operators-v87tp" Dec 02 23:03:07 crc kubenswrapper[4903]: I1202 23:03:07.237027 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v87tp" Dec 02 23:03:07 crc kubenswrapper[4903]: I1202 23:03:07.267725 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cz9pq" event={"ID":"4232cb32-9d6c-400d-9deb-8fb5a18f20f8","Type":"ContainerStarted","Data":"f56f4d050a7f89fc32fd31f22294afd404959e4008ffafc94e17bcfd6fd41ed0"} Dec 02 23:03:07 crc kubenswrapper[4903]: I1202 23:03:07.270009 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtsgk" event={"ID":"716d2470-b915-4dc5-b728-8b5c047e4df6","Type":"ContainerStarted","Data":"b7e16fa606a7718f8c6d25a964864aadef094ee38937122733d89244d93750a5"} Dec 02 23:03:07 crc kubenswrapper[4903]: I1202 23:03:07.272760 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bxtgc" event={"ID":"68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa","Type":"ContainerStarted","Data":"bbdc22d1f4a17ba5f182db5bdcf03e9b0d2f5055d115608ebc4e5793a04e5fe8"} Dec 02 23:03:07 crc kubenswrapper[4903]: I1202 23:03:07.330309 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bxtgc" podStartSLOduration=2.71714683 podStartE2EDuration="5.330289953s" podCreationTimestamp="2025-12-02 23:03:02 +0000 UTC" firstStartedPulling="2025-12-02 23:03:04.214851064 +0000 UTC m=+322.923405347" lastFinishedPulling="2025-12-02 23:03:06.827994197 +0000 UTC m=+325.536548470" observedRunningTime="2025-12-02 23:03:07.308563808 +0000 UTC m=+326.017118091" watchObservedRunningTime="2025-12-02 23:03:07.330289953 +0000 UTC m=+326.038844236" Dec 02 23:03:07 crc kubenswrapper[4903]: I1202 23:03:07.655995 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v87tp"] Dec 02 23:03:07 crc kubenswrapper[4903]: W1202 23:03:07.665036 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2708b032_25bc_4098_9b51_71a186f0ac30.slice/crio-7681c2c46469a6eeeb09e09ab94c5d7bbc46e043e8099293c7198c69808f13c8 WatchSource:0}: Error finding container 7681c2c46469a6eeeb09e09ab94c5d7bbc46e043e8099293c7198c69808f13c8: Status 404 returned error can't find the container with id 7681c2c46469a6eeeb09e09ab94c5d7bbc46e043e8099293c7198c69808f13c8 Dec 02 23:03:08 crc kubenswrapper[4903]: I1202 23:03:08.279364 4903 generic.go:334] "Generic (PLEG): container finished" podID="2708b032-25bc-4098-9b51-71a186f0ac30" containerID="3a82c23d761f70961bac0985e235345bb3a010f48258ca879d0e16ce04b2d095" exitCode=0 Dec 02 23:03:08 crc kubenswrapper[4903]: I1202 23:03:08.279449 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v87tp" event={"ID":"2708b032-25bc-4098-9b51-71a186f0ac30","Type":"ContainerDied","Data":"3a82c23d761f70961bac0985e235345bb3a010f48258ca879d0e16ce04b2d095"} Dec 02 23:03:08 crc 
kubenswrapper[4903]: I1202 23:03:08.279800 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v87tp" event={"ID":"2708b032-25bc-4098-9b51-71a186f0ac30","Type":"ContainerStarted","Data":"7681c2c46469a6eeeb09e09ab94c5d7bbc46e043e8099293c7198c69808f13c8"} Dec 02 23:03:08 crc kubenswrapper[4903]: I1202 23:03:08.283134 4903 generic.go:334] "Generic (PLEG): container finished" podID="4232cb32-9d6c-400d-9deb-8fb5a18f20f8" containerID="f56f4d050a7f89fc32fd31f22294afd404959e4008ffafc94e17bcfd6fd41ed0" exitCode=0 Dec 02 23:03:08 crc kubenswrapper[4903]: I1202 23:03:08.283472 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cz9pq" event={"ID":"4232cb32-9d6c-400d-9deb-8fb5a18f20f8","Type":"ContainerDied","Data":"f56f4d050a7f89fc32fd31f22294afd404959e4008ffafc94e17bcfd6fd41ed0"} Dec 02 23:03:08 crc kubenswrapper[4903]: I1202 23:03:08.285731 4903 generic.go:334] "Generic (PLEG): container finished" podID="716d2470-b915-4dc5-b728-8b5c047e4df6" containerID="b7e16fa606a7718f8c6d25a964864aadef094ee38937122733d89244d93750a5" exitCode=0 Dec 02 23:03:08 crc kubenswrapper[4903]: I1202 23:03:08.286720 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtsgk" event={"ID":"716d2470-b915-4dc5-b728-8b5c047e4df6","Type":"ContainerDied","Data":"b7e16fa606a7718f8c6d25a964864aadef094ee38937122733d89244d93750a5"} Dec 02 23:03:09 crc kubenswrapper[4903]: I1202 23:03:09.292925 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cz9pq" event={"ID":"4232cb32-9d6c-400d-9deb-8fb5a18f20f8","Type":"ContainerStarted","Data":"9322ddfb1bdfb11c4163dcaef138fff45a069fb1f184905ac182794cef772bf8"} Dec 02 23:03:09 crc kubenswrapper[4903]: I1202 23:03:09.295551 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtsgk" event={"ID":"716d2470-b915-4dc5-b728-8b5c047e4df6","Type":"ContainerStarted","Data":"e50688f9fa7fd684610e029190110c93b235ae515296ac3c3b191e68eeae1daa"} Dec 02 23:03:09 crc kubenswrapper[4903]: I1202 23:03:09.298660 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v87tp" event={"ID":"2708b032-25bc-4098-9b51-71a186f0ac30","Type":"ContainerStarted","Data":"46366c95a51d56d8756ed2d245a834925cf5d8af82c6458f214c4b82655a033a"} Dec 02 23:03:09 crc kubenswrapper[4903]: I1202 23:03:09.311242 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cz9pq" podStartSLOduration=1.809252326 podStartE2EDuration="4.311222238s" podCreationTimestamp="2025-12-02 23:03:05 +0000 UTC" firstStartedPulling="2025-12-02 23:03:06.254909016 +0000 UTC m=+324.963463299" lastFinishedPulling="2025-12-02 23:03:08.756878928 +0000 UTC m=+327.465433211" observedRunningTime="2025-12-02 23:03:09.308220337 +0000 UTC m=+328.016774630" watchObservedRunningTime="2025-12-02 23:03:09.311222238 +0000 UTC m=+328.019776521" Dec 02 23:03:09 crc kubenswrapper[4903]: I1202 23:03:09.328566 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dtsgk" podStartSLOduration=1.883203008 podStartE2EDuration="4.32854809s" podCreationTimestamp="2025-12-02 23:03:05 +0000 UTC" firstStartedPulling="2025-12-02 23:03:06.259235298 +0000 UTC m=+324.967789581" lastFinishedPulling="2025-12-02 23:03:08.70458038 +0000 UTC m=+327.413134663" observedRunningTime="2025-12-02 
23:03:09.326517806 +0000 UTC m=+328.035072119" watchObservedRunningTime="2025-12-02 23:03:09.32854809 +0000 UTC m=+328.037102373" Dec 02 23:03:10 crc kubenswrapper[4903]: I1202 23:03:10.307623 4903 generic.go:334] "Generic (PLEG): container finished" podID="2708b032-25bc-4098-9b51-71a186f0ac30" containerID="46366c95a51d56d8756ed2d245a834925cf5d8af82c6458f214c4b82655a033a" exitCode=0 Dec 02 23:03:10 crc kubenswrapper[4903]: I1202 23:03:10.307696 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v87tp" event={"ID":"2708b032-25bc-4098-9b51-71a186f0ac30","Type":"ContainerDied","Data":"46366c95a51d56d8756ed2d245a834925cf5d8af82c6458f214c4b82655a033a"} Dec 02 23:03:12 crc kubenswrapper[4903]: I1202 23:03:12.320277 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v87tp" event={"ID":"2708b032-25bc-4098-9b51-71a186f0ac30","Type":"ContainerStarted","Data":"5ad3ec8f46968446dfdc04e136023afbb3a2c55681904b0631ba3b949c3f3be4"} Dec 02 23:03:12 crc kubenswrapper[4903]: I1202 23:03:12.335417 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v87tp" podStartSLOduration=2.957356284 podStartE2EDuration="6.335400211s" podCreationTimestamp="2025-12-02 23:03:06 +0000 UTC" firstStartedPulling="2025-12-02 23:03:08.281439856 +0000 UTC m=+326.989994149" lastFinishedPulling="2025-12-02 23:03:11.659483793 +0000 UTC m=+330.368038076" observedRunningTime="2025-12-02 23:03:12.334420155 +0000 UTC m=+331.042974438" watchObservedRunningTime="2025-12-02 23:03:12.335400211 +0000 UTC m=+331.043954494" Dec 02 23:03:13 crc kubenswrapper[4903]: I1202 23:03:13.010914 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bxtgc" Dec 02 23:03:13 crc kubenswrapper[4903]: I1202 23:03:13.011319 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bxtgc" Dec 02 23:03:13 crc kubenswrapper[4903]: I1202 23:03:13.077411 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bxtgc" Dec 02 23:03:13 crc kubenswrapper[4903]: I1202 23:03:13.372857 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bxtgc" Dec 02 23:03:15 crc kubenswrapper[4903]: I1202 23:03:15.404517 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dtsgk" Dec 02 23:03:15 crc kubenswrapper[4903]: I1202 23:03:15.404789 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dtsgk" Dec 02 23:03:15 crc kubenswrapper[4903]: I1202 23:03:15.448960 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dtsgk" Dec 02 23:03:15 crc kubenswrapper[4903]: I1202 23:03:15.622287 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cz9pq" Dec 02 23:03:15 crc kubenswrapper[4903]: I1202 23:03:15.622336 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cz9pq" Dec 02 23:03:15 crc kubenswrapper[4903]: I1202 23:03:15.659774 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cz9pq" Dec 
02 23:03:16 crc kubenswrapper[4903]: I1202 23:03:16.386080 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cz9pq" Dec 02 23:03:16 crc kubenswrapper[4903]: I1202 23:03:16.393237 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dtsgk" Dec 02 23:03:17 crc kubenswrapper[4903]: I1202 23:03:17.237409 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v87tp" Dec 02 23:03:17 crc kubenswrapper[4903]: I1202 23:03:17.237507 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v87tp" Dec 02 23:03:18 crc kubenswrapper[4903]: I1202 23:03:18.280725 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d7874b474-b7c2j"] Dec 02 23:03:18 crc kubenswrapper[4903]: I1202 23:03:18.281042 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5d7874b474-b7c2j" podUID="af1919c7-cb91-4c84-9bf7-587aa1363fdf" containerName="route-controller-manager" containerID="cri-o://ef5b80369ba88ca4b41570d6b6a56e680b1702124a6869a785b4b48c7207fae9" gracePeriod=30 Dec 02 23:03:18 crc kubenswrapper[4903]: I1202 23:03:18.295985 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v87tp" podUID="2708b032-25bc-4098-9b51-71a186f0ac30" containerName="registry-server" probeResult="failure" output=< Dec 02 23:03:18 crc kubenswrapper[4903]: timeout: failed to connect service ":50051" within 1s Dec 02 23:03:18 crc kubenswrapper[4903]: > Dec 02 23:03:19 crc kubenswrapper[4903]: I1202 23:03:19.241831 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d7874b474-b7c2j" Dec 02 23:03:19 crc kubenswrapper[4903]: I1202 23:03:19.361635 4903 generic.go:334] "Generic (PLEG): container finished" podID="af1919c7-cb91-4c84-9bf7-587aa1363fdf" containerID="ef5b80369ba88ca4b41570d6b6a56e680b1702124a6869a785b4b48c7207fae9" exitCode=0 Dec 02 23:03:19 crc kubenswrapper[4903]: I1202 23:03:19.361703 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d7874b474-b7c2j" event={"ID":"af1919c7-cb91-4c84-9bf7-587aa1363fdf","Type":"ContainerDied","Data":"ef5b80369ba88ca4b41570d6b6a56e680b1702124a6869a785b4b48c7207fae9"} Dec 02 23:03:19 crc kubenswrapper[4903]: I1202 23:03:19.361733 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d7874b474-b7c2j" event={"ID":"af1919c7-cb91-4c84-9bf7-587aa1363fdf","Type":"ContainerDied","Data":"03afe8fcf7efea9ce4e886a1681f116dab81e06a89df7a1221edc31cbecbcf2d"} Dec 02 23:03:19 crc kubenswrapper[4903]: I1202 23:03:19.361754 4903 scope.go:117] "RemoveContainer" containerID="ef5b80369ba88ca4b41570d6b6a56e680b1702124a6869a785b4b48c7207fae9" Dec 02 23:03:19 crc kubenswrapper[4903]: I1202 23:03:19.361877 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d7874b474-b7c2j" Dec 02 23:03:19 crc kubenswrapper[4903]: I1202 23:03:19.376480 4903 scope.go:117] "RemoveContainer" containerID="ef5b80369ba88ca4b41570d6b6a56e680b1702124a6869a785b4b48c7207fae9" Dec 02 23:03:19 crc kubenswrapper[4903]: E1202 23:03:19.377062 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef5b80369ba88ca4b41570d6b6a56e680b1702124a6869a785b4b48c7207fae9\": container with ID starting with ef5b80369ba88ca4b41570d6b6a56e680b1702124a6869a785b4b48c7207fae9 not found: ID does not exist" containerID="ef5b80369ba88ca4b41570d6b6a56e680b1702124a6869a785b4b48c7207fae9" Dec 02 23:03:19 crc kubenswrapper[4903]: I1202 23:03:19.377112 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef5b80369ba88ca4b41570d6b6a56e680b1702124a6869a785b4b48c7207fae9"} err="failed to get container status \"ef5b80369ba88ca4b41570d6b6a56e680b1702124a6869a785b4b48c7207fae9\": rpc error: code = NotFound desc = could not find container \"ef5b80369ba88ca4b41570d6b6a56e680b1702124a6869a785b4b48c7207fae9\": container with ID starting with ef5b80369ba88ca4b41570d6b6a56e680b1702124a6869a785b4b48c7207fae9 not found: ID does not exist" Dec 02 23:03:19 crc kubenswrapper[4903]: I1202 23:03:19.393280 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6gvl\" (UniqueName: \"kubernetes.io/projected/af1919c7-cb91-4c84-9bf7-587aa1363fdf-kube-api-access-x6gvl\") pod \"af1919c7-cb91-4c84-9bf7-587aa1363fdf\" (UID: \"af1919c7-cb91-4c84-9bf7-587aa1363fdf\") " Dec 02 23:03:19 crc kubenswrapper[4903]: I1202 23:03:19.393333 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af1919c7-cb91-4c84-9bf7-587aa1363fdf-serving-cert\") pod \"af1919c7-cb91-4c84-9bf7-587aa1363fdf\" (UID: \"af1919c7-cb91-4c84-9bf7-587aa1363fdf\") " Dec 02 23:03:19 crc kubenswrapper[4903]: I1202 23:03:19.393380 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af1919c7-cb91-4c84-9bf7-587aa1363fdf-config\") pod \"af1919c7-cb91-4c84-9bf7-587aa1363fdf\" (UID: \"af1919c7-cb91-4c84-9bf7-587aa1363fdf\") " Dec 02 23:03:19 crc kubenswrapper[4903]: I1202 23:03:19.393404 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af1919c7-cb91-4c84-9bf7-587aa1363fdf-client-ca\") pod \"af1919c7-cb91-4c84-9bf7-587aa1363fdf\" (UID: \"af1919c7-cb91-4c84-9bf7-587aa1363fdf\") " Dec 02 23:03:19 crc kubenswrapper[4903]: I1202 23:03:19.394289 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af1919c7-cb91-4c84-9bf7-587aa1363fdf-config" (OuterVolumeSpecName: "config") pod "af1919c7-cb91-4c84-9bf7-587aa1363fdf" (UID: "af1919c7-cb91-4c84-9bf7-587aa1363fdf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:03:19 crc kubenswrapper[4903]: I1202 23:03:19.394323 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af1919c7-cb91-4c84-9bf7-587aa1363fdf-client-ca" (OuterVolumeSpecName: "client-ca") pod "af1919c7-cb91-4c84-9bf7-587aa1363fdf" (UID: "af1919c7-cb91-4c84-9bf7-587aa1363fdf"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:03:19 crc kubenswrapper[4903]: I1202 23:03:19.398733 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af1919c7-cb91-4c84-9bf7-587aa1363fdf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "af1919c7-cb91-4c84-9bf7-587aa1363fdf" (UID: "af1919c7-cb91-4c84-9bf7-587aa1363fdf"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:19 crc kubenswrapper[4903]: I1202 23:03:19.399481 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af1919c7-cb91-4c84-9bf7-587aa1363fdf-kube-api-access-x6gvl" (OuterVolumeSpecName: "kube-api-access-x6gvl") pod "af1919c7-cb91-4c84-9bf7-587aa1363fdf" (UID: "af1919c7-cb91-4c84-9bf7-587aa1363fdf"). InnerVolumeSpecName "kube-api-access-x6gvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:03:19 crc kubenswrapper[4903]: I1202 23:03:19.494890 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6gvl\" (UniqueName: \"kubernetes.io/projected/af1919c7-cb91-4c84-9bf7-587aa1363fdf-kube-api-access-x6gvl\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:19 crc kubenswrapper[4903]: I1202 23:03:19.495180 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af1919c7-cb91-4c84-9bf7-587aa1363fdf-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:19 crc kubenswrapper[4903]: I1202 23:03:19.495279 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af1919c7-cb91-4c84-9bf7-587aa1363fdf-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:19 crc kubenswrapper[4903]: I1202 23:03:19.495355 4903 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af1919c7-cb91-4c84-9bf7-587aa1363fdf-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:19 crc kubenswrapper[4903]: I1202 23:03:19.689155 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d7874b474-b7c2j"] Dec 02 23:03:19 crc kubenswrapper[4903]: I1202 23:03:19.693527 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d7874b474-b7c2j"] Dec 02 23:03:20 crc kubenswrapper[4903]: I1202 23:03:20.311903 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55fc8d7f85-6hbrc"] Dec 02 23:03:20 crc kubenswrapper[4903]: E1202 23:03:20.312121 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af1919c7-cb91-4c84-9bf7-587aa1363fdf" containerName="route-controller-manager" Dec 02 23:03:20 crc kubenswrapper[4903]: I1202 23:03:20.312136 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="af1919c7-cb91-4c84-9bf7-587aa1363fdf" containerName="route-controller-manager" Dec 02 23:03:20 crc kubenswrapper[4903]: I1202 23:03:20.312237 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="af1919c7-cb91-4c84-9bf7-587aa1363fdf" containerName="route-controller-manager" Dec 02 23:03:20 crc kubenswrapper[4903]: I1202 23:03:20.312580 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55fc8d7f85-6hbrc" Dec 02 23:03:20 crc kubenswrapper[4903]: I1202 23:03:20.313892 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 02 23:03:20 crc kubenswrapper[4903]: I1202 23:03:20.314371 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 02 23:03:20 crc kubenswrapper[4903]: I1202 23:03:20.315382 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 02 23:03:20 crc kubenswrapper[4903]: I1202 23:03:20.315631 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 02 23:03:20 crc kubenswrapper[4903]: I1202 23:03:20.315940 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 02 23:03:20 crc kubenswrapper[4903]: I1202 23:03:20.316374 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 02 23:03:20 crc kubenswrapper[4903]: I1202 23:03:20.333031 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55fc8d7f85-6hbrc"] Dec 02 23:03:20 crc kubenswrapper[4903]: I1202 23:03:20.407335 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e829d89d-d01d-4569-ba55-98d81ad83f6f-client-ca\") pod \"route-controller-manager-55fc8d7f85-6hbrc\" (UID: \"e829d89d-d01d-4569-ba55-98d81ad83f6f\") " pod="openshift-route-controller-manager/route-controller-manager-55fc8d7f85-6hbrc" Dec 02 23:03:20 crc kubenswrapper[4903]: I1202 23:03:20.407975 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5sfr\" (UniqueName: \"kubernetes.io/projected/e829d89d-d01d-4569-ba55-98d81ad83f6f-kube-api-access-q5sfr\") pod \"route-controller-manager-55fc8d7f85-6hbrc\" (UID: \"e829d89d-d01d-4569-ba55-98d81ad83f6f\") " pod="openshift-route-controller-manager/route-controller-manager-55fc8d7f85-6hbrc" Dec 02 23:03:20 crc kubenswrapper[4903]: I1202 23:03:20.408071 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e829d89d-d01d-4569-ba55-98d81ad83f6f-serving-cert\") pod \"route-controller-manager-55fc8d7f85-6hbrc\" (UID: \"e829d89d-d01d-4569-ba55-98d81ad83f6f\") " pod="openshift-route-controller-manager/route-controller-manager-55fc8d7f85-6hbrc" Dec 02 23:03:20 crc kubenswrapper[4903]: I1202 23:03:20.408160 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e829d89d-d01d-4569-ba55-98d81ad83f6f-config\") pod \"route-controller-manager-55fc8d7f85-6hbrc\" (UID: \"e829d89d-d01d-4569-ba55-98d81ad83f6f\") " pod="openshift-route-controller-manager/route-controller-manager-55fc8d7f85-6hbrc" Dec 02 23:03:20 crc kubenswrapper[4903]: I1202 23:03:20.509332 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e829d89d-d01d-4569-ba55-98d81ad83f6f-client-ca\") pod 
\"route-controller-manager-55fc8d7f85-6hbrc\" (UID: \"e829d89d-d01d-4569-ba55-98d81ad83f6f\") " pod="openshift-route-controller-manager/route-controller-manager-55fc8d7f85-6hbrc" Dec 02 23:03:20 crc kubenswrapper[4903]: I1202 23:03:20.509381 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5sfr\" (UniqueName: \"kubernetes.io/projected/e829d89d-d01d-4569-ba55-98d81ad83f6f-kube-api-access-q5sfr\") pod \"route-controller-manager-55fc8d7f85-6hbrc\" (UID: \"e829d89d-d01d-4569-ba55-98d81ad83f6f\") " pod="openshift-route-controller-manager/route-controller-manager-55fc8d7f85-6hbrc" Dec 02 23:03:20 crc kubenswrapper[4903]: I1202 23:03:20.509410 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e829d89d-d01d-4569-ba55-98d81ad83f6f-serving-cert\") pod \"route-controller-manager-55fc8d7f85-6hbrc\" (UID: \"e829d89d-d01d-4569-ba55-98d81ad83f6f\") " pod="openshift-route-controller-manager/route-controller-manager-55fc8d7f85-6hbrc" Dec 02 23:03:20 crc kubenswrapper[4903]: I1202 23:03:20.509434 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e829d89d-d01d-4569-ba55-98d81ad83f6f-config\") pod \"route-controller-manager-55fc8d7f85-6hbrc\" (UID: \"e829d89d-d01d-4569-ba55-98d81ad83f6f\") " pod="openshift-route-controller-manager/route-controller-manager-55fc8d7f85-6hbrc" Dec 02 23:03:20 crc kubenswrapper[4903]: I1202 23:03:20.510270 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e829d89d-d01d-4569-ba55-98d81ad83f6f-client-ca\") pod \"route-controller-manager-55fc8d7f85-6hbrc\" (UID: \"e829d89d-d01d-4569-ba55-98d81ad83f6f\") " pod="openshift-route-controller-manager/route-controller-manager-55fc8d7f85-6hbrc" Dec 02 23:03:20 crc kubenswrapper[4903]: I1202 23:03:20.510573 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e829d89d-d01d-4569-ba55-98d81ad83f6f-config\") pod \"route-controller-manager-55fc8d7f85-6hbrc\" (UID: \"e829d89d-d01d-4569-ba55-98d81ad83f6f\") " pod="openshift-route-controller-manager/route-controller-manager-55fc8d7f85-6hbrc" Dec 02 23:03:20 crc kubenswrapper[4903]: I1202 23:03:20.516485 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e829d89d-d01d-4569-ba55-98d81ad83f6f-serving-cert\") pod \"route-controller-manager-55fc8d7f85-6hbrc\" (UID: \"e829d89d-d01d-4569-ba55-98d81ad83f6f\") " pod="openshift-route-controller-manager/route-controller-manager-55fc8d7f85-6hbrc" Dec 02 23:03:20 crc kubenswrapper[4903]: I1202 23:03:20.558110 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5sfr\" (UniqueName: \"kubernetes.io/projected/e829d89d-d01d-4569-ba55-98d81ad83f6f-kube-api-access-q5sfr\") pod \"route-controller-manager-55fc8d7f85-6hbrc\" (UID: \"e829d89d-d01d-4569-ba55-98d81ad83f6f\") " pod="openshift-route-controller-manager/route-controller-manager-55fc8d7f85-6hbrc" Dec 02 23:03:20 crc kubenswrapper[4903]: I1202 23:03:20.632536 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55fc8d7f85-6hbrc" Dec 02 23:03:21 crc kubenswrapper[4903]: I1202 23:03:21.055762 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55fc8d7f85-6hbrc"] Dec 02 23:03:21 crc kubenswrapper[4903]: I1202 23:03:21.375539 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55fc8d7f85-6hbrc" event={"ID":"e829d89d-d01d-4569-ba55-98d81ad83f6f","Type":"ContainerStarted","Data":"ecd5d255b96d0988fad7ce8b3feaa6d66b1d30466d17dec6a854612698831193"} Dec 02 23:03:21 crc kubenswrapper[4903]: I1202 23:03:21.620429 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af1919c7-cb91-4c84-9bf7-587aa1363fdf" path="/var/lib/kubelet/pods/af1919c7-cb91-4c84-9bf7-587aa1363fdf/volumes" Dec 02 23:03:22 crc kubenswrapper[4903]: I1202 23:03:22.382786 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55fc8d7f85-6hbrc" event={"ID":"e829d89d-d01d-4569-ba55-98d81ad83f6f","Type":"ContainerStarted","Data":"23c5028f2faec93c1aea5ffa45dd56d3edef00e854b425129b3a12eac629897b"} Dec 02 23:03:22 crc kubenswrapper[4903]: I1202 23:03:22.383946 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-55fc8d7f85-6hbrc" Dec 02 23:03:22 crc kubenswrapper[4903]: I1202 23:03:22.401813 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-55fc8d7f85-6hbrc" podStartSLOduration=4.401795002 podStartE2EDuration="4.401795002s" podCreationTimestamp="2025-12-02 23:03:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:03:22.400105047 +0000 UTC m=+341.108659330" watchObservedRunningTime="2025-12-02 23:03:22.401795002 +0000 UTC m=+341.110349285" Dec 02 23:03:22 crc kubenswrapper[4903]: I1202 23:03:22.789203 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-55fc8d7f85-6hbrc" Dec 02 23:03:23 crc kubenswrapper[4903]: I1202 23:03:23.325829 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-gqq5v" Dec 02 23:03:23 crc kubenswrapper[4903]: I1202 23:03:23.391236 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7lfrz"] Dec 02 23:03:27 crc kubenswrapper[4903]: I1202 23:03:27.305583 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v87tp" Dec 02 23:03:27 crc kubenswrapper[4903]: I1202 23:03:27.376775 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v87tp" Dec 02 23:03:48 crc kubenswrapper[4903]: I1202 23:03:48.500411 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" podUID="4652215a-081a-4b64-aa40-1508e18e8a15" containerName="registry" containerID="cri-o://8294d481b8a199be3df286d75b01c16995fa1bbfd9c54a79c08fd2e11d39bbf2" gracePeriod=30 Dec 02 23:03:49 crc kubenswrapper[4903]: I1202 23:03:49.429463 4903 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:03:49 crc kubenswrapper[4903]: I1202 23:03:49.578857 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4652215a-081a-4b64-aa40-1508e18e8a15-installation-pull-secrets\") pod \"4652215a-081a-4b64-aa40-1508e18e8a15\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " Dec 02 23:03:49 crc kubenswrapper[4903]: I1202 23:03:49.579091 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4652215a-081a-4b64-aa40-1508e18e8a15-ca-trust-extracted\") pod \"4652215a-081a-4b64-aa40-1508e18e8a15\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " Dec 02 23:03:49 crc kubenswrapper[4903]: I1202 23:03:49.579152 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4652215a-081a-4b64-aa40-1508e18e8a15-registry-tls\") pod \"4652215a-081a-4b64-aa40-1508e18e8a15\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " Dec 02 23:03:49 crc kubenswrapper[4903]: I1202 23:03:49.579203 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4652215a-081a-4b64-aa40-1508e18e8a15-bound-sa-token\") pod \"4652215a-081a-4b64-aa40-1508e18e8a15\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " Dec 02 23:03:49 crc kubenswrapper[4903]: I1202 23:03:49.579286 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4652215a-081a-4b64-aa40-1508e18e8a15-registry-certificates\") pod \"4652215a-081a-4b64-aa40-1508e18e8a15\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " Dec 02 23:03:49 crc kubenswrapper[4903]: I1202 23:03:49.579408 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fv9f\" (UniqueName: \"kubernetes.io/projected/4652215a-081a-4b64-aa40-1508e18e8a15-kube-api-access-7fv9f\") pod \"4652215a-081a-4b64-aa40-1508e18e8a15\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " Dec 02 23:03:49 crc kubenswrapper[4903]: I1202 23:03:49.579788 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"4652215a-081a-4b64-aa40-1508e18e8a15\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " Dec 02 23:03:49 crc kubenswrapper[4903]: I1202 23:03:49.579855 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4652215a-081a-4b64-aa40-1508e18e8a15-trusted-ca\") pod \"4652215a-081a-4b64-aa40-1508e18e8a15\" (UID: \"4652215a-081a-4b64-aa40-1508e18e8a15\") " Dec 02 23:03:49 crc kubenswrapper[4903]: I1202 23:03:49.580466 4903 generic.go:334] "Generic (PLEG): container finished" podID="4652215a-081a-4b64-aa40-1508e18e8a15" containerID="8294d481b8a199be3df286d75b01c16995fa1bbfd9c54a79c08fd2e11d39bbf2" exitCode=0 Dec 02 23:03:49 crc kubenswrapper[4903]: I1202 23:03:49.580505 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" 
event={"ID":"4652215a-081a-4b64-aa40-1508e18e8a15","Type":"ContainerDied","Data":"8294d481b8a199be3df286d75b01c16995fa1bbfd9c54a79c08fd2e11d39bbf2"} Dec 02 23:03:49 crc kubenswrapper[4903]: I1202 23:03:49.580535 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" event={"ID":"4652215a-081a-4b64-aa40-1508e18e8a15","Type":"ContainerDied","Data":"2fc0487559c839ac21c835518b962a7ccf91a68ab48ddf7555dbd1fbf2ed2b38"} Dec 02 23:03:49 crc kubenswrapper[4903]: I1202 23:03:49.580551 4903 scope.go:117] "RemoveContainer" containerID="8294d481b8a199be3df286d75b01c16995fa1bbfd9c54a79c08fd2e11d39bbf2" Dec 02 23:03:49 crc kubenswrapper[4903]: I1202 23:03:49.580680 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" Dec 02 23:03:49 crc kubenswrapper[4903]: I1202 23:03:49.580686 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4652215a-081a-4b64-aa40-1508e18e8a15-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "4652215a-081a-4b64-aa40-1508e18e8a15" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:03:49 crc kubenswrapper[4903]: I1202 23:03:49.582799 4903 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4652215a-081a-4b64-aa40-1508e18e8a15-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:49 crc kubenswrapper[4903]: I1202 23:03:49.582895 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4652215a-081a-4b64-aa40-1508e18e8a15-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "4652215a-081a-4b64-aa40-1508e18e8a15" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:03:49 crc kubenswrapper[4903]: I1202 23:03:49.588608 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4652215a-081a-4b64-aa40-1508e18e8a15-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "4652215a-081a-4b64-aa40-1508e18e8a15" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:03:49 crc kubenswrapper[4903]: I1202 23:03:49.592148 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4652215a-081a-4b64-aa40-1508e18e8a15-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "4652215a-081a-4b64-aa40-1508e18e8a15" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:03:49 crc kubenswrapper[4903]: I1202 23:03:49.592687 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4652215a-081a-4b64-aa40-1508e18e8a15-kube-api-access-7fv9f" (OuterVolumeSpecName: "kube-api-access-7fv9f") pod "4652215a-081a-4b64-aa40-1508e18e8a15" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15"). InnerVolumeSpecName "kube-api-access-7fv9f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:03:49 crc kubenswrapper[4903]: I1202 23:03:49.593350 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4652215a-081a-4b64-aa40-1508e18e8a15-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "4652215a-081a-4b64-aa40-1508e18e8a15" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:03:49 crc kubenswrapper[4903]: I1202 23:03:49.599982 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "4652215a-081a-4b64-aa40-1508e18e8a15" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 02 23:03:49 crc kubenswrapper[4903]: I1202 23:03:49.606500 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4652215a-081a-4b64-aa40-1508e18e8a15-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "4652215a-081a-4b64-aa40-1508e18e8a15" (UID: "4652215a-081a-4b64-aa40-1508e18e8a15"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:03:49 crc kubenswrapper[4903]: I1202 23:03:49.660504 4903 scope.go:117] "RemoveContainer" containerID="8294d481b8a199be3df286d75b01c16995fa1bbfd9c54a79c08fd2e11d39bbf2" Dec 02 23:03:49 crc kubenswrapper[4903]: E1202 23:03:49.660986 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8294d481b8a199be3df286d75b01c16995fa1bbfd9c54a79c08fd2e11d39bbf2\": container with ID starting with 8294d481b8a199be3df286d75b01c16995fa1bbfd9c54a79c08fd2e11d39bbf2 not found: ID does not exist" containerID="8294d481b8a199be3df286d75b01c16995fa1bbfd9c54a79c08fd2e11d39bbf2" Dec 02 23:03:49 crc kubenswrapper[4903]: I1202 23:03:49.661026 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8294d481b8a199be3df286d75b01c16995fa1bbfd9c54a79c08fd2e11d39bbf2"} err="failed to get container status \"8294d481b8a199be3df286d75b01c16995fa1bbfd9c54a79c08fd2e11d39bbf2\": rpc error: code = NotFound desc = could not find container \"8294d481b8a199be3df286d75b01c16995fa1bbfd9c54a79c08fd2e11d39bbf2\": container with ID starting with 8294d481b8a199be3df286d75b01c16995fa1bbfd9c54a79c08fd2e11d39bbf2 not found: ID does not exist" Dec 02 23:03:49 crc kubenswrapper[4903]: I1202 23:03:49.683771 4903 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4652215a-081a-4b64-aa40-1508e18e8a15-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:49 crc kubenswrapper[4903]: I1202 23:03:49.683872 4903 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4652215a-081a-4b64-aa40-1508e18e8a15-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:49 crc kubenswrapper[4903]: I1202 23:03:49.683934 4903 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4652215a-081a-4b64-aa40-1508e18e8a15-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:49 crc kubenswrapper[4903]: I1202 23:03:49.683991 
4903 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4652215a-081a-4b64-aa40-1508e18e8a15-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:49 crc kubenswrapper[4903]: I1202 23:03:49.684044 4903 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4652215a-081a-4b64-aa40-1508e18e8a15-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:49 crc kubenswrapper[4903]: I1202 23:03:49.684104 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fv9f\" (UniqueName: \"kubernetes.io/projected/4652215a-081a-4b64-aa40-1508e18e8a15-kube-api-access-7fv9f\") on node \"crc\" DevicePath \"\"" Dec 02 23:03:49 crc kubenswrapper[4903]: I1202 23:03:49.914376 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7lfrz"] Dec 02 23:03:49 crc kubenswrapper[4903]: I1202 23:03:49.917291 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7lfrz"] Dec 02 23:03:51 crc kubenswrapper[4903]: I1202 23:03:51.624704 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4652215a-081a-4b64-aa40-1508e18e8a15" path="/var/lib/kubelet/pods/4652215a-081a-4b64-aa40-1508e18e8a15/volumes" Dec 02 23:03:53 crc kubenswrapper[4903]: I1202 23:03:53.070489 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:03:53 crc kubenswrapper[4903]: I1202 23:03:53.070886 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:03:54 crc kubenswrapper[4903]: I1202 23:03:54.227198 4903 patch_prober.go:28] interesting pod/image-registry-697d97f7c8-7lfrz container/registry namespace/openshift-image-registry: Readiness probe status=failure output="Get \"https://10.217.0.15:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 02 23:03:54 crc kubenswrapper[4903]: I1202 23:03:54.227324 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-697d97f7c8-7lfrz" podUID="4652215a-081a-4b64-aa40-1508e18e8a15" containerName="registry" probeResult="failure" output="Get \"https://10.217.0.15:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 02 23:04:23 crc kubenswrapper[4903]: I1202 23:04:23.069562 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:04:23 crc kubenswrapper[4903]: I1202 23:04:23.070199 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:04:53 crc kubenswrapper[4903]: I1202 23:04:53.069762 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:04:53 crc kubenswrapper[4903]: I1202 23:04:53.070391 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:04:53 crc kubenswrapper[4903]: I1202 23:04:53.070463 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" Dec 02 23:04:53 crc kubenswrapper[4903]: I1202 23:04:53.071296 4903 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a023dfa17e08be67b1e706e7db3c07733727a68ce47e866a8587e419596f2cac"} pod="openshift-machine-config-operator/machine-config-daemon-snl4q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 23:04:53 crc kubenswrapper[4903]: I1202 23:04:53.071386 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" containerID="cri-o://a023dfa17e08be67b1e706e7db3c07733727a68ce47e866a8587e419596f2cac" gracePeriod=600 Dec 02 23:04:54 crc kubenswrapper[4903]: I1202 23:04:54.035802 4903 generic.go:334] "Generic (PLEG): container finished" podID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerID="a023dfa17e08be67b1e706e7db3c07733727a68ce47e866a8587e419596f2cac" exitCode=0 Dec 02 23:04:54 crc kubenswrapper[4903]: I1202 23:04:54.035903 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerDied","Data":"a023dfa17e08be67b1e706e7db3c07733727a68ce47e866a8587e419596f2cac"} Dec 02 23:04:54 crc kubenswrapper[4903]: I1202 23:04:54.036398 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerStarted","Data":"8cabb628dd17c6e63215bea49dda3d3c1e8e0058056824e16595b5a143ecbd3e"} Dec 02 23:04:54 crc kubenswrapper[4903]: I1202 23:04:54.036450 4903 scope.go:117] "RemoveContainer" containerID="717477a2db303e51427b0b5cf0e508fffd6571733a1c2aeba3d6eb4d088ad757" Dec 02 23:06:41 crc kubenswrapper[4903]: I1202 23:06:41.916380 4903 scope.go:117] "RemoveContainer" containerID="7766d2c7408a8279a7ad97f994093dfec89b584f4494f31b02a3e6506cd24492" Dec 02 23:06:53 crc kubenswrapper[4903]: I1202 23:06:53.069868 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Dec 02 23:06:53 crc kubenswrapper[4903]: I1202 23:06:53.070681 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:07:23 crc kubenswrapper[4903]: I1202 23:07:23.070699 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:07:23 crc kubenswrapper[4903]: I1202 23:07:23.071629 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:07:41 crc kubenswrapper[4903]: I1202 23:07:41.963128 4903 scope.go:117] "RemoveContainer" containerID="364b068142605da84ac356f836e9be677ceac01cb878267101dcb31ab8628f79" Dec 02 23:07:41 crc kubenswrapper[4903]: I1202 23:07:41.996720 4903 scope.go:117] "RemoveContainer" containerID="5c6380e6097f544dae72a63f7fb447c5ff490ca5605fb4d7e62b20abfcb6fe31" Dec 02 23:07:53 crc kubenswrapper[4903]: I1202 23:07:53.069710 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:07:53 crc kubenswrapper[4903]: I1202 23:07:53.070282 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:07:53 crc kubenswrapper[4903]: I1202 23:07:53.070348 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" Dec 02 23:07:53 crc kubenswrapper[4903]: I1202 23:07:53.071818 4903 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8cabb628dd17c6e63215bea49dda3d3c1e8e0058056824e16595b5a143ecbd3e"} pod="openshift-machine-config-operator/machine-config-daemon-snl4q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 23:07:53 crc kubenswrapper[4903]: I1202 23:07:53.071884 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" containerID="cri-o://8cabb628dd17c6e63215bea49dda3d3c1e8e0058056824e16595b5a143ecbd3e" gracePeriod=600 Dec 02 23:07:53 crc kubenswrapper[4903]: I1202 23:07:53.471009 4903 generic.go:334] "Generic (PLEG): container finished" podID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" 
containerID="8cabb628dd17c6e63215bea49dda3d3c1e8e0058056824e16595b5a143ecbd3e" exitCode=0 Dec 02 23:07:53 crc kubenswrapper[4903]: I1202 23:07:53.471432 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerDied","Data":"8cabb628dd17c6e63215bea49dda3d3c1e8e0058056824e16595b5a143ecbd3e"} Dec 02 23:07:53 crc kubenswrapper[4903]: I1202 23:07:53.471479 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerStarted","Data":"0b28e2dda158a594dc69e178bcdaca348a0340d9ffea7ad9bb02c01c43d2b522"} Dec 02 23:07:53 crc kubenswrapper[4903]: I1202 23:07:53.471506 4903 scope.go:117] "RemoveContainer" containerID="a023dfa17e08be67b1e706e7db3c07733727a68ce47e866a8587e419596f2cac" Dec 02 23:08:19 crc kubenswrapper[4903]: I1202 23:08:19.469370 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-qk25x"] Dec 02 23:08:19 crc kubenswrapper[4903]: E1202 23:08:19.471014 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4652215a-081a-4b64-aa40-1508e18e8a15" containerName="registry" Dec 02 23:08:19 crc kubenswrapper[4903]: I1202 23:08:19.471081 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="4652215a-081a-4b64-aa40-1508e18e8a15" containerName="registry" Dec 02 23:08:19 crc kubenswrapper[4903]: I1202 23:08:19.471231 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="4652215a-081a-4b64-aa40-1508e18e8a15" containerName="registry" Dec 02 23:08:19 crc kubenswrapper[4903]: I1202 23:08:19.471625 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-qk25x" Dec 02 23:08:19 crc kubenswrapper[4903]: I1202 23:08:19.473285 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 02 23:08:19 crc kubenswrapper[4903]: I1202 23:08:19.474831 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 02 23:08:19 crc kubenswrapper[4903]: I1202 23:08:19.475008 4903 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-2qb29" Dec 02 23:08:19 crc kubenswrapper[4903]: I1202 23:08:19.480404 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-98zmz"] Dec 02 23:08:19 crc kubenswrapper[4903]: I1202 23:08:19.481170 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-98zmz" Dec 02 23:08:19 crc kubenswrapper[4903]: I1202 23:08:19.485662 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-qk25x"] Dec 02 23:08:19 crc kubenswrapper[4903]: I1202 23:08:19.488876 4903 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-4vqrb" Dec 02 23:08:19 crc kubenswrapper[4903]: I1202 23:08:19.515617 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-98zmz"] Dec 02 23:08:19 crc kubenswrapper[4903]: I1202 23:08:19.524529 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-tq5jc"] Dec 02 23:08:19 crc kubenswrapper[4903]: I1202 23:08:19.526342 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-tq5jc" Dec 02 23:08:19 crc kubenswrapper[4903]: I1202 23:08:19.532343 4903 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-lsrct" Dec 02 23:08:19 crc kubenswrapper[4903]: I1202 23:08:19.540102 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-tq5jc"] Dec 02 23:08:19 crc kubenswrapper[4903]: I1202 23:08:19.581791 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkk2f\" (UniqueName: \"kubernetes.io/projected/245517d6-a256-48d4-8140-bd54f1794279-kube-api-access-xkk2f\") pod \"cert-manager-5b446d88c5-98zmz\" (UID: \"245517d6-a256-48d4-8140-bd54f1794279\") " pod="cert-manager/cert-manager-5b446d88c5-98zmz" Dec 02 23:08:19 crc kubenswrapper[4903]: I1202 23:08:19.581849 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xj8z\" (UniqueName: \"kubernetes.io/projected/b4744872-fc92-4dc7-b64f-dbdc3c32c890-kube-api-access-9xj8z\") pod \"cert-manager-cainjector-7f985d654d-qk25x\" (UID: \"b4744872-fc92-4dc7-b64f-dbdc3c32c890\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-qk25x" Dec 02 23:08:19 crc kubenswrapper[4903]: I1202 23:08:19.683089 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xj8z\" (UniqueName: \"kubernetes.io/projected/b4744872-fc92-4dc7-b64f-dbdc3c32c890-kube-api-access-9xj8z\") pod \"cert-manager-cainjector-7f985d654d-qk25x\" (UID: \"b4744872-fc92-4dc7-b64f-dbdc3c32c890\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-qk25x" Dec 02 23:08:19 crc kubenswrapper[4903]: I1202 23:08:19.683195 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkk2f\" (UniqueName: \"kubernetes.io/projected/245517d6-a256-48d4-8140-bd54f1794279-kube-api-access-xkk2f\") pod \"cert-manager-5b446d88c5-98zmz\" (UID: \"245517d6-a256-48d4-8140-bd54f1794279\") " pod="cert-manager/cert-manager-5b446d88c5-98zmz" Dec 02 23:08:19 crc kubenswrapper[4903]: I1202 23:08:19.683231 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp9q4\" (UniqueName: \"kubernetes.io/projected/62990e39-3700-4b20-9668-d90e0074a402-kube-api-access-cp9q4\") pod \"cert-manager-webhook-5655c58dd6-tq5jc\" (UID: \"62990e39-3700-4b20-9668-d90e0074a402\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-tq5jc" Dec 02 23:08:19 crc kubenswrapper[4903]: I1202 23:08:19.704007 4903 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xj8z\" (UniqueName: \"kubernetes.io/projected/b4744872-fc92-4dc7-b64f-dbdc3c32c890-kube-api-access-9xj8z\") pod \"cert-manager-cainjector-7f985d654d-qk25x\" (UID: \"b4744872-fc92-4dc7-b64f-dbdc3c32c890\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-qk25x" Dec 02 23:08:19 crc kubenswrapper[4903]: I1202 23:08:19.704048 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkk2f\" (UniqueName: \"kubernetes.io/projected/245517d6-a256-48d4-8140-bd54f1794279-kube-api-access-xkk2f\") pod \"cert-manager-5b446d88c5-98zmz\" (UID: \"245517d6-a256-48d4-8140-bd54f1794279\") " pod="cert-manager/cert-manager-5b446d88c5-98zmz" Dec 02 23:08:19 crc kubenswrapper[4903]: I1202 23:08:19.784768 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp9q4\" (UniqueName: \"kubernetes.io/projected/62990e39-3700-4b20-9668-d90e0074a402-kube-api-access-cp9q4\") pod \"cert-manager-webhook-5655c58dd6-tq5jc\" (UID: \"62990e39-3700-4b20-9668-d90e0074a402\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-tq5jc" Dec 02 23:08:19 crc kubenswrapper[4903]: I1202 23:08:19.802003 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-qk25x" Dec 02 23:08:19 crc kubenswrapper[4903]: I1202 23:08:19.808483 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp9q4\" (UniqueName: \"kubernetes.io/projected/62990e39-3700-4b20-9668-d90e0074a402-kube-api-access-cp9q4\") pod \"cert-manager-webhook-5655c58dd6-tq5jc\" (UID: \"62990e39-3700-4b20-9668-d90e0074a402\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-tq5jc" Dec 02 23:08:19 crc kubenswrapper[4903]: I1202 23:08:19.810411 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-98zmz" Dec 02 23:08:19 crc kubenswrapper[4903]: I1202 23:08:19.844286 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-tq5jc" Dec 02 23:08:20 crc kubenswrapper[4903]: I1202 23:08:20.064025 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-qk25x"] Dec 02 23:08:20 crc kubenswrapper[4903]: I1202 23:08:20.077760 4903 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 23:08:20 crc kubenswrapper[4903]: I1202 23:08:20.326462 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-98zmz"] Dec 02 23:08:20 crc kubenswrapper[4903]: W1202 23:08:20.337262 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod245517d6_a256_48d4_8140_bd54f1794279.slice/crio-3da6ecf3040c7a274c4da1252e3e456327f6a98d43ad8534578b084bb581a5f9 WatchSource:0}: Error finding container 3da6ecf3040c7a274c4da1252e3e456327f6a98d43ad8534578b084bb581a5f9: Status 404 returned error can't find the container with id 3da6ecf3040c7a274c4da1252e3e456327f6a98d43ad8534578b084bb581a5f9 Dec 02 23:08:20 crc kubenswrapper[4903]: I1202 23:08:20.345354 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-tq5jc"] Dec 02 23:08:20 crc kubenswrapper[4903]: W1202 23:08:20.349250 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62990e39_3700_4b20_9668_d90e0074a402.slice/crio-f0824cfa499f7d8a28c68acc936969b324c0bc8493458859dc9480813615e59b WatchSource:0}: Error finding container f0824cfa499f7d8a28c68acc936969b324c0bc8493458859dc9480813615e59b: Status 404 returned error can't find the container with id f0824cfa499f7d8a28c68acc936969b324c0bc8493458859dc9480813615e59b Dec 02 23:08:20 crc kubenswrapper[4903]: I1202 23:08:20.647007 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-tq5jc" event={"ID":"62990e39-3700-4b20-9668-d90e0074a402","Type":"ContainerStarted","Data":"f0824cfa499f7d8a28c68acc936969b324c0bc8493458859dc9480813615e59b"} Dec 02 23:08:20 crc kubenswrapper[4903]: I1202 23:08:20.649245 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-98zmz" event={"ID":"245517d6-a256-48d4-8140-bd54f1794279","Type":"ContainerStarted","Data":"3da6ecf3040c7a274c4da1252e3e456327f6a98d43ad8534578b084bb581a5f9"} Dec 02 23:08:20 crc kubenswrapper[4903]: I1202 23:08:20.651946 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-qk25x" event={"ID":"b4744872-fc92-4dc7-b64f-dbdc3c32c890","Type":"ContainerStarted","Data":"292dca16b526799ca04673e073bd978a29543083a820f3ed12b510712d288cdd"} Dec 02 23:08:22 crc kubenswrapper[4903]: I1202 23:08:22.665349 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-qk25x" event={"ID":"b4744872-fc92-4dc7-b64f-dbdc3c32c890","Type":"ContainerStarted","Data":"7053a29fad3516ab78e112877497643a7abe95c5152c7fbedd97ae9415fddfe2"} Dec 02 23:08:22 crc kubenswrapper[4903]: I1202 23:08:22.687899 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-qk25x" podStartSLOduration=1.6747047350000002 podStartE2EDuration="3.68788411s" podCreationTimestamp="2025-12-02 23:08:19 +0000 UTC" firstStartedPulling="2025-12-02 23:08:20.077568691 +0000 UTC 
m=+638.786122974" lastFinishedPulling="2025-12-02 23:08:22.090748056 +0000 UTC m=+640.799302349" observedRunningTime="2025-12-02 23:08:22.687177585 +0000 UTC m=+641.395731878" watchObservedRunningTime="2025-12-02 23:08:22.68788411 +0000 UTC m=+641.396438393" Dec 02 23:08:24 crc kubenswrapper[4903]: I1202 23:08:24.689985 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-tq5jc" event={"ID":"62990e39-3700-4b20-9668-d90e0074a402","Type":"ContainerStarted","Data":"61aa17e223afa6b64c83ec68e775a974fe284ae414e00bf32019aa8b3545f479"} Dec 02 23:08:24 crc kubenswrapper[4903]: I1202 23:08:24.690240 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-tq5jc" Dec 02 23:08:24 crc kubenswrapper[4903]: I1202 23:08:24.691956 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-98zmz" event={"ID":"245517d6-a256-48d4-8140-bd54f1794279","Type":"ContainerStarted","Data":"3506ca66e652dd6ffacf89c1fe022b34326eade0befe50500759daa8a0a2a3ec"} Dec 02 23:08:24 crc kubenswrapper[4903]: I1202 23:08:24.718162 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-tq5jc" podStartSLOduration=2.419668224 podStartE2EDuration="5.718135353s" podCreationTimestamp="2025-12-02 23:08:19 +0000 UTC" firstStartedPulling="2025-12-02 23:08:20.353142362 +0000 UTC m=+639.061696685" lastFinishedPulling="2025-12-02 23:08:23.651609491 +0000 UTC m=+642.360163814" observedRunningTime="2025-12-02 23:08:24.712375872 +0000 UTC m=+643.420930195" watchObservedRunningTime="2025-12-02 23:08:24.718135353 +0000 UTC m=+643.426689676" Dec 02 23:08:29 crc kubenswrapper[4903]: I1202 23:08:29.799601 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-98zmz" podStartSLOduration=7.4157406439999995 podStartE2EDuration="10.799576129s" podCreationTimestamp="2025-12-02 23:08:19 +0000 UTC" firstStartedPulling="2025-12-02 23:08:20.342261455 +0000 UTC m=+639.050815738" lastFinishedPulling="2025-12-02 23:08:23.72609693 +0000 UTC m=+642.434651223" observedRunningTime="2025-12-02 23:08:24.735564629 +0000 UTC m=+643.444118952" watchObservedRunningTime="2025-12-02 23:08:29.799576129 +0000 UTC m=+648.508130422" Dec 02 23:08:29 crc kubenswrapper[4903]: I1202 23:08:29.803168 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pz9ff"] Dec 02 23:08:29 crc kubenswrapper[4903]: I1202 23:08:29.803812 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="ovn-controller" containerID="cri-o://ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f" gracePeriod=30 Dec 02 23:08:29 crc kubenswrapper[4903]: I1202 23:08:29.803892 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="nbdb" containerID="cri-o://27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70" gracePeriod=30 Dec 02 23:08:29 crc kubenswrapper[4903]: I1202 23:08:29.803965 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="sbdb" containerID="cri-o://e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f" gracePeriod=30
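
The three "Observed pod startup duration" entries above are internally consistent: podStartE2EDuration equals watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling), i.e. the kubelet's pod_startup_latency_tracker discounts pull time from the SLO figure. A minimal Python sketch (illustrative only, not part of the kubelet) that reproduces the cert-manager-webhook figures from the timestamps logged above:

    from datetime import datetime

    def parse(ts):
        # Entries above use "2025-12-02 23:08:20.353142362 +0000 UTC"; strptime's
        # %f takes at most 6 fractional digits, so trim the nanoseconds first.
        date, clock = ts.split()[:2]
        if "." in clock:
            whole, frac = clock.split(".")
            return datetime.strptime(f"{date} {whole}.{frac[:6]}", "%Y-%m-%d %H:%M:%S.%f")
        return datetime.strptime(f"{date} {clock}", "%Y-%m-%d %H:%M:%S")

    # Figures for cert-manager-webhook-5655c58dd6-tq5jc, copied from the entry above.
    created  = parse("2025-12-02 23:08:19 +0000 UTC")            # podCreationTimestamp
    pull_beg = parse("2025-12-02 23:08:20.353142362 +0000 UTC")  # firstStartedPulling
    pull_end = parse("2025-12-02 23:08:23.651609491 +0000 UTC")  # lastFinishedPulling
    running  = parse("2025-12-02 23:08:24.718135353 +0000 UTC")  # watchObservedRunningTime

    e2e = (running - created).total_seconds()          # 5.718135 -> podStartE2EDuration
    slo = e2e - (pull_end - pull_beg).total_seconds()  # 2.419668 -> podStartSLOduration
    print(f"E2E={e2e:.6f}s SLO={slo:.6f}s")

The same arithmetic reproduces the cainjector and cert-manager figures as well, to within the nanosecond-level jitter between the wall-clock timestamps and the monotonic m=+ readings.
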
Dec 02 23:08:29 crc kubenswrapper[4903]: I1202 23:08:29.803985 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="ovn-acl-logging" containerID="cri-o://fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b" gracePeriod=30 Dec 02 23:08:29 crc kubenswrapper[4903]: I1202 23:08:29.804057 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="northd" containerID="cri-o://6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402" gracePeriod=30 Dec 02 23:08:29 crc kubenswrapper[4903]: I1202 23:08:29.803970 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a" gracePeriod=30 Dec 02 23:08:29 crc kubenswrapper[4903]: I1202 23:08:29.804089 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="kube-rbac-proxy-node" containerID="cri-o://88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648" gracePeriod=30 Dec 02 23:08:29 crc kubenswrapper[4903]: I1202 23:08:29.845059 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="ovnkube-controller" containerID="cri-o://54d31014c7c8946fbf989d0cd38305c7fec4b3660d701d89c831eb6602fb4318" gracePeriod=30 Dec 02 23:08:29 crc kubenswrapper[4903]: I1202 23:08:29.848441 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-tq5jc" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.139162 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz9ff_99ab90b8-4bb9-418c-8b55-19c4c10edec7/ovnkube-controller/3.log" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.142804 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz9ff_99ab90b8-4bb9-418c-8b55-19c4c10edec7/ovn-acl-logging/0.log" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.143709 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz9ff_99ab90b8-4bb9-418c-8b55-19c4c10edec7/ovn-controller/0.log" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.144451 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.215146 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zqqrm"] Dec 02 23:08:30 crc kubenswrapper[4903]: E1202 23:08:30.215464 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="ovnkube-controller" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.215496 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="ovnkube-controller" Dec 02 23:08:30 crc kubenswrapper[4903]: E1202 23:08:30.215511 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="ovnkube-controller" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.215523 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="ovnkube-controller" Dec 02 23:08:30 crc kubenswrapper[4903]: E1202 23:08:30.215544 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="kubecfg-setup" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.215558 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="kubecfg-setup" Dec 02 23:08:30 crc kubenswrapper[4903]: E1202 23:08:30.215573 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="kube-rbac-proxy-node" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.215585 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="kube-rbac-proxy-node" Dec 02 23:08:30 crc kubenswrapper[4903]: E1202 23:08:30.215601 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="northd" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.215613 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="northd" Dec 02 23:08:30 crc kubenswrapper[4903]: E1202 23:08:30.215630 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="kube-rbac-proxy-ovn-metrics" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.215642 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="kube-rbac-proxy-ovn-metrics" Dec 02 23:08:30 crc kubenswrapper[4903]: E1202 23:08:30.215682 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="nbdb" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.215694 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="nbdb" Dec 02 23:08:30 crc kubenswrapper[4903]: E1202 23:08:30.215717 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="sbdb" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.215730 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="sbdb" Dec 02 23:08:30 crc kubenswrapper[4903]: E1202 23:08:30.215749 4903 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="ovn-controller" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.215761 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="ovn-controller" Dec 02 23:08:30 crc kubenswrapper[4903]: E1202 23:08:30.215778 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="ovn-acl-logging" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.215790 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="ovn-acl-logging" Dec 02 23:08:30 crc kubenswrapper[4903]: E1202 23:08:30.215810 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="ovnkube-controller" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.215822 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="ovnkube-controller" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.215985 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="ovnkube-controller" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.216002 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="ovnkube-controller" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.216018 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="ovnkube-controller" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.216033 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="sbdb" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.216366 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="ovnkube-controller" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.216392 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="kube-rbac-proxy-node" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.216415 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="nbdb" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.216434 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="ovn-controller" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.216450 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="ovn-acl-logging" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.216467 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="kube-rbac-proxy-ovn-metrics" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.216484 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="northd" Dec 02 23:08:30 crc kubenswrapper[4903]: E1202 23:08:30.216712 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="ovnkube-controller" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 
23:08:30.216729 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="ovnkube-controller" Dec 02 23:08:30 crc kubenswrapper[4903]: E1202 23:08:30.216742 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="ovnkube-controller" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.216755 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="ovnkube-controller" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.217030 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerName="ovnkube-controller" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.220071 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.237390 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/99ab90b8-4bb9-418c-8b55-19c4c10edec7-ovnkube-script-lib\") pod \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.237439 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-run-systemd\") pod \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.237472 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8lp5\" (UniqueName: \"kubernetes.io/projected/99ab90b8-4bb9-418c-8b55-19c4c10edec7-kube-api-access-l8lp5\") pod \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.237491 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-etc-openvswitch\") pod \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.237518 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.237555 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-cni-netd\") pod \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.237577 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-systemd-units\") pod \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.237613 
4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-run-openvswitch\") pod \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.237638 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/99ab90b8-4bb9-418c-8b55-19c4c10edec7-ovn-node-metrics-cert\") pod \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.237677 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-kubelet\") pod \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.237704 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-var-lib-openvswitch\") pod \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.237667 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "99ab90b8-4bb9-418c-8b55-19c4c10edec7" (UID: "99ab90b8-4bb9-418c-8b55-19c4c10edec7"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.237698 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "99ab90b8-4bb9-418c-8b55-19c4c10edec7" (UID: "99ab90b8-4bb9-418c-8b55-19c4c10edec7"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.237715 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "99ab90b8-4bb9-418c-8b55-19c4c10edec7" (UID: "99ab90b8-4bb9-418c-8b55-19c4c10edec7"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.237729 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "99ab90b8-4bb9-418c-8b55-19c4c10edec7" (UID: "99ab90b8-4bb9-418c-8b55-19c4c10edec7"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.237748 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "99ab90b8-4bb9-418c-8b55-19c4c10edec7" (UID: "99ab90b8-4bb9-418c-8b55-19c4c10edec7"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.237754 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-cni-bin\") pod \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.237816 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "99ab90b8-4bb9-418c-8b55-19c4c10edec7" (UID: "99ab90b8-4bb9-418c-8b55-19c4c10edec7"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.237850 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-slash\") pod \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.237825 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "99ab90b8-4bb9-418c-8b55-19c4c10edec7" (UID: "99ab90b8-4bb9-418c-8b55-19c4c10edec7"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.237874 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-run-ovn-kubernetes\") pod \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.237881 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-slash" (OuterVolumeSpecName: "host-slash") pod "99ab90b8-4bb9-418c-8b55-19c4c10edec7" (UID: "99ab90b8-4bb9-418c-8b55-19c4c10edec7"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.237903 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-node-log\") pod \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.237923 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-run-netns\") pod \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.237946 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-run-ovn\") pod \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.237922 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "99ab90b8-4bb9-418c-8b55-19c4c10edec7" (UID: "99ab90b8-4bb9-418c-8b55-19c4c10edec7"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.237986 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "99ab90b8-4bb9-418c-8b55-19c4c10edec7" (UID: "99ab90b8-4bb9-418c-8b55-19c4c10edec7"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.237950 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-node-log" (OuterVolumeSpecName: "node-log") pod "99ab90b8-4bb9-418c-8b55-19c4c10edec7" (UID: "99ab90b8-4bb9-418c-8b55-19c4c10edec7"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.237975 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/99ab90b8-4bb9-418c-8b55-19c4c10edec7-ovnkube-config\") pod \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.238045 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "99ab90b8-4bb9-418c-8b55-19c4c10edec7" (UID: "99ab90b8-4bb9-418c-8b55-19c4c10edec7"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.238061 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/99ab90b8-4bb9-418c-8b55-19c4c10edec7-env-overrides\") pod \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.238056 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "99ab90b8-4bb9-418c-8b55-19c4c10edec7" (UID: "99ab90b8-4bb9-418c-8b55-19c4c10edec7"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.238109 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-log-socket\") pod \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\" (UID: \"99ab90b8-4bb9-418c-8b55-19c4c10edec7\") " Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.238297 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-log-socket" (OuterVolumeSpecName: "log-socket") pod "99ab90b8-4bb9-418c-8b55-19c4c10edec7" (UID: "99ab90b8-4bb9-418c-8b55-19c4c10edec7"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.238459 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99ab90b8-4bb9-418c-8b55-19c4c10edec7-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "99ab90b8-4bb9-418c-8b55-19c4c10edec7" (UID: "99ab90b8-4bb9-418c-8b55-19c4c10edec7"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.238555 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99ab90b8-4bb9-418c-8b55-19c4c10edec7-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "99ab90b8-4bb9-418c-8b55-19c4c10edec7" (UID: "99ab90b8-4bb9-418c-8b55-19c4c10edec7"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.238718 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99ab90b8-4bb9-418c-8b55-19c4c10edec7-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "99ab90b8-4bb9-418c-8b55-19c4c10edec7" (UID: "99ab90b8-4bb9-418c-8b55-19c4c10edec7"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.238785 4903 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.238801 4903 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-slash\") on node \"crc\" DevicePath \"\"" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.238815 4903 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.238834 4903 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-node-log\") on node \"crc\" DevicePath \"\"" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.238845 4903 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.238858 4903 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.238868 4903 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/99ab90b8-4bb9-418c-8b55-19c4c10edec7-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.238878 4903 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-log-socket\") on node \"crc\" DevicePath \"\"" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.238888 4903 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/99ab90b8-4bb9-418c-8b55-19c4c10edec7-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.238899 4903 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.238911 4903 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.238923 4903 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.238934 4903 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-systemd-units\") on node \"crc\" 
DevicePath \"\"" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.238944 4903 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.238955 4903 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.238967 4903 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.245552 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99ab90b8-4bb9-418c-8b55-19c4c10edec7-kube-api-access-l8lp5" (OuterVolumeSpecName: "kube-api-access-l8lp5") pod "99ab90b8-4bb9-418c-8b55-19c4c10edec7" (UID: "99ab90b8-4bb9-418c-8b55-19c4c10edec7"). InnerVolumeSpecName "kube-api-access-l8lp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.246894 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99ab90b8-4bb9-418c-8b55-19c4c10edec7-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "99ab90b8-4bb9-418c-8b55-19c4c10edec7" (UID: "99ab90b8-4bb9-418c-8b55-19c4c10edec7"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.265644 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "99ab90b8-4bb9-418c-8b55-19c4c10edec7" (UID: "99ab90b8-4bb9-418c-8b55-19c4c10edec7"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
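
The run-systemd TearDown above completes the unmount pass for the deleted ovnkube-node-pz9ff pod (a few of its "Volume detached" confirmations still trail below), and the reconciler then walks the replacement pod ovnkube-node-zqqrm (UID 9bccef11-6d13-4bd4-a71b-1920c8745ee2) through the mount side: for each volume, VerifyControllerAttachedVolume, then MountVolume started, then MountVolume.SetUp succeeded, with the host-path mounts finishing within the same millisecond and the secret/projected volumes a few milliseconds later. A small Python sketch (a hypothetical helper, not part of any kubelet tooling; the phase strings and pod UID are copied from these entries) that summarizes how far each volume has progressed when fed a journal extract on stdin:

    import re
    import sys
    from collections import defaultdict

    # Phases a volume passes through in the kubelet reconciler, as logged above.
    PHASES = [
        "operationExecutor.VerifyControllerAttachedVolume started",
        "operationExecutor.MountVolume started",
        "MountVolume.SetUp succeeded",
    ]
    # klog escapes quotes inside quoted messages, hence the literal \" pairs.
    VOL_RE = re.compile(r'volume \\"([^"\\]+)\\"')

    def mount_progress(lines, pod_uid):
        seen = defaultdict(set)
        for line in lines:
            if pod_uid not in line:
                continue
            match = VOL_RE.search(line)
            if match:
                for phase in PHASES:
                    if phase in line:
                        seen[match.group(1)].add(phase)
        return seen

    if __name__ == "__main__":
        progress = mount_progress(sys.stdin, "9bccef11-6d13-4bd4-a71b-1920c8745ee2")
        for vol, phases in sorted(progress.items()):
            done = all(p in phases for p in PHASES)
            print(f"{vol}: {'mounted' if done else 'pending'} ({len(phases)}/{len(PHASES)} phases)")

Piped something like journalctl -u kubelet output, it would print one status line per volume of the new pod; by the end of this section all nineteen reach 3/3.
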
Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.339936 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9bccef11-6d13-4bd4-a71b-1920c8745ee2-ovnkube-config\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.340249 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-var-lib-openvswitch\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.340342 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-etc-openvswitch\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.340437 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-host-cni-bin\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.340630 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v64zm\" (UniqueName: \"kubernetes.io/projected/9bccef11-6d13-4bd4-a71b-1920c8745ee2-kube-api-access-v64zm\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.340840 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9bccef11-6d13-4bd4-a71b-1920c8745ee2-ovn-node-metrics-cert\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.341023 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-host-run-netns\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.341155 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-systemd-units\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.341340 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-run-ovn\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.341433 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-run-systemd\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.341522 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-run-openvswitch\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.341615 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-host-cni-netd\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.341711 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9bccef11-6d13-4bd4-a71b-1920c8745ee2-ovnkube-script-lib\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.341789 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-log-socket\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.341826 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-host-run-ovn-kubernetes\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.341911 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.342003 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9bccef11-6d13-4bd4-a71b-1920c8745ee2-env-overrides\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.342109 4903 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-node-log\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.342196 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-host-kubelet\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.342244 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-host-slash\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.342362 4903 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/99ab90b8-4bb9-418c-8b55-19c4c10edec7-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.342408 4903 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/99ab90b8-4bb9-418c-8b55-19c4c10edec7-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.342438 4903 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/99ab90b8-4bb9-418c-8b55-19c4c10edec7-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.342463 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8lp5\" (UniqueName: \"kubernetes.io/projected/99ab90b8-4bb9-418c-8b55-19c4c10edec7-kube-api-access-l8lp5\") on node \"crc\" DevicePath \"\"" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.444058 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v64zm\" (UniqueName: \"kubernetes.io/projected/9bccef11-6d13-4bd4-a71b-1920c8745ee2-kube-api-access-v64zm\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.444157 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9bccef11-6d13-4bd4-a71b-1920c8745ee2-ovn-node-metrics-cert\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.444231 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-host-run-netns\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.444263 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" 
(UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-systemd-units\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.444324 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-run-ovn\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.444464 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-run-systemd\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.444510 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-run-openvswitch\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.444558 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-host-cni-netd\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.444620 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9bccef11-6d13-4bd4-a71b-1920c8745ee2-ovnkube-script-lib\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.444688 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-log-socket\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.444724 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-host-run-ovn-kubernetes\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.444759 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-node-log\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.444791 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.444821 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9bccef11-6d13-4bd4-a71b-1920c8745ee2-env-overrides\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.444858 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-host-kubelet\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.444889 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-host-slash\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.444899 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-run-openvswitch\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.444924 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9bccef11-6d13-4bd4-a71b-1920c8745ee2-ovnkube-config\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.445050 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-var-lib-openvswitch\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.445176 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-etc-openvswitch\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.445234 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-host-cni-bin\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.445375 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-host-cni-bin\") pod 
\"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.445577 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-var-lib-openvswitch\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.445642 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-run-systemd\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.445699 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-host-run-netns\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.445752 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-node-log\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.445827 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-systemd-units\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.445881 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-log-socket\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.445841 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-host-cni-netd\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.445924 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-host-run-ovn-kubernetes\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.445758 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-etc-openvswitch\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 
23:08:30.446017 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-host-kubelet\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.446062 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-host-slash\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.446099 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9bccef11-6d13-4bd4-a71b-1920c8745ee2-ovnkube-config\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.446108 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.446435 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9bccef11-6d13-4bd4-a71b-1920c8745ee2-run-ovn\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.446487 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9bccef11-6d13-4bd4-a71b-1920c8745ee2-ovnkube-script-lib\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.446621 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9bccef11-6d13-4bd4-a71b-1920c8745ee2-env-overrides\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.450696 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9bccef11-6d13-4bd4-a71b-1920c8745ee2-ovn-node-metrics-cert\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.472985 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v64zm\" (UniqueName: \"kubernetes.io/projected/9bccef11-6d13-4bd4-a71b-1920c8745ee2-kube-api-access-v64zm\") pod \"ovnkube-node-zqqrm\" (UID: \"9bccef11-6d13-4bd4-a71b-1920c8745ee2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.550861 4903 util.go:30] "No sandbox for pod can be found. 
Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.743904 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz9ff_99ab90b8-4bb9-418c-8b55-19c4c10edec7/ovnkube-controller/3.log"
Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.767853 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz9ff_99ab90b8-4bb9-418c-8b55-19c4c10edec7/ovn-acl-logging/0.log"
Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.770561 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pz9ff_99ab90b8-4bb9-418c-8b55-19c4c10edec7/ovn-controller/0.log"
Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.771289 4903 generic.go:334] "Generic (PLEG): container finished" podID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerID="54d31014c7c8946fbf989d0cd38305c7fec4b3660d701d89c831eb6602fb4318" exitCode=0
Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.771320 4903 generic.go:334] "Generic (PLEG): container finished" podID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerID="e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f" exitCode=0
Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.771335 4903 generic.go:334] "Generic (PLEG): container finished" podID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerID="27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70" exitCode=0
Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.771352 4903 generic.go:334] "Generic (PLEG): container finished" podID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerID="6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402" exitCode=0
Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.771368 4903 generic.go:334] "Generic (PLEG): container finished" podID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerID="f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a" exitCode=0
Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.771381 4903 generic.go:334] "Generic (PLEG): container finished" podID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerID="88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648" exitCode=0
Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.771396 4903 generic.go:334] "Generic (PLEG): container finished" podID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerID="fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b" exitCode=143
Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.771410 4903 generic.go:334] "Generic (PLEG): container finished" podID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" containerID="ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f" exitCode=143
Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.771442 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff"
Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.771486 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" event={"ID":"99ab90b8-4bb9-418c-8b55-19c4c10edec7","Type":"ContainerDied","Data":"54d31014c7c8946fbf989d0cd38305c7fec4b3660d701d89c831eb6602fb4318"}
Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.771529 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" event={"ID":"99ab90b8-4bb9-418c-8b55-19c4c10edec7","Type":"ContainerDied","Data":"e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f"}
Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.771550 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" event={"ID":"99ab90b8-4bb9-418c-8b55-19c4c10edec7","Type":"ContainerDied","Data":"27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70"}
Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.771570 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" event={"ID":"99ab90b8-4bb9-418c-8b55-19c4c10edec7","Type":"ContainerDied","Data":"6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402"}
Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.771589 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" event={"ID":"99ab90b8-4bb9-418c-8b55-19c4c10edec7","Type":"ContainerDied","Data":"f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a"}
Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.771610 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" event={"ID":"99ab90b8-4bb9-418c-8b55-19c4c10edec7","Type":"ContainerDied","Data":"88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648"}
Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.771637 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858"}
Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.771695 4903 scope.go:117] "RemoveContainer" containerID="54d31014c7c8946fbf989d0cd38305c7fec4b3660d701d89c831eb6602fb4318"
Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.771704 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f"}
Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.771833 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70"}
Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.771849 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402"}
Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.771859 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a"}
Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.771868 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648"}
containerID={"Type":"cri-o","ID":"88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.771877 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.771886 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.771894 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.771909 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" event={"ID":"99ab90b8-4bb9-418c-8b55-19c4c10edec7","Type":"ContainerDied","Data":"fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.771925 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"54d31014c7c8946fbf989d0cd38305c7fec4b3660d701d89c831eb6602fb4318"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.771933 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.771942 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.771959 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.771968 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.771977 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.771985 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.771994 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.772003 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.772011 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.772024 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" event={"ID":"99ab90b8-4bb9-418c-8b55-19c4c10edec7","Type":"ContainerDied","Data":"ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.772039 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"54d31014c7c8946fbf989d0cd38305c7fec4b3660d701d89c831eb6602fb4318"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.772048 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.772055 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.772063 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.772073 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.772082 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.772091 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.772102 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.772111 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.772119 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.772132 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pz9ff" event={"ID":"99ab90b8-4bb9-418c-8b55-19c4c10edec7","Type":"ContainerDied","Data":"2e8025f3ad329533b26e7396411910b27aa262df5ca5ebfc9375e593401aa1d4"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.772146 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"54d31014c7c8946fbf989d0cd38305c7fec4b3660d701d89c831eb6602fb4318"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.772155 4903 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.772165 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.772174 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.772183 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.772193 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.772202 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.772210 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.772219 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.772227 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.777508 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s4nbg_a689512c-b6fd-4ffe-af54-dbb8f45ab9e5/kube-multus/2.log" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.778314 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s4nbg_a689512c-b6fd-4ffe-af54-dbb8f45ab9e5/kube-multus/1.log" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.778367 4903 generic.go:334] "Generic (PLEG): container finished" podID="a689512c-b6fd-4ffe-af54-dbb8f45ab9e5" containerID="96b14dfaa362061cf0dca6828c4c6b34d8f09e071f419d628af8ccd6eb23c267" exitCode=2 Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.778439 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s4nbg" event={"ID":"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5","Type":"ContainerDied","Data":"96b14dfaa362061cf0dca6828c4c6b34d8f09e071f419d628af8ccd6eb23c267"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.778503 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"940f76af114eeec075ebe1a320bc817b99c5b9f687325fa076bd2fffa0291c50"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.779141 4903 scope.go:117] "RemoveContainer" 
containerID="96b14dfaa362061cf0dca6828c4c6b34d8f09e071f419d628af8ccd6eb23c267" Dec 02 23:08:30 crc kubenswrapper[4903]: E1202 23:08:30.779517 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-s4nbg_openshift-multus(a689512c-b6fd-4ffe-af54-dbb8f45ab9e5)\"" pod="openshift-multus/multus-s4nbg" podUID="a689512c-b6fd-4ffe-af54-dbb8f45ab9e5" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.783539 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" event={"ID":"9bccef11-6d13-4bd4-a71b-1920c8745ee2","Type":"ContainerStarted","Data":"37ff45244c36a156375b12b01108506138ed80dac91af505d6d2871c9f16c8bf"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.783596 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" event={"ID":"9bccef11-6d13-4bd4-a71b-1920c8745ee2","Type":"ContainerStarted","Data":"8c9226d0febfdfb4c7147d2b23ce6ccc5aedbd46d6a40e85314ff345845b2fc6"} Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.824322 4903 scope.go:117] "RemoveContainer" containerID="4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.830249 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pz9ff"] Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.845249 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pz9ff"] Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.849796 4903 scope.go:117] "RemoveContainer" containerID="e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.883703 4903 scope.go:117] "RemoveContainer" containerID="27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.908418 4903 scope.go:117] "RemoveContainer" containerID="6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.953089 4903 scope.go:117] "RemoveContainer" containerID="f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.968677 4903 scope.go:117] "RemoveContainer" containerID="88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648" Dec 02 23:08:30 crc kubenswrapper[4903]: I1202 23:08:30.985728 4903 scope.go:117] "RemoveContainer" containerID="fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b" Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.003176 4903 scope.go:117] "RemoveContainer" containerID="ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f" Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.039042 4903 scope.go:117] "RemoveContainer" containerID="94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78" Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.058465 4903 scope.go:117] "RemoveContainer" containerID="54d31014c7c8946fbf989d0cd38305c7fec4b3660d701d89c831eb6602fb4318" Dec 02 23:08:31 crc kubenswrapper[4903]: E1202 23:08:31.058777 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54d31014c7c8946fbf989d0cd38305c7fec4b3660d701d89c831eb6602fb4318\": container with ID starting with 
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.058817 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54d31014c7c8946fbf989d0cd38305c7fec4b3660d701d89c831eb6602fb4318"} err="failed to get container status \"54d31014c7c8946fbf989d0cd38305c7fec4b3660d701d89c831eb6602fb4318\": rpc error: code = NotFound desc = could not find container \"54d31014c7c8946fbf989d0cd38305c7fec4b3660d701d89c831eb6602fb4318\": container with ID starting with 54d31014c7c8946fbf989d0cd38305c7fec4b3660d701d89c831eb6602fb4318 not found: ID does not exist"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.058843 4903 scope.go:117] "RemoveContainer" containerID="4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858"
Dec 02 23:08:31 crc kubenswrapper[4903]: E1202 23:08:31.059154 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858\": container with ID starting with 4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858 not found: ID does not exist" containerID="4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.059184 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858"} err="failed to get container status \"4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858\": rpc error: code = NotFound desc = could not find container \"4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858\": container with ID starting with 4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858 not found: ID does not exist"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.059203 4903 scope.go:117] "RemoveContainer" containerID="e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f"
Dec 02 23:08:31 crc kubenswrapper[4903]: E1202 23:08:31.059547 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f\": container with ID starting with e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f not found: ID does not exist" containerID="e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.059578 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f"} err="failed to get container status \"e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f\": rpc error: code = NotFound desc = could not find container \"e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f\": container with ID starting with e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f not found: ID does not exist"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.059597 4903 scope.go:117] "RemoveContainer" containerID="27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70"
Dec 02 23:08:31 crc kubenswrapper[4903]: E1202 23:08:31.059905 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70\": container with ID starting with 27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70 not found: ID does not exist" containerID="27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.059933 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70"} err="failed to get container status \"27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70\": rpc error: code = NotFound desc = could not find container \"27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70\": container with ID starting with 27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70 not found: ID does not exist"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.059950 4903 scope.go:117] "RemoveContainer" containerID="6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402"
Dec 02 23:08:31 crc kubenswrapper[4903]: E1202 23:08:31.060206 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402\": container with ID starting with 6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402 not found: ID does not exist" containerID="6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.060231 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402"} err="failed to get container status \"6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402\": rpc error: code = NotFound desc = could not find container \"6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402\": container with ID starting with 6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402 not found: ID does not exist"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.060247 4903 scope.go:117] "RemoveContainer" containerID="f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a"
Dec 02 23:08:31 crc kubenswrapper[4903]: E1202 23:08:31.060463 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a\": container with ID starting with f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a not found: ID does not exist" containerID="f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.060519 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a"} err="failed to get container status \"f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a\": rpc error: code = NotFound desc = could not find container \"f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a\": container with ID starting with f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a not found: ID does not exist"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.060537 4903 scope.go:117] "RemoveContainer" containerID="88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648"
containerID="88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648" Dec 02 23:08:31 crc kubenswrapper[4903]: E1202 23:08:31.060815 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648\": container with ID starting with 88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648 not found: ID does not exist" containerID="88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648" Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.060858 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648"} err="failed to get container status \"88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648\": rpc error: code = NotFound desc = could not find container \"88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648\": container with ID starting with 88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648 not found: ID does not exist" Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.060876 4903 scope.go:117] "RemoveContainer" containerID="fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b" Dec 02 23:08:31 crc kubenswrapper[4903]: E1202 23:08:31.061180 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b\": container with ID starting with fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b not found: ID does not exist" containerID="fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b" Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.061203 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b"} err="failed to get container status \"fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b\": rpc error: code = NotFound desc = could not find container \"fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b\": container with ID starting with fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b not found: ID does not exist" Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.061223 4903 scope.go:117] "RemoveContainer" containerID="ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f" Dec 02 23:08:31 crc kubenswrapper[4903]: E1202 23:08:31.061484 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f\": container with ID starting with ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f not found: ID does not exist" containerID="ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f" Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.061515 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f"} err="failed to get container status \"ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f\": rpc error: code = NotFound desc = could not find container \"ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f\": container with ID starting with 
ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f not found: ID does not exist" Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.061565 4903 scope.go:117] "RemoveContainer" containerID="94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78" Dec 02 23:08:31 crc kubenswrapper[4903]: E1202 23:08:31.061856 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\": container with ID starting with 94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78 not found: ID does not exist" containerID="94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78" Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.061903 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78"} err="failed to get container status \"94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\": rpc error: code = NotFound desc = could not find container \"94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\": container with ID starting with 94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78 not found: ID does not exist" Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.061934 4903 scope.go:117] "RemoveContainer" containerID="54d31014c7c8946fbf989d0cd38305c7fec4b3660d701d89c831eb6602fb4318" Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.062253 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54d31014c7c8946fbf989d0cd38305c7fec4b3660d701d89c831eb6602fb4318"} err="failed to get container status \"54d31014c7c8946fbf989d0cd38305c7fec4b3660d701d89c831eb6602fb4318\": rpc error: code = NotFound desc = could not find container \"54d31014c7c8946fbf989d0cd38305c7fec4b3660d701d89c831eb6602fb4318\": container with ID starting with 54d31014c7c8946fbf989d0cd38305c7fec4b3660d701d89c831eb6602fb4318 not found: ID does not exist" Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.062276 4903 scope.go:117] "RemoveContainer" containerID="4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858" Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.062480 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858"} err="failed to get container status \"4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858\": rpc error: code = NotFound desc = could not find container \"4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858\": container with ID starting with 4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858 not found: ID does not exist" Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.062508 4903 scope.go:117] "RemoveContainer" containerID="e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f" Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.062749 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f"} err="failed to get container status \"e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f\": rpc error: code = NotFound desc = could not find container \"e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f\": container with ID starting with 
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.062830 4903 scope.go:117] "RemoveContainer" containerID="27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.063068 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70"} err="failed to get container status \"27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70\": rpc error: code = NotFound desc = could not find container \"27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70\": container with ID starting with 27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70 not found: ID does not exist"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.063107 4903 scope.go:117] "RemoveContainer" containerID="6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.063292 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402"} err="failed to get container status \"6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402\": rpc error: code = NotFound desc = could not find container \"6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402\": container with ID starting with 6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402 not found: ID does not exist"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.063314 4903 scope.go:117] "RemoveContainer" containerID="f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.063517 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a"} err="failed to get container status \"f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a\": rpc error: code = NotFound desc = could not find container \"f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a\": container with ID starting with f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a not found: ID does not exist"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.063541 4903 scope.go:117] "RemoveContainer" containerID="88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.063733 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648"} err="failed to get container status \"88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648\": rpc error: code = NotFound desc = could not find container \"88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648\": container with ID starting with 88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648 not found: ID does not exist"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.063750 4903 scope.go:117] "RemoveContainer" containerID="fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.063932 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b"} err="failed to get container status \"fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b\": rpc error: code = NotFound desc = could not find container \"fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b\": container with ID starting with fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b not found: ID does not exist"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.063949 4903 scope.go:117] "RemoveContainer" containerID="ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.064131 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f"} err="failed to get container status \"ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f\": rpc error: code = NotFound desc = could not find container \"ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f\": container with ID starting with ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f not found: ID does not exist"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.064148 4903 scope.go:117] "RemoveContainer" containerID="94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.065035 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78"} err="failed to get container status \"94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\": rpc error: code = NotFound desc = could not find container \"94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\": container with ID starting with 94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78 not found: ID does not exist"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.065083 4903 scope.go:117] "RemoveContainer" containerID="54d31014c7c8946fbf989d0cd38305c7fec4b3660d701d89c831eb6602fb4318"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.065267 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54d31014c7c8946fbf989d0cd38305c7fec4b3660d701d89c831eb6602fb4318"} err="failed to get container status \"54d31014c7c8946fbf989d0cd38305c7fec4b3660d701d89c831eb6602fb4318\": rpc error: code = NotFound desc = could not find container \"54d31014c7c8946fbf989d0cd38305c7fec4b3660d701d89c831eb6602fb4318\": container with ID starting with 54d31014c7c8946fbf989d0cd38305c7fec4b3660d701d89c831eb6602fb4318 not found: ID does not exist"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.065284 4903 scope.go:117] "RemoveContainer" containerID="4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.065488 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858"} err="failed to get container status \"4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858\": rpc error: code = NotFound desc = could not find container \"4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858\": container with ID starting with 4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858 not found: ID does not exist"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.065506 4903 scope.go:117] "RemoveContainer" containerID="e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.065737 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f"} err="failed to get container status \"e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f\": rpc error: code = NotFound desc = could not find container \"e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f\": container with ID starting with e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f not found: ID does not exist"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.065757 4903 scope.go:117] "RemoveContainer" containerID="27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.065935 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70"} err="failed to get container status \"27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70\": rpc error: code = NotFound desc = could not find container \"27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70\": container with ID starting with 27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70 not found: ID does not exist"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.065952 4903 scope.go:117] "RemoveContainer" containerID="6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.066131 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402"} err="failed to get container status \"6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402\": rpc error: code = NotFound desc = could not find container \"6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402\": container with ID starting with 6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402 not found: ID does not exist"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.066155 4903 scope.go:117] "RemoveContainer" containerID="f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.066318 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a"} err="failed to get container status \"f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a\": rpc error: code = NotFound desc = could not find container \"f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a\": container with ID starting with f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a not found: ID does not exist"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.066343 4903 scope.go:117] "RemoveContainer" containerID="88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.066544 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648"} err="failed to get container status \"88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648\": rpc error: code = NotFound desc = could not find container \"88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648\": container with ID starting with 88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648 not found: ID does not exist"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.066568 4903 scope.go:117] "RemoveContainer" containerID="fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.066801 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b"} err="failed to get container status \"fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b\": rpc error: code = NotFound desc = could not find container \"fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b\": container with ID starting with fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b not found: ID does not exist"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.066838 4903 scope.go:117] "RemoveContainer" containerID="ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.067074 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f"} err="failed to get container status \"ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f\": rpc error: code = NotFound desc = could not find container \"ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f\": container with ID starting with ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f not found: ID does not exist"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.067109 4903 scope.go:117] "RemoveContainer" containerID="94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.067305 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78"} err="failed to get container status \"94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\": rpc error: code = NotFound desc = could not find container \"94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\": container with ID starting with 94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78 not found: ID does not exist"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.067328 4903 scope.go:117] "RemoveContainer" containerID="54d31014c7c8946fbf989d0cd38305c7fec4b3660d701d89c831eb6602fb4318"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.067550 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54d31014c7c8946fbf989d0cd38305c7fec4b3660d701d89c831eb6602fb4318"} err="failed to get container status \"54d31014c7c8946fbf989d0cd38305c7fec4b3660d701d89c831eb6602fb4318\": rpc error: code = NotFound desc = could not find container \"54d31014c7c8946fbf989d0cd38305c7fec4b3660d701d89c831eb6602fb4318\": container with ID starting with 54d31014c7c8946fbf989d0cd38305c7fec4b3660d701d89c831eb6602fb4318 not found: ID does not exist"
Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.067586 4903 scope.go:117] "RemoveContainer" containerID="4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858"
containerID="4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858" Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.067832 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858"} err="failed to get container status \"4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858\": rpc error: code = NotFound desc = could not find container \"4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858\": container with ID starting with 4a714b8675c95dcc7244a032c26beb2ff6a53022b19005aee6b56bf770989858 not found: ID does not exist" Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.067871 4903 scope.go:117] "RemoveContainer" containerID="e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f" Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.068041 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f"} err="failed to get container status \"e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f\": rpc error: code = NotFound desc = could not find container \"e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f\": container with ID starting with e112a6981ff763ca707261b0f727f5c9c2f80529cdba3783d1ef290d2366530f not found: ID does not exist" Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.068064 4903 scope.go:117] "RemoveContainer" containerID="27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70" Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.068228 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70"} err="failed to get container status \"27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70\": rpc error: code = NotFound desc = could not find container \"27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70\": container with ID starting with 27a6abfd1d8c426294bd0615eea39d24ccfa35689b8b2e51359dc6be8b033d70 not found: ID does not exist" Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.068245 4903 scope.go:117] "RemoveContainer" containerID="6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402" Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.068434 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402"} err="failed to get container status \"6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402\": rpc error: code = NotFound desc = could not find container \"6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402\": container with ID starting with 6b350ca0ab56787f0555c59abd77b389685fc819c4192742ee4cbb068a585402 not found: ID does not exist" Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.068458 4903 scope.go:117] "RemoveContainer" containerID="f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a" Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.068735 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a"} err="failed to get container status \"f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a\": rpc error: code = NotFound desc = could not find 
container \"f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a\": container with ID starting with f8564338024c3c75b30f8044348d7e3e795afe4f3a0d0a095acaed4a86c8164a not found: ID does not exist" Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.068775 4903 scope.go:117] "RemoveContainer" containerID="88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648" Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.069000 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648"} err="failed to get container status \"88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648\": rpc error: code = NotFound desc = could not find container \"88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648\": container with ID starting with 88cfe63b3e48b7620c00f2363ac75dd30958ea7f8510cf4933f20bc8e307e648 not found: ID does not exist" Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.069036 4903 scope.go:117] "RemoveContainer" containerID="fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b" Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.069286 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b"} err="failed to get container status \"fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b\": rpc error: code = NotFound desc = could not find container \"fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b\": container with ID starting with fee6747aa83ff419fbd37f4818db203363a192084baeab3f46a4aea401998d1b not found: ID does not exist" Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.069330 4903 scope.go:117] "RemoveContainer" containerID="ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f" Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.069620 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f"} err="failed to get container status \"ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f\": rpc error: code = NotFound desc = could not find container \"ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f\": container with ID starting with ef67ca5903d6d35880613380dff6c040a5faca1a1d3123b1b67b299b89b0c37f not found: ID does not exist" Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.069676 4903 scope.go:117] "RemoveContainer" containerID="94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78" Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.070084 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78"} err="failed to get container status \"94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\": rpc error: code = NotFound desc = could not find container \"94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78\": container with ID starting with 94ee6ef6517269d3523f42446d5f74ad35e34254cade4f77356dd8f6a9f05e78 not found: ID does not exist" Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.070123 4903 scope.go:117] "RemoveContainer" containerID="54d31014c7c8946fbf989d0cd38305c7fec4b3660d701d89c831eb6602fb4318" Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.070318 4903 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54d31014c7c8946fbf989d0cd38305c7fec4b3660d701d89c831eb6602fb4318"} err="failed to get container status \"54d31014c7c8946fbf989d0cd38305c7fec4b3660d701d89c831eb6602fb4318\": rpc error: code = NotFound desc = could not find container \"54d31014c7c8946fbf989d0cd38305c7fec4b3660d701d89c831eb6602fb4318\": container with ID starting with 54d31014c7c8946fbf989d0cd38305c7fec4b3660d701d89c831eb6602fb4318 not found: ID does not exist" Dec 02 23:08:31 crc kubenswrapper[4903]: I1202 23:08:31.624086 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99ab90b8-4bb9-418c-8b55-19c4c10edec7" path="/var/lib/kubelet/pods/99ab90b8-4bb9-418c-8b55-19c4c10edec7/volumes" Dec 02 23:08:32 crc kubenswrapper[4903]: I1202 23:08:32.662056 4903 generic.go:334] "Generic (PLEG): container finished" podID="9bccef11-6d13-4bd4-a71b-1920c8745ee2" containerID="37ff45244c36a156375b12b01108506138ed80dac91af505d6d2871c9f16c8bf" exitCode=0 Dec 02 23:08:32 crc kubenswrapper[4903]: I1202 23:08:32.662122 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" event={"ID":"9bccef11-6d13-4bd4-a71b-1920c8745ee2","Type":"ContainerDied","Data":"37ff45244c36a156375b12b01108506138ed80dac91af505d6d2871c9f16c8bf"} Dec 02 23:08:32 crc kubenswrapper[4903]: I1202 23:08:32.662195 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" event={"ID":"9bccef11-6d13-4bd4-a71b-1920c8745ee2","Type":"ContainerStarted","Data":"617555b5609b2be03b054c86364526f686e06a8176287e59d78d9104b8e83c99"} Dec 02 23:08:32 crc kubenswrapper[4903]: I1202 23:08:32.662216 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" event={"ID":"9bccef11-6d13-4bd4-a71b-1920c8745ee2","Type":"ContainerStarted","Data":"034d552a6b34a124a0515cfdb6a0e2019ef8c7c9db2e6f42296d5923b69a1332"} Dec 02 23:08:32 crc kubenswrapper[4903]: I1202 23:08:32.662239 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" event={"ID":"9bccef11-6d13-4bd4-a71b-1920c8745ee2","Type":"ContainerStarted","Data":"7d21a7db1e0d4d668d633b4663a6411b648cdebe7e207f96e99655e006fc1d5a"} Dec 02 23:08:32 crc kubenswrapper[4903]: I1202 23:08:32.662256 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" event={"ID":"9bccef11-6d13-4bd4-a71b-1920c8745ee2","Type":"ContainerStarted","Data":"04eb6f24de5ad17fa08537260f3a552959bb6c8249f4764c8093e901141c3b70"} Dec 02 23:08:32 crc kubenswrapper[4903]: I1202 23:08:32.662285 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" event={"ID":"9bccef11-6d13-4bd4-a71b-1920c8745ee2","Type":"ContainerStarted","Data":"325d4fdcc2c8b89a8032f749f6eee565f47f60ebc0e59409c657e954c79caca8"} Dec 02 23:08:32 crc kubenswrapper[4903]: I1202 23:08:32.662302 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" event={"ID":"9bccef11-6d13-4bd4-a71b-1920c8745ee2","Type":"ContainerStarted","Data":"010e7e1911c8770c8c8a7e7cf0b19ff875984a9812818c432d72d112f10e2f4e"} Dec 02 23:08:34 crc kubenswrapper[4903]: I1202 23:08:34.681363 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" 
event={"ID":"9bccef11-6d13-4bd4-a71b-1920c8745ee2","Type":"ContainerStarted","Data":"6e4e49ff138398696c469ecc89638e6b3f922dc5866b59abfa99ae0d88274057"} Dec 02 23:08:36 crc kubenswrapper[4903]: I1202 23:08:36.699967 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" event={"ID":"9bccef11-6d13-4bd4-a71b-1920c8745ee2","Type":"ContainerStarted","Data":"33ebce2f7047e2e8ad4095abb2bbb8221370fa7e623590355f276613fdf50478"} Dec 02 23:08:36 crc kubenswrapper[4903]: I1202 23:08:36.700467 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:36 crc kubenswrapper[4903]: I1202 23:08:36.700497 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:36 crc kubenswrapper[4903]: I1202 23:08:36.700507 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:36 crc kubenswrapper[4903]: I1202 23:08:36.733160 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" podStartSLOduration=6.733144934 podStartE2EDuration="6.733144934s" podCreationTimestamp="2025-12-02 23:08:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:08:36.729718124 +0000 UTC m=+655.438272417" watchObservedRunningTime="2025-12-02 23:08:36.733144934 +0000 UTC m=+655.441699217" Dec 02 23:08:36 crc kubenswrapper[4903]: I1202 23:08:36.736952 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:36 crc kubenswrapper[4903]: I1202 23:08:36.741338 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:08:42 crc kubenswrapper[4903]: I1202 23:08:42.051740 4903 scope.go:117] "RemoveContainer" containerID="940f76af114eeec075ebe1a320bc817b99c5b9f687325fa076bd2fffa0291c50" Dec 02 23:08:42 crc kubenswrapper[4903]: I1202 23:08:42.612998 4903 scope.go:117] "RemoveContainer" containerID="96b14dfaa362061cf0dca6828c4c6b34d8f09e071f419d628af8ccd6eb23c267" Dec 02 23:08:42 crc kubenswrapper[4903]: E1202 23:08:42.613544 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-s4nbg_openshift-multus(a689512c-b6fd-4ffe-af54-dbb8f45ab9e5)\"" pod="openshift-multus/multus-s4nbg" podUID="a689512c-b6fd-4ffe-af54-dbb8f45ab9e5" Dec 02 23:08:42 crc kubenswrapper[4903]: I1202 23:08:42.738409 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s4nbg_a689512c-b6fd-4ffe-af54-dbb8f45ab9e5/kube-multus/2.log" Dec 02 23:08:55 crc kubenswrapper[4903]: I1202 23:08:55.612401 4903 scope.go:117] "RemoveContainer" containerID="96b14dfaa362061cf0dca6828c4c6b34d8f09e071f419d628af8ccd6eb23c267" Dec 02 23:08:56 crc kubenswrapper[4903]: I1202 23:08:56.838308 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s4nbg_a689512c-b6fd-4ffe-af54-dbb8f45ab9e5/kube-multus/2.log" Dec 02 23:08:56 crc kubenswrapper[4903]: I1202 23:08:56.838629 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s4nbg" 
event={"ID":"a689512c-b6fd-4ffe-af54-dbb8f45ab9e5","Type":"ContainerStarted","Data":"5459c7bd142e21b4d8e8013c64c3325e4513aebfc643a4e3c0a516f6fa2b2be7"} Dec 02 23:09:00 crc kubenswrapper[4903]: I1202 23:09:00.587518 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zqqrm" Dec 02 23:09:02 crc kubenswrapper[4903]: I1202 23:09:02.171646 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl"] Dec 02 23:09:02 crc kubenswrapper[4903]: I1202 23:09:02.174195 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl" Dec 02 23:09:02 crc kubenswrapper[4903]: I1202 23:09:02.177612 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 02 23:09:02 crc kubenswrapper[4903]: I1202 23:09:02.182569 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl"] Dec 02 23:09:02 crc kubenswrapper[4903]: I1202 23:09:02.277136 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bd27baf0-b0a0-4ffd-a85f-0557c98a4996-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl\" (UID: \"bd27baf0-b0a0-4ffd-a85f-0557c98a4996\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl" Dec 02 23:09:02 crc kubenswrapper[4903]: I1202 23:09:02.277182 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bd27baf0-b0a0-4ffd-a85f-0557c98a4996-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl\" (UID: \"bd27baf0-b0a0-4ffd-a85f-0557c98a4996\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl" Dec 02 23:09:02 crc kubenswrapper[4903]: I1202 23:09:02.277273 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5p8m\" (UniqueName: \"kubernetes.io/projected/bd27baf0-b0a0-4ffd-a85f-0557c98a4996-kube-api-access-t5p8m\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl\" (UID: \"bd27baf0-b0a0-4ffd-a85f-0557c98a4996\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl" Dec 02 23:09:02 crc kubenswrapper[4903]: I1202 23:09:02.378745 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5p8m\" (UniqueName: \"kubernetes.io/projected/bd27baf0-b0a0-4ffd-a85f-0557c98a4996-kube-api-access-t5p8m\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl\" (UID: \"bd27baf0-b0a0-4ffd-a85f-0557c98a4996\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl" Dec 02 23:09:02 crc kubenswrapper[4903]: I1202 23:09:02.378853 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bd27baf0-b0a0-4ffd-a85f-0557c98a4996-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl\" (UID: \"bd27baf0-b0a0-4ffd-a85f-0557c98a4996\") " 
pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl" Dec 02 23:09:02 crc kubenswrapper[4903]: I1202 23:09:02.378893 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bd27baf0-b0a0-4ffd-a85f-0557c98a4996-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl\" (UID: \"bd27baf0-b0a0-4ffd-a85f-0557c98a4996\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl" Dec 02 23:09:02 crc kubenswrapper[4903]: I1202 23:09:02.380232 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bd27baf0-b0a0-4ffd-a85f-0557c98a4996-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl\" (UID: \"bd27baf0-b0a0-4ffd-a85f-0557c98a4996\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl" Dec 02 23:09:02 crc kubenswrapper[4903]: I1202 23:09:02.380287 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bd27baf0-b0a0-4ffd-a85f-0557c98a4996-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl\" (UID: \"bd27baf0-b0a0-4ffd-a85f-0557c98a4996\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl" Dec 02 23:09:02 crc kubenswrapper[4903]: I1202 23:09:02.410573 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5p8m\" (UniqueName: \"kubernetes.io/projected/bd27baf0-b0a0-4ffd-a85f-0557c98a4996-kube-api-access-t5p8m\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl\" (UID: \"bd27baf0-b0a0-4ffd-a85f-0557c98a4996\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl" Dec 02 23:09:02 crc kubenswrapper[4903]: I1202 23:09:02.493241 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl" Dec 02 23:09:02 crc kubenswrapper[4903]: I1202 23:09:02.699668 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl"] Dec 02 23:09:02 crc kubenswrapper[4903]: I1202 23:09:02.870682 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl" event={"ID":"bd27baf0-b0a0-4ffd-a85f-0557c98a4996","Type":"ContainerStarted","Data":"10f337a99f055cc57aa0d656ca39b2756c927f1da91e1fef925c68510aae481f"} Dec 02 23:09:04 crc kubenswrapper[4903]: I1202 23:09:04.890080 4903 generic.go:334] "Generic (PLEG): container finished" podID="bd27baf0-b0a0-4ffd-a85f-0557c98a4996" containerID="9a0be7406cfc96695247033f246227a80f6887b0ede69ff3aa4248c9498a0cf8" exitCode=0 Dec 02 23:09:04 crc kubenswrapper[4903]: I1202 23:09:04.890163 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl" event={"ID":"bd27baf0-b0a0-4ffd-a85f-0557c98a4996","Type":"ContainerDied","Data":"9a0be7406cfc96695247033f246227a80f6887b0ede69ff3aa4248c9498a0cf8"} Dec 02 23:09:06 crc kubenswrapper[4903]: I1202 23:09:06.905862 4903 generic.go:334] "Generic (PLEG): container finished" podID="bd27baf0-b0a0-4ffd-a85f-0557c98a4996" containerID="fd2bd92b9b31d5310e049e75ae08c215462bf0bbda413cfe8a2b8742b902de4b" exitCode=0 Dec 02 23:09:06 crc kubenswrapper[4903]: I1202 23:09:06.905963 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl" event={"ID":"bd27baf0-b0a0-4ffd-a85f-0557c98a4996","Type":"ContainerDied","Data":"fd2bd92b9b31d5310e049e75ae08c215462bf0bbda413cfe8a2b8742b902de4b"} Dec 02 23:09:07 crc kubenswrapper[4903]: I1202 23:09:07.928447 4903 generic.go:334] "Generic (PLEG): container finished" podID="bd27baf0-b0a0-4ffd-a85f-0557c98a4996" containerID="10b1f8cb5fc0aecb20198ac51901ed5bfb919d3640ca63101515c822f5f5cd6d" exitCode=0 Dec 02 23:09:07 crc kubenswrapper[4903]: I1202 23:09:07.928528 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl" event={"ID":"bd27baf0-b0a0-4ffd-a85f-0557c98a4996","Type":"ContainerDied","Data":"10b1f8cb5fc0aecb20198ac51901ed5bfb919d3640ca63101515c822f5f5cd6d"} Dec 02 23:09:09 crc kubenswrapper[4903]: I1202 23:09:09.267815 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl" Dec 02 23:09:09 crc kubenswrapper[4903]: I1202 23:09:09.283869 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5p8m\" (UniqueName: \"kubernetes.io/projected/bd27baf0-b0a0-4ffd-a85f-0557c98a4996-kube-api-access-t5p8m\") pod \"bd27baf0-b0a0-4ffd-a85f-0557c98a4996\" (UID: \"bd27baf0-b0a0-4ffd-a85f-0557c98a4996\") " Dec 02 23:09:09 crc kubenswrapper[4903]: I1202 23:09:09.285852 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bd27baf0-b0a0-4ffd-a85f-0557c98a4996-bundle\") pod \"bd27baf0-b0a0-4ffd-a85f-0557c98a4996\" (UID: \"bd27baf0-b0a0-4ffd-a85f-0557c98a4996\") " Dec 02 23:09:09 crc kubenswrapper[4903]: I1202 23:09:09.285954 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bd27baf0-b0a0-4ffd-a85f-0557c98a4996-util\") pod \"bd27baf0-b0a0-4ffd-a85f-0557c98a4996\" (UID: \"bd27baf0-b0a0-4ffd-a85f-0557c98a4996\") " Dec 02 23:09:09 crc kubenswrapper[4903]: I1202 23:09:09.291215 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd27baf0-b0a0-4ffd-a85f-0557c98a4996-bundle" (OuterVolumeSpecName: "bundle") pod "bd27baf0-b0a0-4ffd-a85f-0557c98a4996" (UID: "bd27baf0-b0a0-4ffd-a85f-0557c98a4996"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:09:09 crc kubenswrapper[4903]: I1202 23:09:09.305628 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd27baf0-b0a0-4ffd-a85f-0557c98a4996-kube-api-access-t5p8m" (OuterVolumeSpecName: "kube-api-access-t5p8m") pod "bd27baf0-b0a0-4ffd-a85f-0557c98a4996" (UID: "bd27baf0-b0a0-4ffd-a85f-0557c98a4996"). InnerVolumeSpecName "kube-api-access-t5p8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:09:09 crc kubenswrapper[4903]: I1202 23:09:09.319893 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd27baf0-b0a0-4ffd-a85f-0557c98a4996-util" (OuterVolumeSpecName: "util") pod "bd27baf0-b0a0-4ffd-a85f-0557c98a4996" (UID: "bd27baf0-b0a0-4ffd-a85f-0557c98a4996"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:09:09 crc kubenswrapper[4903]: I1202 23:09:09.387605 4903 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bd27baf0-b0a0-4ffd-a85f-0557c98a4996-util\") on node \"crc\" DevicePath \"\"" Dec 02 23:09:09 crc kubenswrapper[4903]: I1202 23:09:09.387644 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5p8m\" (UniqueName: \"kubernetes.io/projected/bd27baf0-b0a0-4ffd-a85f-0557c98a4996-kube-api-access-t5p8m\") on node \"crc\" DevicePath \"\"" Dec 02 23:09:09 crc kubenswrapper[4903]: I1202 23:09:09.387687 4903 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bd27baf0-b0a0-4ffd-a85f-0557c98a4996-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:09:09 crc kubenswrapper[4903]: I1202 23:09:09.943167 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl" event={"ID":"bd27baf0-b0a0-4ffd-a85f-0557c98a4996","Type":"ContainerDied","Data":"10f337a99f055cc57aa0d656ca39b2756c927f1da91e1fef925c68510aae481f"} Dec 02 23:09:09 crc kubenswrapper[4903]: I1202 23:09:09.943223 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10f337a99f055cc57aa0d656ca39b2756c927f1da91e1fef925c68510aae481f" Dec 02 23:09:09 crc kubenswrapper[4903]: I1202 23:09:09.943228 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl" Dec 02 23:09:19 crc kubenswrapper[4903]: I1202 23:09:19.786825 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-bk4xc"] Dec 02 23:09:19 crc kubenswrapper[4903]: E1202 23:09:19.787515 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd27baf0-b0a0-4ffd-a85f-0557c98a4996" containerName="util" Dec 02 23:09:19 crc kubenswrapper[4903]: I1202 23:09:19.787528 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd27baf0-b0a0-4ffd-a85f-0557c98a4996" containerName="util" Dec 02 23:09:19 crc kubenswrapper[4903]: E1202 23:09:19.787540 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd27baf0-b0a0-4ffd-a85f-0557c98a4996" containerName="pull" Dec 02 23:09:19 crc kubenswrapper[4903]: I1202 23:09:19.787545 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd27baf0-b0a0-4ffd-a85f-0557c98a4996" containerName="pull" Dec 02 23:09:19 crc kubenswrapper[4903]: E1202 23:09:19.787563 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd27baf0-b0a0-4ffd-a85f-0557c98a4996" containerName="extract" Dec 02 23:09:19 crc kubenswrapper[4903]: I1202 23:09:19.787569 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd27baf0-b0a0-4ffd-a85f-0557c98a4996" containerName="extract" Dec 02 23:09:19 crc kubenswrapper[4903]: I1202 23:09:19.787673 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd27baf0-b0a0-4ffd-a85f-0557c98a4996" containerName="extract" Dec 02 23:09:19 crc kubenswrapper[4903]: I1202 23:09:19.788070 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-bk4xc" Dec 02 23:09:19 crc kubenswrapper[4903]: I1202 23:09:19.789457 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-dcrbq" Dec 02 23:09:19 crc kubenswrapper[4903]: I1202 23:09:19.790070 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Dec 02 23:09:19 crc kubenswrapper[4903]: I1202 23:09:19.790507 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Dec 02 23:09:19 crc kubenswrapper[4903]: I1202 23:09:19.800565 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-bk4xc"] Dec 02 23:09:19 crc kubenswrapper[4903]: I1202 23:09:19.817416 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkw9c\" (UniqueName: \"kubernetes.io/projected/7dec0455-1e61-4cbc-893d-600ca1526f90-kube-api-access-fkw9c\") pod \"obo-prometheus-operator-668cf9dfbb-bk4xc\" (UID: \"7dec0455-1e61-4cbc-893d-600ca1526f90\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-bk4xc" Dec 02 23:09:19 crc kubenswrapper[4903]: I1202 23:09:19.915556 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-d68788f74-2nxbp"] Dec 02 23:09:19 crc kubenswrapper[4903]: I1202 23:09:19.916193 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d68788f74-2nxbp" Dec 02 23:09:19 crc kubenswrapper[4903]: I1202 23:09:19.918344 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkw9c\" (UniqueName: \"kubernetes.io/projected/7dec0455-1e61-4cbc-893d-600ca1526f90-kube-api-access-fkw9c\") pod \"obo-prometheus-operator-668cf9dfbb-bk4xc\" (UID: \"7dec0455-1e61-4cbc-893d-600ca1526f90\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-bk4xc" Dec 02 23:09:19 crc kubenswrapper[4903]: I1202 23:09:19.919376 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Dec 02 23:09:19 crc kubenswrapper[4903]: I1202 23:09:19.919578 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-4lddw" Dec 02 23:09:19 crc kubenswrapper[4903]: I1202 23:09:19.927484 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-d68788f74-trl6v"] Dec 02 23:09:19 crc kubenswrapper[4903]: I1202 23:09:19.928274 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d68788f74-trl6v" Dec 02 23:09:19 crc kubenswrapper[4903]: I1202 23:09:19.930869 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-d68788f74-2nxbp"] Dec 02 23:09:19 crc kubenswrapper[4903]: I1202 23:09:19.955353 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkw9c\" (UniqueName: \"kubernetes.io/projected/7dec0455-1e61-4cbc-893d-600ca1526f90-kube-api-access-fkw9c\") pod \"obo-prometheus-operator-668cf9dfbb-bk4xc\" (UID: \"7dec0455-1e61-4cbc-893d-600ca1526f90\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-bk4xc" Dec 02 23:09:19 crc kubenswrapper[4903]: I1202 23:09:19.956078 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-d68788f74-trl6v"] Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.019454 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/03107dd2-f5e5-4314-87ff-89c1f03811b2-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-d68788f74-2nxbp\" (UID: \"03107dd2-f5e5-4314-87ff-89c1f03811b2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-d68788f74-2nxbp" Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.019519 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/03107dd2-f5e5-4314-87ff-89c1f03811b2-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-d68788f74-2nxbp\" (UID: \"03107dd2-f5e5-4314-87ff-89c1f03811b2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-d68788f74-2nxbp" Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.019537 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9de21c8e-1da1-4105-83c9-c3a0d3fef062-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-d68788f74-trl6v\" (UID: \"9de21c8e-1da1-4105-83c9-c3a0d3fef062\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-d68788f74-trl6v" Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.019556 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9de21c8e-1da1-4105-83c9-c3a0d3fef062-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-d68788f74-trl6v\" (UID: \"9de21c8e-1da1-4105-83c9-c3a0d3fef062\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-d68788f74-trl6v" Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.106397 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-bk4xc" Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.120874 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/03107dd2-f5e5-4314-87ff-89c1f03811b2-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-d68788f74-2nxbp\" (UID: \"03107dd2-f5e5-4314-87ff-89c1f03811b2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-d68788f74-2nxbp" Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.120925 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/03107dd2-f5e5-4314-87ff-89c1f03811b2-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-d68788f74-2nxbp\" (UID: \"03107dd2-f5e5-4314-87ff-89c1f03811b2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-d68788f74-2nxbp" Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.120945 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9de21c8e-1da1-4105-83c9-c3a0d3fef062-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-d68788f74-trl6v\" (UID: \"9de21c8e-1da1-4105-83c9-c3a0d3fef062\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-d68788f74-trl6v" Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.120969 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9de21c8e-1da1-4105-83c9-c3a0d3fef062-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-d68788f74-trl6v\" (UID: \"9de21c8e-1da1-4105-83c9-c3a0d3fef062\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-d68788f74-trl6v" Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.125810 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/03107dd2-f5e5-4314-87ff-89c1f03811b2-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-d68788f74-2nxbp\" (UID: \"03107dd2-f5e5-4314-87ff-89c1f03811b2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-d68788f74-2nxbp" Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.125827 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9de21c8e-1da1-4105-83c9-c3a0d3fef062-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-d68788f74-trl6v\" (UID: \"9de21c8e-1da1-4105-83c9-c3a0d3fef062\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-d68788f74-trl6v" Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.126143 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9de21c8e-1da1-4105-83c9-c3a0d3fef062-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-d68788f74-trl6v\" (UID: \"9de21c8e-1da1-4105-83c9-c3a0d3fef062\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-d68788f74-trl6v" Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.132061 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/03107dd2-f5e5-4314-87ff-89c1f03811b2-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-d68788f74-2nxbp\" 
(UID: \"03107dd2-f5e5-4314-87ff-89c1f03811b2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-d68788f74-2nxbp" Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.147603 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-4cm4x"] Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.148299 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-4cm4x" Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.152478 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-zkghh" Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.152605 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.166590 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-4cm4x"] Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.222844 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjrjs\" (UniqueName: \"kubernetes.io/projected/6cc0aefd-91b2-432d-8564-ab955a89620a-kube-api-access-jjrjs\") pod \"observability-operator-d8bb48f5d-4cm4x\" (UID: \"6cc0aefd-91b2-432d-8564-ab955a89620a\") " pod="openshift-operators/observability-operator-d8bb48f5d-4cm4x" Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.223124 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/6cc0aefd-91b2-432d-8564-ab955a89620a-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-4cm4x\" (UID: \"6cc0aefd-91b2-432d-8564-ab955a89620a\") " pod="openshift-operators/observability-operator-d8bb48f5d-4cm4x" Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.241945 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d68788f74-2nxbp" Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.254045 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d68788f74-trl6v" Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.321868 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-4rrnv"] Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.322452 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-4rrnv" Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.323399 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjrjs\" (UniqueName: \"kubernetes.io/projected/6cc0aefd-91b2-432d-8564-ab955a89620a-kube-api-access-jjrjs\") pod \"observability-operator-d8bb48f5d-4cm4x\" (UID: \"6cc0aefd-91b2-432d-8564-ab955a89620a\") " pod="openshift-operators/observability-operator-d8bb48f5d-4cm4x" Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.323433 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/6cc0aefd-91b2-432d-8564-ab955a89620a-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-4cm4x\" (UID: \"6cc0aefd-91b2-432d-8564-ab955a89620a\") " pod="openshift-operators/observability-operator-d8bb48f5d-4cm4x" Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.323453 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/c1730406-adf5-4f90-badf-6f40bec034eb-openshift-service-ca\") pod \"perses-operator-5446b9c989-4rrnv\" (UID: \"c1730406-adf5-4f90-badf-6f40bec034eb\") " pod="openshift-operators/perses-operator-5446b9c989-4rrnv" Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.323495 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffmf9\" (UniqueName: \"kubernetes.io/projected/c1730406-adf5-4f90-badf-6f40bec034eb-kube-api-access-ffmf9\") pod \"perses-operator-5446b9c989-4rrnv\" (UID: \"c1730406-adf5-4f90-badf-6f40bec034eb\") " pod="openshift-operators/perses-operator-5446b9c989-4rrnv" Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.324985 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-ftjv4" Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.332423 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/6cc0aefd-91b2-432d-8564-ab955a89620a-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-4cm4x\" (UID: \"6cc0aefd-91b2-432d-8564-ab955a89620a\") " pod="openshift-operators/observability-operator-d8bb48f5d-4cm4x" Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.349864 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjrjs\" (UniqueName: \"kubernetes.io/projected/6cc0aefd-91b2-432d-8564-ab955a89620a-kube-api-access-jjrjs\") pod \"observability-operator-d8bb48f5d-4cm4x\" (UID: \"6cc0aefd-91b2-432d-8564-ab955a89620a\") " pod="openshift-operators/observability-operator-d8bb48f5d-4cm4x" Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.367792 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-4rrnv"] Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.428320 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/c1730406-adf5-4f90-badf-6f40bec034eb-openshift-service-ca\") pod \"perses-operator-5446b9c989-4rrnv\" (UID: \"c1730406-adf5-4f90-badf-6f40bec034eb\") " pod="openshift-operators/perses-operator-5446b9c989-4rrnv" Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.428374 4903 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffmf9\" (UniqueName: \"kubernetes.io/projected/c1730406-adf5-4f90-badf-6f40bec034eb-kube-api-access-ffmf9\") pod \"perses-operator-5446b9c989-4rrnv\" (UID: \"c1730406-adf5-4f90-badf-6f40bec034eb\") " pod="openshift-operators/perses-operator-5446b9c989-4rrnv" Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.429884 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/c1730406-adf5-4f90-badf-6f40bec034eb-openshift-service-ca\") pod \"perses-operator-5446b9c989-4rrnv\" (UID: \"c1730406-adf5-4f90-badf-6f40bec034eb\") " pod="openshift-operators/perses-operator-5446b9c989-4rrnv" Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.446363 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffmf9\" (UniqueName: \"kubernetes.io/projected/c1730406-adf5-4f90-badf-6f40bec034eb-kube-api-access-ffmf9\") pod \"perses-operator-5446b9c989-4rrnv\" (UID: \"c1730406-adf5-4f90-badf-6f40bec034eb\") " pod="openshift-operators/perses-operator-5446b9c989-4rrnv" Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.472972 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-4cm4x" Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.515067 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-d68788f74-trl6v"] Dec 02 23:09:20 crc kubenswrapper[4903]: W1202 23:09:20.528280 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9de21c8e_1da1_4105_83c9_c3a0d3fef062.slice/crio-f08c0ee6c99549ccf06f51661b6486a0561b9408d7be8a2fe4d8313f62ea5405 WatchSource:0}: Error finding container f08c0ee6c99549ccf06f51661b6486a0561b9408d7be8a2fe4d8313f62ea5405: Status 404 returned error can't find the container with id f08c0ee6c99549ccf06f51661b6486a0561b9408d7be8a2fe4d8313f62ea5405 Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.555377 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-bk4xc"] Dec 02 23:09:20 crc kubenswrapper[4903]: W1202 23:09:20.564822 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7dec0455_1e61_4cbc_893d_600ca1526f90.slice/crio-8ad4d48b04821631cdaca50906f3c65686cf89c790273cc4c9034be3523f7597 WatchSource:0}: Error finding container 8ad4d48b04821631cdaca50906f3c65686cf89c790273cc4c9034be3523f7597: Status 404 returned error can't find the container with id 8ad4d48b04821631cdaca50906f3c65686cf89c790273cc4c9034be3523f7597 Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.576306 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-d68788f74-2nxbp"] Dec 02 23:09:20 crc kubenswrapper[4903]: W1202 23:09:20.604066 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03107dd2_f5e5_4314_87ff_89c1f03811b2.slice/crio-c1936d89839e3976ce7b13324d929c6d61b77a1d08c00f96f853e5fb38ddd207 WatchSource:0}: Error finding container c1936d89839e3976ce7b13324d929c6d61b77a1d08c00f96f853e5fb38ddd207: Status 404 returned error can't find the container with id 
c1936d89839e3976ce7b13324d929c6d61b77a1d08c00f96f853e5fb38ddd207 Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.668488 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-4rrnv" Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.733670 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-4cm4x"] Dec 02 23:09:20 crc kubenswrapper[4903]: W1202 23:09:20.764390 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cc0aefd_91b2_432d_8564_ab955a89620a.slice/crio-8140f5dea3a9db248b4476feccdab21d6b05a4cd08f791df3efdc8f2f0440d15 WatchSource:0}: Error finding container 8140f5dea3a9db248b4476feccdab21d6b05a4cd08f791df3efdc8f2f0440d15: Status 404 returned error can't find the container with id 8140f5dea3a9db248b4476feccdab21d6b05a4cd08f791df3efdc8f2f0440d15 Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.955456 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-4rrnv"] Dec 02 23:09:20 crc kubenswrapper[4903]: W1202 23:09:20.974930 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1730406_adf5_4f90_badf_6f40bec034eb.slice/crio-95c2b9ff2a163ba7246965f71ddf15bd41458afb87d18be537deacfece50e0fb WatchSource:0}: Error finding container 95c2b9ff2a163ba7246965f71ddf15bd41458afb87d18be537deacfece50e0fb: Status 404 returned error can't find the container with id 95c2b9ff2a163ba7246965f71ddf15bd41458afb87d18be537deacfece50e0fb Dec 02 23:09:20 crc kubenswrapper[4903]: I1202 23:09:20.999366 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-bk4xc" event={"ID":"7dec0455-1e61-4cbc-893d-600ca1526f90","Type":"ContainerStarted","Data":"8ad4d48b04821631cdaca50906f3c65686cf89c790273cc4c9034be3523f7597"} Dec 02 23:09:21 crc kubenswrapper[4903]: I1202 23:09:21.000391 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-4cm4x" event={"ID":"6cc0aefd-91b2-432d-8564-ab955a89620a","Type":"ContainerStarted","Data":"8140f5dea3a9db248b4476feccdab21d6b05a4cd08f791df3efdc8f2f0440d15"} Dec 02 23:09:21 crc kubenswrapper[4903]: I1202 23:09:21.001292 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d68788f74-2nxbp" event={"ID":"03107dd2-f5e5-4314-87ff-89c1f03811b2","Type":"ContainerStarted","Data":"c1936d89839e3976ce7b13324d929c6d61b77a1d08c00f96f853e5fb38ddd207"} Dec 02 23:09:21 crc kubenswrapper[4903]: I1202 23:09:21.002056 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d68788f74-trl6v" event={"ID":"9de21c8e-1da1-4105-83c9-c3a0d3fef062","Type":"ContainerStarted","Data":"f08c0ee6c99549ccf06f51661b6486a0561b9408d7be8a2fe4d8313f62ea5405"} Dec 02 23:09:21 crc kubenswrapper[4903]: I1202 23:09:21.002932 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-4rrnv" event={"ID":"c1730406-adf5-4f90-badf-6f40bec034eb","Type":"ContainerStarted","Data":"95c2b9ff2a163ba7246965f71ddf15bd41458afb87d18be537deacfece50e0fb"} Dec 02 23:09:35 crc kubenswrapper[4903]: I1202 23:09:35.120218 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-d68788f74-trl6v" event={"ID":"9de21c8e-1da1-4105-83c9-c3a0d3fef062","Type":"ContainerStarted","Data":"3f13b63c0723a4cdb4fec4d908607d1dd6bb5a34a8a029cd6de5a81b5c158f4c"} Dec 02 23:09:35 crc kubenswrapper[4903]: I1202 23:09:35.122476 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-bk4xc" event={"ID":"7dec0455-1e61-4cbc-893d-600ca1526f90","Type":"ContainerStarted","Data":"2e609893035557a3a664ccbdee85c3834af0c2d6cb1100fd4e5873dc707b0afd"} Dec 02 23:09:35 crc kubenswrapper[4903]: I1202 23:09:35.123977 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-4rrnv" event={"ID":"c1730406-adf5-4f90-badf-6f40bec034eb","Type":"ContainerStarted","Data":"cb9195bb12ae28749de08ef73fd8e59745c9e5f46f16b5d5b839a7cf52f5f2fa"} Dec 02 23:09:35 crc kubenswrapper[4903]: I1202 23:09:35.124081 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-4rrnv" Dec 02 23:09:35 crc kubenswrapper[4903]: I1202 23:09:35.125210 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-4cm4x" event={"ID":"6cc0aefd-91b2-432d-8564-ab955a89620a","Type":"ContainerStarted","Data":"f7623d5a9fe383c0e6d7c16d090cf7be73a296c9e9025745b6a5bdb6ac0e272e"} Dec 02 23:09:35 crc kubenswrapper[4903]: I1202 23:09:35.125813 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-4cm4x" Dec 02 23:09:35 crc kubenswrapper[4903]: I1202 23:09:35.127041 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d68788f74-2nxbp" event={"ID":"03107dd2-f5e5-4314-87ff-89c1f03811b2","Type":"ContainerStarted","Data":"81ec8a6af7f5d923ba0ab836a2d027f00038ae6371f3eb5ef27d4279d5d24e42"} Dec 02 23:09:35 crc kubenswrapper[4903]: I1202 23:09:35.128304 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-4cm4x" Dec 02 23:09:35 crc kubenswrapper[4903]: I1202 23:09:35.147804 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d68788f74-trl6v" podStartSLOduration=2.314799632 podStartE2EDuration="16.147789981s" podCreationTimestamp="2025-12-02 23:09:19 +0000 UTC" firstStartedPulling="2025-12-02 23:09:20.535534085 +0000 UTC m=+699.244088358" lastFinishedPulling="2025-12-02 23:09:34.368524424 +0000 UTC m=+713.077078707" observedRunningTime="2025-12-02 23:09:35.144830712 +0000 UTC m=+713.853384995" watchObservedRunningTime="2025-12-02 23:09:35.147789981 +0000 UTC m=+713.856344264" Dec 02 23:09:35 crc kubenswrapper[4903]: I1202 23:09:35.168855 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d68788f74-2nxbp" podStartSLOduration=2.406723785 podStartE2EDuration="16.168841187s" podCreationTimestamp="2025-12-02 23:09:19 +0000 UTC" firstStartedPulling="2025-12-02 23:09:20.606891273 +0000 UTC m=+699.315445556" lastFinishedPulling="2025-12-02 23:09:34.369008665 +0000 UTC m=+713.077562958" observedRunningTime="2025-12-02 23:09:35.166014691 +0000 UTC m=+713.874568974" watchObservedRunningTime="2025-12-02 23:09:35.168841187 +0000 UTC m=+713.877395470" Dec 02 23:09:35 crc kubenswrapper[4903]: I1202 
Dec 02 23:09:35 crc kubenswrapper[4903]: I1202 23:09:35.207422 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-bk4xc" podStartSLOduration=2.4121622 podStartE2EDuration="16.207407528s" podCreationTimestamp="2025-12-02 23:09:19 +0000 UTC" firstStartedPulling="2025-12-02 23:09:20.572426136 +0000 UTC m=+699.280980419" lastFinishedPulling="2025-12-02 23:09:34.367671464 +0000 UTC m=+713.076225747" observedRunningTime="2025-12-02 23:09:35.205088414 +0000 UTC m=+713.913642697" watchObservedRunningTime="2025-12-02 23:09:35.207407528 +0000 UTC m=+713.915961801" Dec 02 23:09:35 crc kubenswrapper[4903]: I1202 23:09:35.234173 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-4rrnv" podStartSLOduration=1.808141946 podStartE2EDuration="15.234153465s" podCreationTimestamp="2025-12-02 23:09:20 +0000 UTC" firstStartedPulling="2025-12-02 23:09:20.981361191 +0000 UTC m=+699.689915474" lastFinishedPulling="2025-12-02 23:09:34.40737271 +0000 UTC m=+713.115926993" observedRunningTime="2025-12-02 23:09:35.233127632 +0000 UTC m=+713.941681915" watchObservedRunningTime="2025-12-02 23:09:35.234153465 +0000 UTC m=+713.942707748" Dec 02 23:09:35 crc kubenswrapper[4903]: I1202 23:09:35.259349 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-4cm4x" podStartSLOduration=1.607074952 podStartE2EDuration="15.259332677s" podCreationTimestamp="2025-12-02 23:09:20 +0000 UTC" firstStartedPulling="2025-12-02 23:09:20.772261982 +0000 UTC m=+699.480816265" lastFinishedPulling="2025-12-02 23:09:34.424519697 +0000 UTC m=+713.133073990" observedRunningTime="2025-12-02 23:09:35.254108446 +0000 UTC m=+713.962662729" watchObservedRunningTime="2025-12-02 23:09:35.259332677 +0000 UTC m=+713.967886960" Dec 02 23:09:40 crc kubenswrapper[4903]: I1202 23:09:40.672207 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-4rrnv" Dec 02 23:09:53 crc kubenswrapper[4903]: I1202 23:09:53.070562 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:09:53 crc kubenswrapper[4903]: I1202 23:09:53.071374 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:09:59 crc kubenswrapper[4903]: I1202 23:09:59.338883 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn"] Dec 02 23:09:59 crc kubenswrapper[4903]: I1202 23:09:59.340643 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn" Dec 02 23:09:59 crc kubenswrapper[4903]: I1202 23:09:59.342860 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 02 23:09:59 crc kubenswrapper[4903]: I1202 23:09:59.354009 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn"] Dec 02 23:09:59 crc kubenswrapper[4903]: I1202 23:09:59.424872 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmx2f\" (UniqueName: \"kubernetes.io/projected/7901882b-863d-4308-8099-8a199965bdbe-kube-api-access-lmx2f\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn\" (UID: \"7901882b-863d-4308-8099-8a199965bdbe\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn" Dec 02 23:09:59 crc kubenswrapper[4903]: I1202 23:09:59.424952 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7901882b-863d-4308-8099-8a199965bdbe-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn\" (UID: \"7901882b-863d-4308-8099-8a199965bdbe\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn" Dec 02 23:09:59 crc kubenswrapper[4903]: I1202 23:09:59.425222 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7901882b-863d-4308-8099-8a199965bdbe-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn\" (UID: \"7901882b-863d-4308-8099-8a199965bdbe\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn" Dec 02 23:09:59 crc kubenswrapper[4903]: I1202 23:09:59.525902 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7901882b-863d-4308-8099-8a199965bdbe-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn\" (UID: \"7901882b-863d-4308-8099-8a199965bdbe\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn" Dec 02 23:09:59 crc kubenswrapper[4903]: I1202 23:09:59.525961 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmx2f\" (UniqueName: \"kubernetes.io/projected/7901882b-863d-4308-8099-8a199965bdbe-kube-api-access-lmx2f\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn\" (UID: \"7901882b-863d-4308-8099-8a199965bdbe\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn" Dec 02 23:09:59 crc kubenswrapper[4903]: I1202 23:09:59.525998 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7901882b-863d-4308-8099-8a199965bdbe-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn\" (UID: \"7901882b-863d-4308-8099-8a199965bdbe\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn" Dec 02 23:09:59 crc kubenswrapper[4903]: I1202 23:09:59.526528 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/7901882b-863d-4308-8099-8a199965bdbe-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn\" (UID: \"7901882b-863d-4308-8099-8a199965bdbe\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn" Dec 02 23:09:59 crc kubenswrapper[4903]: I1202 23:09:59.526553 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7901882b-863d-4308-8099-8a199965bdbe-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn\" (UID: \"7901882b-863d-4308-8099-8a199965bdbe\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn" Dec 02 23:09:59 crc kubenswrapper[4903]: I1202 23:09:59.552303 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmx2f\" (UniqueName: \"kubernetes.io/projected/7901882b-863d-4308-8099-8a199965bdbe-kube-api-access-lmx2f\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn\" (UID: \"7901882b-863d-4308-8099-8a199965bdbe\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn" Dec 02 23:09:59 crc kubenswrapper[4903]: I1202 23:09:59.664556 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn" Dec 02 23:09:59 crc kubenswrapper[4903]: I1202 23:09:59.987020 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn"] Dec 02 23:10:00 crc kubenswrapper[4903]: I1202 23:10:00.280772 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn" event={"ID":"7901882b-863d-4308-8099-8a199965bdbe","Type":"ContainerStarted","Data":"c546ccbdba2a6ee4e61debffbea403ba19031afe97cca96ad8efa1c8cba55dc2"} Dec 02 23:10:02 crc kubenswrapper[4903]: I1202 23:10:02.300024 4903 generic.go:334] "Generic (PLEG): container finished" podID="7901882b-863d-4308-8099-8a199965bdbe" containerID="066806453e341f73fa0bd070d329c35621f28be31024ca1d8d5b8be9cdc55e15" exitCode=0 Dec 02 23:10:02 crc kubenswrapper[4903]: I1202 23:10:02.300130 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn" event={"ID":"7901882b-863d-4308-8099-8a199965bdbe","Type":"ContainerDied","Data":"066806453e341f73fa0bd070d329c35621f28be31024ca1d8d5b8be9cdc55e15"} Dec 02 23:10:04 crc kubenswrapper[4903]: I1202 23:10:04.317053 4903 generic.go:334] "Generic (PLEG): container finished" podID="7901882b-863d-4308-8099-8a199965bdbe" containerID="657fd18157c880f62849ee76db06632b3ebc7343be090baeab69fa843636f5a3" exitCode=0 Dec 02 23:10:04 crc kubenswrapper[4903]: I1202 23:10:04.317115 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn" event={"ID":"7901882b-863d-4308-8099-8a199965bdbe","Type":"ContainerDied","Data":"657fd18157c880f62849ee76db06632b3ebc7343be090baeab69fa843636f5a3"} Dec 02 23:10:05 crc kubenswrapper[4903]: I1202 23:10:05.328296 4903 generic.go:334] "Generic (PLEG): container finished" podID="7901882b-863d-4308-8099-8a199965bdbe" containerID="340b8421e425a55a0061a090e271a0df986fd7d62bd6f8b2e31ccd4a62ebe002" exitCode=0 Dec 02 23:10:05 crc kubenswrapper[4903]: I1202 
23:10:05.328361 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn" event={"ID":"7901882b-863d-4308-8099-8a199965bdbe","Type":"ContainerDied","Data":"340b8421e425a55a0061a090e271a0df986fd7d62bd6f8b2e31ccd4a62ebe002"} Dec 02 23:10:06 crc kubenswrapper[4903]: I1202 23:10:06.620912 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn" Dec 02 23:10:06 crc kubenswrapper[4903]: I1202 23:10:06.731808 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7901882b-863d-4308-8099-8a199965bdbe-bundle\") pod \"7901882b-863d-4308-8099-8a199965bdbe\" (UID: \"7901882b-863d-4308-8099-8a199965bdbe\") " Dec 02 23:10:06 crc kubenswrapper[4903]: I1202 23:10:06.731853 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7901882b-863d-4308-8099-8a199965bdbe-util\") pod \"7901882b-863d-4308-8099-8a199965bdbe\" (UID: \"7901882b-863d-4308-8099-8a199965bdbe\") " Dec 02 23:10:06 crc kubenswrapper[4903]: I1202 23:10:06.731936 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmx2f\" (UniqueName: \"kubernetes.io/projected/7901882b-863d-4308-8099-8a199965bdbe-kube-api-access-lmx2f\") pod \"7901882b-863d-4308-8099-8a199965bdbe\" (UID: \"7901882b-863d-4308-8099-8a199965bdbe\") " Dec 02 23:10:06 crc kubenswrapper[4903]: I1202 23:10:06.732805 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7901882b-863d-4308-8099-8a199965bdbe-bundle" (OuterVolumeSpecName: "bundle") pod "7901882b-863d-4308-8099-8a199965bdbe" (UID: "7901882b-863d-4308-8099-8a199965bdbe"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:10:06 crc kubenswrapper[4903]: I1202 23:10:06.738367 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7901882b-863d-4308-8099-8a199965bdbe-kube-api-access-lmx2f" (OuterVolumeSpecName: "kube-api-access-lmx2f") pod "7901882b-863d-4308-8099-8a199965bdbe" (UID: "7901882b-863d-4308-8099-8a199965bdbe"). InnerVolumeSpecName "kube-api-access-lmx2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:10:06 crc kubenswrapper[4903]: I1202 23:10:06.746850 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7901882b-863d-4308-8099-8a199965bdbe-util" (OuterVolumeSpecName: "util") pod "7901882b-863d-4308-8099-8a199965bdbe" (UID: "7901882b-863d-4308-8099-8a199965bdbe"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:10:06 crc kubenswrapper[4903]: I1202 23:10:06.834577 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmx2f\" (UniqueName: \"kubernetes.io/projected/7901882b-863d-4308-8099-8a199965bdbe-kube-api-access-lmx2f\") on node \"crc\" DevicePath \"\"" Dec 02 23:10:06 crc kubenswrapper[4903]: I1202 23:10:06.834626 4903 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7901882b-863d-4308-8099-8a199965bdbe-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:10:06 crc kubenswrapper[4903]: I1202 23:10:06.834641 4903 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7901882b-863d-4308-8099-8a199965bdbe-util\") on node \"crc\" DevicePath \"\"" Dec 02 23:10:07 crc kubenswrapper[4903]: I1202 23:10:07.344894 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn" event={"ID":"7901882b-863d-4308-8099-8a199965bdbe","Type":"ContainerDied","Data":"c546ccbdba2a6ee4e61debffbea403ba19031afe97cca96ad8efa1c8cba55dc2"} Dec 02 23:10:07 crc kubenswrapper[4903]: I1202 23:10:07.344970 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c546ccbdba2a6ee4e61debffbea403ba19031afe97cca96ad8efa1c8cba55dc2" Dec 02 23:10:07 crc kubenswrapper[4903]: I1202 23:10:07.345086 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn" Dec 02 23:10:10 crc kubenswrapper[4903]: I1202 23:10:10.717224 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-gf4p6"] Dec 02 23:10:10 crc kubenswrapper[4903]: E1202 23:10:10.717634 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7901882b-863d-4308-8099-8a199965bdbe" containerName="extract" Dec 02 23:10:10 crc kubenswrapper[4903]: I1202 23:10:10.717664 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="7901882b-863d-4308-8099-8a199965bdbe" containerName="extract" Dec 02 23:10:10 crc kubenswrapper[4903]: E1202 23:10:10.717676 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7901882b-863d-4308-8099-8a199965bdbe" containerName="pull" Dec 02 23:10:10 crc kubenswrapper[4903]: I1202 23:10:10.717681 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="7901882b-863d-4308-8099-8a199965bdbe" containerName="pull" Dec 02 23:10:10 crc kubenswrapper[4903]: E1202 23:10:10.717692 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7901882b-863d-4308-8099-8a199965bdbe" containerName="util" Dec 02 23:10:10 crc kubenswrapper[4903]: I1202 23:10:10.717698 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="7901882b-863d-4308-8099-8a199965bdbe" containerName="util" Dec 02 23:10:10 crc kubenswrapper[4903]: I1202 23:10:10.717798 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="7901882b-863d-4308-8099-8a199965bdbe" containerName="extract" Dec 02 23:10:10 crc kubenswrapper[4903]: I1202 23:10:10.718171 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-gf4p6" Dec 02 23:10:10 crc kubenswrapper[4903]: I1202 23:10:10.720085 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 02 23:10:10 crc kubenswrapper[4903]: I1202 23:10:10.720712 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-7bnmp" Dec 02 23:10:10 crc kubenswrapper[4903]: I1202 23:10:10.722092 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 02 23:10:10 crc kubenswrapper[4903]: I1202 23:10:10.733516 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-gf4p6"] Dec 02 23:10:10 crc kubenswrapper[4903]: I1202 23:10:10.895461 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mldg\" (UniqueName: \"kubernetes.io/projected/524a5581-af2a-48b9-abd3-2f7c2d046b83-kube-api-access-9mldg\") pod \"nmstate-operator-5b5b58f5c8-gf4p6\" (UID: \"524a5581-af2a-48b9-abd3-2f7c2d046b83\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-gf4p6" Dec 02 23:10:10 crc kubenswrapper[4903]: I1202 23:10:10.996533 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mldg\" (UniqueName: \"kubernetes.io/projected/524a5581-af2a-48b9-abd3-2f7c2d046b83-kube-api-access-9mldg\") pod \"nmstate-operator-5b5b58f5c8-gf4p6\" (UID: \"524a5581-af2a-48b9-abd3-2f7c2d046b83\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-gf4p6" Dec 02 23:10:11 crc kubenswrapper[4903]: I1202 23:10:11.020485 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mldg\" (UniqueName: \"kubernetes.io/projected/524a5581-af2a-48b9-abd3-2f7c2d046b83-kube-api-access-9mldg\") pod \"nmstate-operator-5b5b58f5c8-gf4p6\" (UID: \"524a5581-af2a-48b9-abd3-2f7c2d046b83\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-gf4p6" Dec 02 23:10:11 crc kubenswrapper[4903]: I1202 23:10:11.031230 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-gf4p6" Dec 02 23:10:11 crc kubenswrapper[4903]: I1202 23:10:11.260086 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-gf4p6"] Dec 02 23:10:11 crc kubenswrapper[4903]: W1202 23:10:11.273342 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod524a5581_af2a_48b9_abd3_2f7c2d046b83.slice/crio-e06bbb3d8d789ea7e860cddf1b22378aef729a69a8491c65538fb756f78ee886 WatchSource:0}: Error finding container e06bbb3d8d789ea7e860cddf1b22378aef729a69a8491c65538fb756f78ee886: Status 404 returned error can't find the container with id e06bbb3d8d789ea7e860cddf1b22378aef729a69a8491c65538fb756f78ee886 Dec 02 23:10:11 crc kubenswrapper[4903]: I1202 23:10:11.373121 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-gf4p6" event={"ID":"524a5581-af2a-48b9-abd3-2f7c2d046b83","Type":"ContainerStarted","Data":"e06bbb3d8d789ea7e860cddf1b22378aef729a69a8491c65538fb756f78ee886"} Dec 02 23:10:14 crc kubenswrapper[4903]: I1202 23:10:14.223116 4903 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 02 23:10:14 crc kubenswrapper[4903]: I1202 23:10:14.398754 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-gf4p6" event={"ID":"524a5581-af2a-48b9-abd3-2f7c2d046b83","Type":"ContainerStarted","Data":"593b1e6fd3b5b1e7ba3964955c57083f5b11caa8d037d360a9440bb9de7c2849"} Dec 02 23:10:14 crc kubenswrapper[4903]: I1202 23:10:14.420320 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-gf4p6" podStartSLOduration=2.392522887 podStartE2EDuration="4.420304763s" podCreationTimestamp="2025-12-02 23:10:10 +0000 UTC" firstStartedPulling="2025-12-02 23:10:11.276969397 +0000 UTC m=+749.985523720" lastFinishedPulling="2025-12-02 23:10:13.304751293 +0000 UTC m=+752.013305596" observedRunningTime="2025-12-02 23:10:14.415604696 +0000 UTC m=+753.124158969" watchObservedRunningTime="2025-12-02 23:10:14.420304763 +0000 UTC m=+753.128859046" Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.724066 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-8qx6t"] Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.726052 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8qx6t" Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.732362 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-kdkmq" Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.748030 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-5lv4b"] Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.749207 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd6nn\" (UniqueName: \"kubernetes.io/projected/09e08c1d-fdea-4255-accb-8c957d34cfa3-kube-api-access-jd6nn\") pod \"nmstate-metrics-7f946cbc9-8qx6t\" (UID: \"09e08c1d-fdea-4255-accb-8c957d34cfa3\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8qx6t" Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.759415 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-5lv4b" Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.763906 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.772591 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-smkbq"] Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.773493 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-smkbq" Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.774133 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-8qx6t"] Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.791150 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-5lv4b"] Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.850675 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d56s\" (UniqueName: \"kubernetes.io/projected/74c7c4ac-2173-497a-b630-d905326c4749-kube-api-access-2d56s\") pod \"nmstate-webhook-5f6d4c5ccb-5lv4b\" (UID: \"74c7c4ac-2173-497a-b630-d905326c4749\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-5lv4b" Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.850905 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/e904a523-8784-443d-b994-bb1aa11e45f4-dbus-socket\") pod \"nmstate-handler-smkbq\" (UID: \"e904a523-8784-443d-b994-bb1aa11e45f4\") " pod="openshift-nmstate/nmstate-handler-smkbq" Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.850941 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/74c7c4ac-2173-497a-b630-d905326c4749-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-5lv4b\" (UID: \"74c7c4ac-2173-497a-b630-d905326c4749\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-5lv4b" Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.850965 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq4ld\" (UniqueName: \"kubernetes.io/projected/e904a523-8784-443d-b994-bb1aa11e45f4-kube-api-access-zq4ld\") pod \"nmstate-handler-smkbq\" (UID: 
\"e904a523-8784-443d-b994-bb1aa11e45f4\") " pod="openshift-nmstate/nmstate-handler-smkbq" Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.851142 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/e904a523-8784-443d-b994-bb1aa11e45f4-ovs-socket\") pod \"nmstate-handler-smkbq\" (UID: \"e904a523-8784-443d-b994-bb1aa11e45f4\") " pod="openshift-nmstate/nmstate-handler-smkbq" Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.851205 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd6nn\" (UniqueName: \"kubernetes.io/projected/09e08c1d-fdea-4255-accb-8c957d34cfa3-kube-api-access-jd6nn\") pod \"nmstate-metrics-7f946cbc9-8qx6t\" (UID: \"09e08c1d-fdea-4255-accb-8c957d34cfa3\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8qx6t" Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.851230 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/e904a523-8784-443d-b994-bb1aa11e45f4-nmstate-lock\") pod \"nmstate-handler-smkbq\" (UID: \"e904a523-8784-443d-b994-bb1aa11e45f4\") " pod="openshift-nmstate/nmstate-handler-smkbq" Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.864214 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-wwvz6"] Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.865170 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-wwvz6" Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.867004 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.867081 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.876779 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-xv5sw" Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.884277 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd6nn\" (UniqueName: \"kubernetes.io/projected/09e08c1d-fdea-4255-accb-8c957d34cfa3-kube-api-access-jd6nn\") pod \"nmstate-metrics-7f946cbc9-8qx6t\" (UID: \"09e08c1d-fdea-4255-accb-8c957d34cfa3\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8qx6t" Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.886091 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-wwvz6"] Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.951976 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/e904a523-8784-443d-b994-bb1aa11e45f4-ovs-socket\") pod \"nmstate-handler-smkbq\" (UID: \"e904a523-8784-443d-b994-bb1aa11e45f4\") " pod="openshift-nmstate/nmstate-handler-smkbq" Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.952016 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/e904a523-8784-443d-b994-bb1aa11e45f4-nmstate-lock\") pod \"nmstate-handler-smkbq\" (UID: \"e904a523-8784-443d-b994-bb1aa11e45f4\") " pod="openshift-nmstate/nmstate-handler-smkbq" Dec 02 
23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.952042 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d56s\" (UniqueName: \"kubernetes.io/projected/74c7c4ac-2173-497a-b630-d905326c4749-kube-api-access-2d56s\") pod \"nmstate-webhook-5f6d4c5ccb-5lv4b\" (UID: \"74c7c4ac-2173-497a-b630-d905326c4749\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-5lv4b" Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.952056 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/e904a523-8784-443d-b994-bb1aa11e45f4-dbus-socket\") pod \"nmstate-handler-smkbq\" (UID: \"e904a523-8784-443d-b994-bb1aa11e45f4\") " pod="openshift-nmstate/nmstate-handler-smkbq" Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.952078 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2smh\" (UniqueName: \"kubernetes.io/projected/ef8f8d87-b435-4583-aa22-2e43892ce34b-kube-api-access-k2smh\") pod \"nmstate-console-plugin-7fbb5f6569-wwvz6\" (UID: \"ef8f8d87-b435-4583-aa22-2e43892ce34b\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-wwvz6" Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.952103 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/74c7c4ac-2173-497a-b630-d905326c4749-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-5lv4b\" (UID: \"74c7c4ac-2173-497a-b630-d905326c4749\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-5lv4b" Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.952107 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/e904a523-8784-443d-b994-bb1aa11e45f4-ovs-socket\") pod \"nmstate-handler-smkbq\" (UID: \"e904a523-8784-443d-b994-bb1aa11e45f4\") " pod="openshift-nmstate/nmstate-handler-smkbq" Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.952108 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/e904a523-8784-443d-b994-bb1aa11e45f4-nmstate-lock\") pod \"nmstate-handler-smkbq\" (UID: \"e904a523-8784-443d-b994-bb1aa11e45f4\") " pod="openshift-nmstate/nmstate-handler-smkbq" Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.952172 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq4ld\" (UniqueName: \"kubernetes.io/projected/e904a523-8784-443d-b994-bb1aa11e45f4-kube-api-access-zq4ld\") pod \"nmstate-handler-smkbq\" (UID: \"e904a523-8784-443d-b994-bb1aa11e45f4\") " pod="openshift-nmstate/nmstate-handler-smkbq" Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.952260 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ef8f8d87-b435-4583-aa22-2e43892ce34b-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-wwvz6\" (UID: \"ef8f8d87-b435-4583-aa22-2e43892ce34b\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-wwvz6" Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.952289 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef8f8d87-b435-4583-aa22-2e43892ce34b-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-wwvz6\" (UID: 
\"ef8f8d87-b435-4583-aa22-2e43892ce34b\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-wwvz6" Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.952402 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/e904a523-8784-443d-b994-bb1aa11e45f4-dbus-socket\") pod \"nmstate-handler-smkbq\" (UID: \"e904a523-8784-443d-b994-bb1aa11e45f4\") " pod="openshift-nmstate/nmstate-handler-smkbq" Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.959805 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/74c7c4ac-2173-497a-b630-d905326c4749-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-5lv4b\" (UID: \"74c7c4ac-2173-497a-b630-d905326c4749\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-5lv4b" Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.967924 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq4ld\" (UniqueName: \"kubernetes.io/projected/e904a523-8784-443d-b994-bb1aa11e45f4-kube-api-access-zq4ld\") pod \"nmstate-handler-smkbq\" (UID: \"e904a523-8784-443d-b994-bb1aa11e45f4\") " pod="openshift-nmstate/nmstate-handler-smkbq" Dec 02 23:10:20 crc kubenswrapper[4903]: I1202 23:10:20.970030 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d56s\" (UniqueName: \"kubernetes.io/projected/74c7c4ac-2173-497a-b630-d905326c4749-kube-api-access-2d56s\") pod \"nmstate-webhook-5f6d4c5ccb-5lv4b\" (UID: \"74c7c4ac-2173-497a-b630-d905326c4749\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-5lv4b" Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.051977 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-597d46d97d-n4dg6"] Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.052839 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-597d46d97d-n4dg6" Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.052998 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef8f8d87-b435-4583-aa22-2e43892ce34b-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-wwvz6\" (UID: \"ef8f8d87-b435-4583-aa22-2e43892ce34b\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-wwvz6" Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.053080 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2smh\" (UniqueName: \"kubernetes.io/projected/ef8f8d87-b435-4583-aa22-2e43892ce34b-kube-api-access-k2smh\") pod \"nmstate-console-plugin-7fbb5f6569-wwvz6\" (UID: \"ef8f8d87-b435-4583-aa22-2e43892ce34b\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-wwvz6" Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.053125 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ef8f8d87-b435-4583-aa22-2e43892ce34b-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-wwvz6\" (UID: \"ef8f8d87-b435-4583-aa22-2e43892ce34b\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-wwvz6" Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.054047 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ef8f8d87-b435-4583-aa22-2e43892ce34b-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-wwvz6\" (UID: \"ef8f8d87-b435-4583-aa22-2e43892ce34b\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-wwvz6" Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.057909 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef8f8d87-b435-4583-aa22-2e43892ce34b-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-wwvz6\" (UID: \"ef8f8d87-b435-4583-aa22-2e43892ce34b\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-wwvz6" Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.060416 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-597d46d97d-n4dg6"] Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.076032 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8qx6t" Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.076977 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2smh\" (UniqueName: \"kubernetes.io/projected/ef8f8d87-b435-4583-aa22-2e43892ce34b-kube-api-access-k2smh\") pod \"nmstate-console-plugin-7fbb5f6569-wwvz6\" (UID: \"ef8f8d87-b435-4583-aa22-2e43892ce34b\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-wwvz6" Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.101078 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-5lv4b" Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.120030 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-smkbq" Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.154438 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/397855ce-d44e-4a42-8209-53938c9b4c2b-trusted-ca-bundle\") pod \"console-597d46d97d-n4dg6\" (UID: \"397855ce-d44e-4a42-8209-53938c9b4c2b\") " pod="openshift-console/console-597d46d97d-n4dg6" Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.154678 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/397855ce-d44e-4a42-8209-53938c9b4c2b-console-oauth-config\") pod \"console-597d46d97d-n4dg6\" (UID: \"397855ce-d44e-4a42-8209-53938c9b4c2b\") " pod="openshift-console/console-597d46d97d-n4dg6" Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.154713 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/397855ce-d44e-4a42-8209-53938c9b4c2b-service-ca\") pod \"console-597d46d97d-n4dg6\" (UID: \"397855ce-d44e-4a42-8209-53938c9b4c2b\") " pod="openshift-console/console-597d46d97d-n4dg6" Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.154758 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/397855ce-d44e-4a42-8209-53938c9b4c2b-console-serving-cert\") pod \"console-597d46d97d-n4dg6\" (UID: \"397855ce-d44e-4a42-8209-53938c9b4c2b\") " pod="openshift-console/console-597d46d97d-n4dg6" Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.154792 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/397855ce-d44e-4a42-8209-53938c9b4c2b-console-config\") pod \"console-597d46d97d-n4dg6\" (UID: \"397855ce-d44e-4a42-8209-53938c9b4c2b\") " pod="openshift-console/console-597d46d97d-n4dg6" Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.154827 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/397855ce-d44e-4a42-8209-53938c9b4c2b-oauth-serving-cert\") pod \"console-597d46d97d-n4dg6\" (UID: \"397855ce-d44e-4a42-8209-53938c9b4c2b\") " pod="openshift-console/console-597d46d97d-n4dg6" Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.154853 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdnkm\" (UniqueName: \"kubernetes.io/projected/397855ce-d44e-4a42-8209-53938c9b4c2b-kube-api-access-qdnkm\") pod \"console-597d46d97d-n4dg6\" (UID: \"397855ce-d44e-4a42-8209-53938c9b4c2b\") " pod="openshift-console/console-597d46d97d-n4dg6" Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.180506 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-wwvz6" Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.256107 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/397855ce-d44e-4a42-8209-53938c9b4c2b-console-serving-cert\") pod \"console-597d46d97d-n4dg6\" (UID: \"397855ce-d44e-4a42-8209-53938c9b4c2b\") " pod="openshift-console/console-597d46d97d-n4dg6" Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.256149 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/397855ce-d44e-4a42-8209-53938c9b4c2b-console-config\") pod \"console-597d46d97d-n4dg6\" (UID: \"397855ce-d44e-4a42-8209-53938c9b4c2b\") " pod="openshift-console/console-597d46d97d-n4dg6" Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.256176 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/397855ce-d44e-4a42-8209-53938c9b4c2b-oauth-serving-cert\") pod \"console-597d46d97d-n4dg6\" (UID: \"397855ce-d44e-4a42-8209-53938c9b4c2b\") " pod="openshift-console/console-597d46d97d-n4dg6" Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.256194 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdnkm\" (UniqueName: \"kubernetes.io/projected/397855ce-d44e-4a42-8209-53938c9b4c2b-kube-api-access-qdnkm\") pod \"console-597d46d97d-n4dg6\" (UID: \"397855ce-d44e-4a42-8209-53938c9b4c2b\") " pod="openshift-console/console-597d46d97d-n4dg6" Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.256227 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/397855ce-d44e-4a42-8209-53938c9b4c2b-trusted-ca-bundle\") pod \"console-597d46d97d-n4dg6\" (UID: \"397855ce-d44e-4a42-8209-53938c9b4c2b\") " pod="openshift-console/console-597d46d97d-n4dg6" Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.256252 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/397855ce-d44e-4a42-8209-53938c9b4c2b-console-oauth-config\") pod \"console-597d46d97d-n4dg6\" (UID: \"397855ce-d44e-4a42-8209-53938c9b4c2b\") " pod="openshift-console/console-597d46d97d-n4dg6" Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.256272 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/397855ce-d44e-4a42-8209-53938c9b4c2b-service-ca\") pod \"console-597d46d97d-n4dg6\" (UID: \"397855ce-d44e-4a42-8209-53938c9b4c2b\") " pod="openshift-console/console-597d46d97d-n4dg6" Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.257571 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/397855ce-d44e-4a42-8209-53938c9b4c2b-console-config\") pod \"console-597d46d97d-n4dg6\" (UID: \"397855ce-d44e-4a42-8209-53938c9b4c2b\") " pod="openshift-console/console-597d46d97d-n4dg6" Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.257594 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/397855ce-d44e-4a42-8209-53938c9b4c2b-service-ca\") pod \"console-597d46d97d-n4dg6\" (UID: \"397855ce-d44e-4a42-8209-53938c9b4c2b\") " 
pod="openshift-console/console-597d46d97d-n4dg6" Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.258140 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/397855ce-d44e-4a42-8209-53938c9b4c2b-oauth-serving-cert\") pod \"console-597d46d97d-n4dg6\" (UID: \"397855ce-d44e-4a42-8209-53938c9b4c2b\") " pod="openshift-console/console-597d46d97d-n4dg6" Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.258319 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/397855ce-d44e-4a42-8209-53938c9b4c2b-trusted-ca-bundle\") pod \"console-597d46d97d-n4dg6\" (UID: \"397855ce-d44e-4a42-8209-53938c9b4c2b\") " pod="openshift-console/console-597d46d97d-n4dg6" Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.260634 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/397855ce-d44e-4a42-8209-53938c9b4c2b-console-serving-cert\") pod \"console-597d46d97d-n4dg6\" (UID: \"397855ce-d44e-4a42-8209-53938c9b4c2b\") " pod="openshift-console/console-597d46d97d-n4dg6" Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.264205 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/397855ce-d44e-4a42-8209-53938c9b4c2b-console-oauth-config\") pod \"console-597d46d97d-n4dg6\" (UID: \"397855ce-d44e-4a42-8209-53938c9b4c2b\") " pod="openshift-console/console-597d46d97d-n4dg6" Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.272809 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdnkm\" (UniqueName: \"kubernetes.io/projected/397855ce-d44e-4a42-8209-53938c9b4c2b-kube-api-access-qdnkm\") pod \"console-597d46d97d-n4dg6\" (UID: \"397855ce-d44e-4a42-8209-53938c9b4c2b\") " pod="openshift-console/console-597d46d97d-n4dg6" Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.343986 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-5lv4b"] Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.380953 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-wwvz6"] Dec 02 23:10:21 crc kubenswrapper[4903]: W1202 23:10:21.385344 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef8f8d87_b435_4583_aa22_2e43892ce34b.slice/crio-e738b9792cd36ac9876a5e089eb0152e92062279b5bf7390d88a0346fc396431 WatchSource:0}: Error finding container e738b9792cd36ac9876a5e089eb0152e92062279b5bf7390d88a0346fc396431: Status 404 returned error can't find the container with id e738b9792cd36ac9876a5e089eb0152e92062279b5bf7390d88a0346fc396431 Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.428129 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-597d46d97d-n4dg6" Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.459385 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-5lv4b" event={"ID":"74c7c4ac-2173-497a-b630-d905326c4749","Type":"ContainerStarted","Data":"c48192ee66173480c3856daeeb4f437abc9ef859953d201a413373fcbe683e2b"} Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.460074 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-wwvz6" event={"ID":"ef8f8d87-b435-4583-aa22-2e43892ce34b","Type":"ContainerStarted","Data":"e738b9792cd36ac9876a5e089eb0152e92062279b5bf7390d88a0346fc396431"} Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.460903 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-smkbq" event={"ID":"e904a523-8784-443d-b994-bb1aa11e45f4","Type":"ContainerStarted","Data":"511c61737e5ec33aa2804161f842e48129fd3a8c3a2e7afd0b973a444a7ce652"} Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.478860 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-8qx6t"] Dec 02 23:10:21 crc kubenswrapper[4903]: W1202 23:10:21.484292 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09e08c1d_fdea_4255_accb_8c957d34cfa3.slice/crio-4dc66738d76f7a2f74553b8d2661e8f14fad7a52005a9eb2ca395861afc5afdf WatchSource:0}: Error finding container 4dc66738d76f7a2f74553b8d2661e8f14fad7a52005a9eb2ca395861afc5afdf: Status 404 returned error can't find the container with id 4dc66738d76f7a2f74553b8d2661e8f14fad7a52005a9eb2ca395861afc5afdf Dec 02 23:10:21 crc kubenswrapper[4903]: I1202 23:10:21.650273 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-597d46d97d-n4dg6"] Dec 02 23:10:22 crc kubenswrapper[4903]: I1202 23:10:22.475329 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8qx6t" event={"ID":"09e08c1d-fdea-4255-accb-8c957d34cfa3","Type":"ContainerStarted","Data":"4dc66738d76f7a2f74553b8d2661e8f14fad7a52005a9eb2ca395861afc5afdf"} Dec 02 23:10:22 crc kubenswrapper[4903]: I1202 23:10:22.479572 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-597d46d97d-n4dg6" event={"ID":"397855ce-d44e-4a42-8209-53938c9b4c2b","Type":"ContainerStarted","Data":"9296826d95b5ad588c06d69b4f140d0cf094597c96fd00a702d305dcf45be5c8"} Dec 02 23:10:22 crc kubenswrapper[4903]: I1202 23:10:22.479685 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-597d46d97d-n4dg6" event={"ID":"397855ce-d44e-4a42-8209-53938c9b4c2b","Type":"ContainerStarted","Data":"00942c460a873fd37785e9fe363645ca3dbd5f92fe8169084e003f70b2d3d711"} Dec 02 23:10:22 crc kubenswrapper[4903]: I1202 23:10:22.503196 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-597d46d97d-n4dg6" podStartSLOduration=1.5031676470000002 podStartE2EDuration="1.503167647s" podCreationTimestamp="2025-12-02 23:10:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:10:22.500120483 +0000 UTC m=+761.208674806" watchObservedRunningTime="2025-12-02 23:10:22.503167647 +0000 UTC m=+761.211721980" Dec 02 23:10:23 crc kubenswrapper[4903]: I1202 23:10:23.071129 4903 
patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:10:23 crc kubenswrapper[4903]: I1202 23:10:23.071598 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:10:24 crc kubenswrapper[4903]: I1202 23:10:24.497224 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-wwvz6" event={"ID":"ef8f8d87-b435-4583-aa22-2e43892ce34b","Type":"ContainerStarted","Data":"fd89ea65eeb6b1f42e5c0cd9603e156917872169f6be64efb35a926c35181095"} Dec 02 23:10:24 crc kubenswrapper[4903]: I1202 23:10:24.499096 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8qx6t" event={"ID":"09e08c1d-fdea-4255-accb-8c957d34cfa3","Type":"ContainerStarted","Data":"a8df24680a09f52f6114dc352409aa0f4c9c31d3a5de341f4d866842b3903f7f"} Dec 02 23:10:24 crc kubenswrapper[4903]: I1202 23:10:24.500360 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-5lv4b" event={"ID":"74c7c4ac-2173-497a-b630-d905326c4749","Type":"ContainerStarted","Data":"f5ebe884367c4ce55d267804ca3dac7dfc0927699c5d5cfb75c67726f5b87d30"} Dec 02 23:10:24 crc kubenswrapper[4903]: I1202 23:10:24.500534 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-5lv4b" Dec 02 23:10:24 crc kubenswrapper[4903]: I1202 23:10:24.515026 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-wwvz6" podStartSLOduration=1.7049930739999999 podStartE2EDuration="4.515008023s" podCreationTimestamp="2025-12-02 23:10:20 +0000 UTC" firstStartedPulling="2025-12-02 23:10:21.386895125 +0000 UTC m=+760.095449408" lastFinishedPulling="2025-12-02 23:10:24.196910074 +0000 UTC m=+762.905464357" observedRunningTime="2025-12-02 23:10:24.511515234 +0000 UTC m=+763.220069567" watchObservedRunningTime="2025-12-02 23:10:24.515008023 +0000 UTC m=+763.223562316" Dec 02 23:10:24 crc kubenswrapper[4903]: I1202 23:10:24.538346 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-5lv4b" podStartSLOduration=1.680107838 podStartE2EDuration="4.538328239s" podCreationTimestamp="2025-12-02 23:10:20 +0000 UTC" firstStartedPulling="2025-12-02 23:10:21.348550984 +0000 UTC m=+760.057105267" lastFinishedPulling="2025-12-02 23:10:24.206771385 +0000 UTC m=+762.915325668" observedRunningTime="2025-12-02 23:10:24.537843163 +0000 UTC m=+763.246397466" watchObservedRunningTime="2025-12-02 23:10:24.538328239 +0000 UTC m=+763.246882532" Dec 02 23:10:25 crc kubenswrapper[4903]: I1202 23:10:25.507435 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-smkbq" event={"ID":"e904a523-8784-443d-b994-bb1aa11e45f4","Type":"ContainerStarted","Data":"00a7fa186048a00764e0b659dadcc6b794d6e788e06c28f04a223f0b8dbc87a9"} Dec 02 23:10:25 crc kubenswrapper[4903]: I1202 23:10:25.532478 4903 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-smkbq" podStartSLOduration=2.482981397 podStartE2EDuration="5.532461183s" podCreationTimestamp="2025-12-02 23:10:20 +0000 UTC" firstStartedPulling="2025-12-02 23:10:21.156246744 +0000 UTC m=+759.864801027" lastFinishedPulling="2025-12-02 23:10:24.20572653 +0000 UTC m=+762.914280813" observedRunningTime="2025-12-02 23:10:25.527721435 +0000 UTC m=+764.236275768" watchObservedRunningTime="2025-12-02 23:10:25.532461183 +0000 UTC m=+764.241015476" Dec 02 23:10:26 crc kubenswrapper[4903]: I1202 23:10:26.121234 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-smkbq" Dec 02 23:10:27 crc kubenswrapper[4903]: I1202 23:10:27.525646 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8qx6t" event={"ID":"09e08c1d-fdea-4255-accb-8c957d34cfa3","Type":"ContainerStarted","Data":"d5d187543ef56d2d425e452b35663a4708cf059ce7c2a4a42015f6c0a6467a60"} Dec 02 23:10:31 crc kubenswrapper[4903]: I1202 23:10:31.253722 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-smkbq" Dec 02 23:10:31 crc kubenswrapper[4903]: I1202 23:10:31.274548 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8qx6t" podStartSLOduration=6.371837177 podStartE2EDuration="11.274532415s" podCreationTimestamp="2025-12-02 23:10:20 +0000 UTC" firstStartedPulling="2025-12-02 23:10:21.486601371 +0000 UTC m=+760.195155654" lastFinishedPulling="2025-12-02 23:10:26.389296609 +0000 UTC m=+765.097850892" observedRunningTime="2025-12-02 23:10:27.549473944 +0000 UTC m=+766.258028267" watchObservedRunningTime="2025-12-02 23:10:31.274532415 +0000 UTC m=+769.983086698" Dec 02 23:10:31 crc kubenswrapper[4903]: I1202 23:10:31.428719 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-597d46d97d-n4dg6" Dec 02 23:10:31 crc kubenswrapper[4903]: I1202 23:10:31.428812 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-597d46d97d-n4dg6" Dec 02 23:10:31 crc kubenswrapper[4903]: I1202 23:10:31.435909 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-597d46d97d-n4dg6" Dec 02 23:10:31 crc kubenswrapper[4903]: I1202 23:10:31.560763 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-597d46d97d-n4dg6" Dec 02 23:10:31 crc kubenswrapper[4903]: I1202 23:10:31.661338 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-tjt6x"] Dec 02 23:10:41 crc kubenswrapper[4903]: I1202 23:10:41.110279 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-5lv4b" Dec 02 23:10:53 crc kubenswrapper[4903]: I1202 23:10:53.070469 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:10:53 crc kubenswrapper[4903]: I1202 23:10:53.071191 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" 
podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:10:53 crc kubenswrapper[4903]: I1202 23:10:53.071261 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" Dec 02 23:10:53 crc kubenswrapper[4903]: I1202 23:10:53.072210 4903 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0b28e2dda158a594dc69e178bcdaca348a0340d9ffea7ad9bb02c01c43d2b522"} pod="openshift-machine-config-operator/machine-config-daemon-snl4q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 23:10:53 crc kubenswrapper[4903]: I1202 23:10:53.072325 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" containerID="cri-o://0b28e2dda158a594dc69e178bcdaca348a0340d9ffea7ad9bb02c01c43d2b522" gracePeriod=600 Dec 02 23:10:53 crc kubenswrapper[4903]: I1202 23:10:53.712368 4903 generic.go:334] "Generic (PLEG): container finished" podID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerID="0b28e2dda158a594dc69e178bcdaca348a0340d9ffea7ad9bb02c01c43d2b522" exitCode=0 Dec 02 23:10:53 crc kubenswrapper[4903]: I1202 23:10:53.712483 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerDied","Data":"0b28e2dda158a594dc69e178bcdaca348a0340d9ffea7ad9bb02c01c43d2b522"} Dec 02 23:10:53 crc kubenswrapper[4903]: I1202 23:10:53.713000 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerStarted","Data":"b388d9306dbbf4e8d5e39ed68ca73568f5d6116247546f21f87b8605b912cb3c"} Dec 02 23:10:53 crc kubenswrapper[4903]: I1202 23:10:53.713027 4903 scope.go:117] "RemoveContainer" containerID="8cabb628dd17c6e63215bea49dda3d3c1e8e0058056824e16595b5a143ecbd3e" Dec 02 23:10:56 crc kubenswrapper[4903]: I1202 23:10:56.713781 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-tjt6x" podUID="d131fd77-f36a-4c9a-8578-5b2c62e5d356" containerName="console" containerID="cri-o://5cc31766fc632e6393c370a36959eda5aa888d27fc1301951351c6ec5aad1f48" gracePeriod=15 Dec 02 23:10:57 crc kubenswrapper[4903]: I1202 23:10:57.605478 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-tjt6x_d131fd77-f36a-4c9a-8578-5b2c62e5d356/console/0.log" Dec 02 23:10:57 crc kubenswrapper[4903]: I1202 23:10:57.605545 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-tjt6x" Dec 02 23:10:57 crc kubenswrapper[4903]: I1202 23:10:57.746249 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-tjt6x_d131fd77-f36a-4c9a-8578-5b2c62e5d356/console/0.log" Dec 02 23:10:57 crc kubenswrapper[4903]: I1202 23:10:57.746293 4903 generic.go:334] "Generic (PLEG): container finished" podID="d131fd77-f36a-4c9a-8578-5b2c62e5d356" containerID="5cc31766fc632e6393c370a36959eda5aa888d27fc1301951351c6ec5aad1f48" exitCode=2 Dec 02 23:10:57 crc kubenswrapper[4903]: I1202 23:10:57.746318 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-tjt6x" event={"ID":"d131fd77-f36a-4c9a-8578-5b2c62e5d356","Type":"ContainerDied","Data":"5cc31766fc632e6393c370a36959eda5aa888d27fc1301951351c6ec5aad1f48"} Dec 02 23:10:57 crc kubenswrapper[4903]: I1202 23:10:57.746343 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-tjt6x" event={"ID":"d131fd77-f36a-4c9a-8578-5b2c62e5d356","Type":"ContainerDied","Data":"be6e2815e91f92633317158e5e37dbb6d4a4a6f7662e148ecbdd15bb5eb50caa"} Dec 02 23:10:57 crc kubenswrapper[4903]: I1202 23:10:57.746358 4903 scope.go:117] "RemoveContainer" containerID="5cc31766fc632e6393c370a36959eda5aa888d27fc1301951351c6ec5aad1f48" Dec 02 23:10:57 crc kubenswrapper[4903]: I1202 23:10:57.746406 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-tjt6x" Dec 02 23:10:57 crc kubenswrapper[4903]: I1202 23:10:57.769389 4903 scope.go:117] "RemoveContainer" containerID="5cc31766fc632e6393c370a36959eda5aa888d27fc1301951351c6ec5aad1f48" Dec 02 23:10:57 crc kubenswrapper[4903]: E1202 23:10:57.770107 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cc31766fc632e6393c370a36959eda5aa888d27fc1301951351c6ec5aad1f48\": container with ID starting with 5cc31766fc632e6393c370a36959eda5aa888d27fc1301951351c6ec5aad1f48 not found: ID does not exist" containerID="5cc31766fc632e6393c370a36959eda5aa888d27fc1301951351c6ec5aad1f48" Dec 02 23:10:57 crc kubenswrapper[4903]: I1202 23:10:57.770157 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cc31766fc632e6393c370a36959eda5aa888d27fc1301951351c6ec5aad1f48"} err="failed to get container status \"5cc31766fc632e6393c370a36959eda5aa888d27fc1301951351c6ec5aad1f48\": rpc error: code = NotFound desc = could not find container \"5cc31766fc632e6393c370a36959eda5aa888d27fc1301951351c6ec5aad1f48\": container with ID starting with 5cc31766fc632e6393c370a36959eda5aa888d27fc1301951351c6ec5aad1f48 not found: ID does not exist" Dec 02 23:10:57 crc kubenswrapper[4903]: I1202 23:10:57.794144 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d131fd77-f36a-4c9a-8578-5b2c62e5d356-service-ca\") pod \"d131fd77-f36a-4c9a-8578-5b2c62e5d356\" (UID: \"d131fd77-f36a-4c9a-8578-5b2c62e5d356\") " Dec 02 23:10:57 crc kubenswrapper[4903]: I1202 23:10:57.794191 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d131fd77-f36a-4c9a-8578-5b2c62e5d356-console-oauth-config\") pod \"d131fd77-f36a-4c9a-8578-5b2c62e5d356\" (UID: \"d131fd77-f36a-4c9a-8578-5b2c62e5d356\") " Dec 02 23:10:57 crc kubenswrapper[4903]: I1202 
23:10:57.794228 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d131fd77-f36a-4c9a-8578-5b2c62e5d356-oauth-serving-cert\") pod \"d131fd77-f36a-4c9a-8578-5b2c62e5d356\" (UID: \"d131fd77-f36a-4c9a-8578-5b2c62e5d356\") " Dec 02 23:10:57 crc kubenswrapper[4903]: I1202 23:10:57.794264 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d131fd77-f36a-4c9a-8578-5b2c62e5d356-trusted-ca-bundle\") pod \"d131fd77-f36a-4c9a-8578-5b2c62e5d356\" (UID: \"d131fd77-f36a-4c9a-8578-5b2c62e5d356\") " Dec 02 23:10:57 crc kubenswrapper[4903]: I1202 23:10:57.794280 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d131fd77-f36a-4c9a-8578-5b2c62e5d356-console-config\") pod \"d131fd77-f36a-4c9a-8578-5b2c62e5d356\" (UID: \"d131fd77-f36a-4c9a-8578-5b2c62e5d356\") " Dec 02 23:10:57 crc kubenswrapper[4903]: I1202 23:10:57.794295 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d131fd77-f36a-4c9a-8578-5b2c62e5d356-console-serving-cert\") pod \"d131fd77-f36a-4c9a-8578-5b2c62e5d356\" (UID: \"d131fd77-f36a-4c9a-8578-5b2c62e5d356\") " Dec 02 23:10:57 crc kubenswrapper[4903]: I1202 23:10:57.794349 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2f8sn\" (UniqueName: \"kubernetes.io/projected/d131fd77-f36a-4c9a-8578-5b2c62e5d356-kube-api-access-2f8sn\") pod \"d131fd77-f36a-4c9a-8578-5b2c62e5d356\" (UID: \"d131fd77-f36a-4c9a-8578-5b2c62e5d356\") " Dec 02 23:10:57 crc kubenswrapper[4903]: I1202 23:10:57.795363 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d131fd77-f36a-4c9a-8578-5b2c62e5d356-service-ca" (OuterVolumeSpecName: "service-ca") pod "d131fd77-f36a-4c9a-8578-5b2c62e5d356" (UID: "d131fd77-f36a-4c9a-8578-5b2c62e5d356"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:10:57 crc kubenswrapper[4903]: I1202 23:10:57.795382 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d131fd77-f36a-4c9a-8578-5b2c62e5d356-console-config" (OuterVolumeSpecName: "console-config") pod "d131fd77-f36a-4c9a-8578-5b2c62e5d356" (UID: "d131fd77-f36a-4c9a-8578-5b2c62e5d356"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:10:57 crc kubenswrapper[4903]: I1202 23:10:57.796610 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d131fd77-f36a-4c9a-8578-5b2c62e5d356-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d131fd77-f36a-4c9a-8578-5b2c62e5d356" (UID: "d131fd77-f36a-4c9a-8578-5b2c62e5d356"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:10:57 crc kubenswrapper[4903]: I1202 23:10:57.797422 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d131fd77-f36a-4c9a-8578-5b2c62e5d356-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d131fd77-f36a-4c9a-8578-5b2c62e5d356" (UID: "d131fd77-f36a-4c9a-8578-5b2c62e5d356"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:10:57 crc kubenswrapper[4903]: I1202 23:10:57.803161 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d131fd77-f36a-4c9a-8578-5b2c62e5d356-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d131fd77-f36a-4c9a-8578-5b2c62e5d356" (UID: "d131fd77-f36a-4c9a-8578-5b2c62e5d356"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:10:57 crc kubenswrapper[4903]: I1202 23:10:57.804288 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d131fd77-f36a-4c9a-8578-5b2c62e5d356-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d131fd77-f36a-4c9a-8578-5b2c62e5d356" (UID: "d131fd77-f36a-4c9a-8578-5b2c62e5d356"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:10:57 crc kubenswrapper[4903]: I1202 23:10:57.805067 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d131fd77-f36a-4c9a-8578-5b2c62e5d356-kube-api-access-2f8sn" (OuterVolumeSpecName: "kube-api-access-2f8sn") pod "d131fd77-f36a-4c9a-8578-5b2c62e5d356" (UID: "d131fd77-f36a-4c9a-8578-5b2c62e5d356"). InnerVolumeSpecName "kube-api-access-2f8sn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:10:57 crc kubenswrapper[4903]: I1202 23:10:57.898607 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2f8sn\" (UniqueName: \"kubernetes.io/projected/d131fd77-f36a-4c9a-8578-5b2c62e5d356-kube-api-access-2f8sn\") on node \"crc\" DevicePath \"\"" Dec 02 23:10:57 crc kubenswrapper[4903]: I1202 23:10:57.898643 4903 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d131fd77-f36a-4c9a-8578-5b2c62e5d356-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 23:10:57 crc kubenswrapper[4903]: I1202 23:10:57.898669 4903 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d131fd77-f36a-4c9a-8578-5b2c62e5d356-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:10:57 crc kubenswrapper[4903]: I1202 23:10:57.898677 4903 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d131fd77-f36a-4c9a-8578-5b2c62e5d356-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 23:10:57 crc kubenswrapper[4903]: I1202 23:10:57.898685 4903 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d131fd77-f36a-4c9a-8578-5b2c62e5d356-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:10:57 crc kubenswrapper[4903]: I1202 23:10:57.898695 4903 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d131fd77-f36a-4c9a-8578-5b2c62e5d356-console-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:10:57 crc kubenswrapper[4903]: I1202 23:10:57.898713 4903 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d131fd77-f36a-4c9a-8578-5b2c62e5d356-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 23:10:57 crc kubenswrapper[4903]: I1202 23:10:57.926998 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss"] Dec 
02 23:10:57 crc kubenswrapper[4903]: E1202 23:10:57.927298 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d131fd77-f36a-4c9a-8578-5b2c62e5d356" containerName="console" Dec 02 23:10:57 crc kubenswrapper[4903]: I1202 23:10:57.927318 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="d131fd77-f36a-4c9a-8578-5b2c62e5d356" containerName="console" Dec 02 23:10:57 crc kubenswrapper[4903]: I1202 23:10:57.927445 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="d131fd77-f36a-4c9a-8578-5b2c62e5d356" containerName="console" Dec 02 23:10:57 crc kubenswrapper[4903]: I1202 23:10:57.928483 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss" Dec 02 23:10:57 crc kubenswrapper[4903]: I1202 23:10:57.930540 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 02 23:10:57 crc kubenswrapper[4903]: I1202 23:10:57.936469 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss"] Dec 02 23:10:58 crc kubenswrapper[4903]: I1202 23:10:58.070275 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-tjt6x"] Dec 02 23:10:58 crc kubenswrapper[4903]: I1202 23:10:58.073588 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-tjt6x"] Dec 02 23:10:58 crc kubenswrapper[4903]: I1202 23:10:58.101311 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6fb28505-94cf-4bc3-add6-f11756acc2b6-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss\" (UID: \"6fb28505-94cf-4bc3-add6-f11756acc2b6\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss" Dec 02 23:10:58 crc kubenswrapper[4903]: I1202 23:10:58.101376 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp8k2\" (UniqueName: \"kubernetes.io/projected/6fb28505-94cf-4bc3-add6-f11756acc2b6-kube-api-access-cp8k2\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss\" (UID: \"6fb28505-94cf-4bc3-add6-f11756acc2b6\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss" Dec 02 23:10:58 crc kubenswrapper[4903]: I1202 23:10:58.101403 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6fb28505-94cf-4bc3-add6-f11756acc2b6-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss\" (UID: \"6fb28505-94cf-4bc3-add6-f11756acc2b6\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss" Dec 02 23:10:58 crc kubenswrapper[4903]: I1202 23:10:58.202775 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6fb28505-94cf-4bc3-add6-f11756acc2b6-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss\" (UID: \"6fb28505-94cf-4bc3-add6-f11756acc2b6\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss" Dec 02 23:10:58 crc kubenswrapper[4903]: I1202 23:10:58.202860 4903 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-cp8k2\" (UniqueName: \"kubernetes.io/projected/6fb28505-94cf-4bc3-add6-f11756acc2b6-kube-api-access-cp8k2\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss\" (UID: \"6fb28505-94cf-4bc3-add6-f11756acc2b6\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss" Dec 02 23:10:58 crc kubenswrapper[4903]: I1202 23:10:58.202896 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6fb28505-94cf-4bc3-add6-f11756acc2b6-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss\" (UID: \"6fb28505-94cf-4bc3-add6-f11756acc2b6\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss" Dec 02 23:10:58 crc kubenswrapper[4903]: I1202 23:10:58.203383 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6fb28505-94cf-4bc3-add6-f11756acc2b6-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss\" (UID: \"6fb28505-94cf-4bc3-add6-f11756acc2b6\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss" Dec 02 23:10:58 crc kubenswrapper[4903]: I1202 23:10:58.203466 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6fb28505-94cf-4bc3-add6-f11756acc2b6-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss\" (UID: \"6fb28505-94cf-4bc3-add6-f11756acc2b6\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss" Dec 02 23:10:58 crc kubenswrapper[4903]: I1202 23:10:58.220590 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp8k2\" (UniqueName: \"kubernetes.io/projected/6fb28505-94cf-4bc3-add6-f11756acc2b6-kube-api-access-cp8k2\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss\" (UID: \"6fb28505-94cf-4bc3-add6-f11756acc2b6\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss" Dec 02 23:10:58 crc kubenswrapper[4903]: I1202 23:10:58.282563 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss" Dec 02 23:10:58 crc kubenswrapper[4903]: I1202 23:10:58.577831 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss"] Dec 02 23:10:58 crc kubenswrapper[4903]: I1202 23:10:58.755917 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss" event={"ID":"6fb28505-94cf-4bc3-add6-f11756acc2b6","Type":"ContainerStarted","Data":"a70c781aacbf7a2ba68e1b6f082fcad9d6dcab642eeea1055313bcfa44b3b56d"} Dec 02 23:10:59 crc kubenswrapper[4903]: I1202 23:10:59.626021 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d131fd77-f36a-4c9a-8578-5b2c62e5d356" path="/var/lib/kubelet/pods/d131fd77-f36a-4c9a-8578-5b2c62e5d356/volumes" Dec 02 23:10:59 crc kubenswrapper[4903]: I1202 23:10:59.766907 4903 generic.go:334] "Generic (PLEG): container finished" podID="6fb28505-94cf-4bc3-add6-f11756acc2b6" containerID="b1ed8e206783536e10d0342d1faf35d1ea67ec6edde6f320138f9b184ef8e555" exitCode=0 Dec 02 23:10:59 crc kubenswrapper[4903]: I1202 23:10:59.766979 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss" event={"ID":"6fb28505-94cf-4bc3-add6-f11756acc2b6","Type":"ContainerDied","Data":"b1ed8e206783536e10d0342d1faf35d1ea67ec6edde6f320138f9b184ef8e555"} Dec 02 23:11:01 crc kubenswrapper[4903]: I1202 23:11:01.254080 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5grnp"] Dec 02 23:11:01 crc kubenswrapper[4903]: I1202 23:11:01.256081 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5grnp" Dec 02 23:11:01 crc kubenswrapper[4903]: I1202 23:11:01.270498 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5grnp"] Dec 02 23:11:01 crc kubenswrapper[4903]: I1202 23:11:01.446803 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjkkp\" (UniqueName: \"kubernetes.io/projected/9a294326-5ec3-43ee-97db-ae804db72fa8-kube-api-access-qjkkp\") pod \"redhat-operators-5grnp\" (UID: \"9a294326-5ec3-43ee-97db-ae804db72fa8\") " pod="openshift-marketplace/redhat-operators-5grnp" Dec 02 23:11:01 crc kubenswrapper[4903]: I1202 23:11:01.446952 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a294326-5ec3-43ee-97db-ae804db72fa8-catalog-content\") pod \"redhat-operators-5grnp\" (UID: \"9a294326-5ec3-43ee-97db-ae804db72fa8\") " pod="openshift-marketplace/redhat-operators-5grnp" Dec 02 23:11:01 crc kubenswrapper[4903]: I1202 23:11:01.446993 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a294326-5ec3-43ee-97db-ae804db72fa8-utilities\") pod \"redhat-operators-5grnp\" (UID: \"9a294326-5ec3-43ee-97db-ae804db72fa8\") " pod="openshift-marketplace/redhat-operators-5grnp" Dec 02 23:11:01 crc kubenswrapper[4903]: I1202 23:11:01.548430 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjkkp\" (UniqueName: \"kubernetes.io/projected/9a294326-5ec3-43ee-97db-ae804db72fa8-kube-api-access-qjkkp\") pod \"redhat-operators-5grnp\" (UID: \"9a294326-5ec3-43ee-97db-ae804db72fa8\") " pod="openshift-marketplace/redhat-operators-5grnp" Dec 02 23:11:01 crc kubenswrapper[4903]: I1202 23:11:01.548517 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a294326-5ec3-43ee-97db-ae804db72fa8-catalog-content\") pod \"redhat-operators-5grnp\" (UID: \"9a294326-5ec3-43ee-97db-ae804db72fa8\") " pod="openshift-marketplace/redhat-operators-5grnp" Dec 02 23:11:01 crc kubenswrapper[4903]: I1202 23:11:01.548540 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a294326-5ec3-43ee-97db-ae804db72fa8-utilities\") pod \"redhat-operators-5grnp\" (UID: \"9a294326-5ec3-43ee-97db-ae804db72fa8\") " pod="openshift-marketplace/redhat-operators-5grnp" Dec 02 23:11:01 crc kubenswrapper[4903]: I1202 23:11:01.549076 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a294326-5ec3-43ee-97db-ae804db72fa8-utilities\") pod \"redhat-operators-5grnp\" (UID: \"9a294326-5ec3-43ee-97db-ae804db72fa8\") " pod="openshift-marketplace/redhat-operators-5grnp" Dec 02 23:11:01 crc kubenswrapper[4903]: I1202 23:11:01.549319 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a294326-5ec3-43ee-97db-ae804db72fa8-catalog-content\") pod \"redhat-operators-5grnp\" (UID: \"9a294326-5ec3-43ee-97db-ae804db72fa8\") " pod="openshift-marketplace/redhat-operators-5grnp" Dec 02 23:11:01 crc kubenswrapper[4903]: I1202 23:11:01.572828 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qjkkp\" (UniqueName: \"kubernetes.io/projected/9a294326-5ec3-43ee-97db-ae804db72fa8-kube-api-access-qjkkp\") pod \"redhat-operators-5grnp\" (UID: \"9a294326-5ec3-43ee-97db-ae804db72fa8\") " pod="openshift-marketplace/redhat-operators-5grnp" Dec 02 23:11:01 crc kubenswrapper[4903]: I1202 23:11:01.586467 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5grnp" Dec 02 23:11:01 crc kubenswrapper[4903]: I1202 23:11:01.782714 4903 generic.go:334] "Generic (PLEG): container finished" podID="6fb28505-94cf-4bc3-add6-f11756acc2b6" containerID="d99f18b88849ee7cc7ead7900257a06ad4a9518d371a80a9848646af5ebc38c9" exitCode=0 Dec 02 23:11:01 crc kubenswrapper[4903]: I1202 23:11:01.782822 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss" event={"ID":"6fb28505-94cf-4bc3-add6-f11756acc2b6","Type":"ContainerDied","Data":"d99f18b88849ee7cc7ead7900257a06ad4a9518d371a80a9848646af5ebc38c9"} Dec 02 23:11:02 crc kubenswrapper[4903]: I1202 23:11:02.067166 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5grnp"] Dec 02 23:11:02 crc kubenswrapper[4903]: I1202 23:11:02.792429 4903 generic.go:334] "Generic (PLEG): container finished" podID="6fb28505-94cf-4bc3-add6-f11756acc2b6" containerID="d5678ec5354955de07f0bbb410c5b36c4ce37f60119c64df26d347914d236b0e" exitCode=0 Dec 02 23:11:02 crc kubenswrapper[4903]: I1202 23:11:02.792532 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss" event={"ID":"6fb28505-94cf-4bc3-add6-f11756acc2b6","Type":"ContainerDied","Data":"d5678ec5354955de07f0bbb410c5b36c4ce37f60119c64df26d347914d236b0e"} Dec 02 23:11:02 crc kubenswrapper[4903]: I1202 23:11:02.798163 4903 generic.go:334] "Generic (PLEG): container finished" podID="9a294326-5ec3-43ee-97db-ae804db72fa8" containerID="f5ab0787451f192120773f34e3badee061f8dee00e2fc0cc92ab2ae30d375d86" exitCode=0 Dec 02 23:11:02 crc kubenswrapper[4903]: I1202 23:11:02.798219 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5grnp" event={"ID":"9a294326-5ec3-43ee-97db-ae804db72fa8","Type":"ContainerDied","Data":"f5ab0787451f192120773f34e3badee061f8dee00e2fc0cc92ab2ae30d375d86"} Dec 02 23:11:02 crc kubenswrapper[4903]: I1202 23:11:02.798251 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5grnp" event={"ID":"9a294326-5ec3-43ee-97db-ae804db72fa8","Type":"ContainerStarted","Data":"6f7bd129079a620b525fbd37544f585c8e67918addd21b2fff2fefcfaf5ddd3e"} Dec 02 23:11:03 crc kubenswrapper[4903]: I1202 23:11:03.807904 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5grnp" event={"ID":"9a294326-5ec3-43ee-97db-ae804db72fa8","Type":"ContainerStarted","Data":"28276fe3aa816b7333ace2a76af5f6c7c67cd536ce25b3fe5c52b2cdfbbf3ac0"} Dec 02 23:11:04 crc kubenswrapper[4903]: I1202 23:11:04.141527 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss" Dec 02 23:11:04 crc kubenswrapper[4903]: I1202 23:11:04.299078 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cp8k2\" (UniqueName: \"kubernetes.io/projected/6fb28505-94cf-4bc3-add6-f11756acc2b6-kube-api-access-cp8k2\") pod \"6fb28505-94cf-4bc3-add6-f11756acc2b6\" (UID: \"6fb28505-94cf-4bc3-add6-f11756acc2b6\") " Dec 02 23:11:04 crc kubenswrapper[4903]: I1202 23:11:04.299136 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6fb28505-94cf-4bc3-add6-f11756acc2b6-bundle\") pod \"6fb28505-94cf-4bc3-add6-f11756acc2b6\" (UID: \"6fb28505-94cf-4bc3-add6-f11756acc2b6\") " Dec 02 23:11:04 crc kubenswrapper[4903]: I1202 23:11:04.299191 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6fb28505-94cf-4bc3-add6-f11756acc2b6-util\") pod \"6fb28505-94cf-4bc3-add6-f11756acc2b6\" (UID: \"6fb28505-94cf-4bc3-add6-f11756acc2b6\") " Dec 02 23:11:04 crc kubenswrapper[4903]: I1202 23:11:04.304413 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fb28505-94cf-4bc3-add6-f11756acc2b6-bundle" (OuterVolumeSpecName: "bundle") pod "6fb28505-94cf-4bc3-add6-f11756acc2b6" (UID: "6fb28505-94cf-4bc3-add6-f11756acc2b6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:11:04 crc kubenswrapper[4903]: I1202 23:11:04.310172 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fb28505-94cf-4bc3-add6-f11756acc2b6-kube-api-access-cp8k2" (OuterVolumeSpecName: "kube-api-access-cp8k2") pod "6fb28505-94cf-4bc3-add6-f11756acc2b6" (UID: "6fb28505-94cf-4bc3-add6-f11756acc2b6"). InnerVolumeSpecName "kube-api-access-cp8k2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:11:04 crc kubenswrapper[4903]: I1202 23:11:04.401714 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cp8k2\" (UniqueName: \"kubernetes.io/projected/6fb28505-94cf-4bc3-add6-f11756acc2b6-kube-api-access-cp8k2\") on node \"crc\" DevicePath \"\"" Dec 02 23:11:04 crc kubenswrapper[4903]: I1202 23:11:04.401777 4903 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6fb28505-94cf-4bc3-add6-f11756acc2b6-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:11:04 crc kubenswrapper[4903]: I1202 23:11:04.561908 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fb28505-94cf-4bc3-add6-f11756acc2b6-util" (OuterVolumeSpecName: "util") pod "6fb28505-94cf-4bc3-add6-f11756acc2b6" (UID: "6fb28505-94cf-4bc3-add6-f11756acc2b6"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:11:04 crc kubenswrapper[4903]: I1202 23:11:04.604623 4903 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6fb28505-94cf-4bc3-add6-f11756acc2b6-util\") on node \"crc\" DevicePath \"\"" Dec 02 23:11:04 crc kubenswrapper[4903]: I1202 23:11:04.821148 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss" event={"ID":"6fb28505-94cf-4bc3-add6-f11756acc2b6","Type":"ContainerDied","Data":"a70c781aacbf7a2ba68e1b6f082fcad9d6dcab642eeea1055313bcfa44b3b56d"} Dec 02 23:11:04 crc kubenswrapper[4903]: I1202 23:11:04.821509 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a70c781aacbf7a2ba68e1b6f082fcad9d6dcab642eeea1055313bcfa44b3b56d" Dec 02 23:11:04 crc kubenswrapper[4903]: I1202 23:11:04.821186 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss" Dec 02 23:11:04 crc kubenswrapper[4903]: I1202 23:11:04.824156 4903 generic.go:334] "Generic (PLEG): container finished" podID="9a294326-5ec3-43ee-97db-ae804db72fa8" containerID="28276fe3aa816b7333ace2a76af5f6c7c67cd536ce25b3fe5c52b2cdfbbf3ac0" exitCode=0 Dec 02 23:11:04 crc kubenswrapper[4903]: I1202 23:11:04.824220 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5grnp" event={"ID":"9a294326-5ec3-43ee-97db-ae804db72fa8","Type":"ContainerDied","Data":"28276fe3aa816b7333ace2a76af5f6c7c67cd536ce25b3fe5c52b2cdfbbf3ac0"} Dec 02 23:11:05 crc kubenswrapper[4903]: I1202 23:11:05.838073 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5grnp" event={"ID":"9a294326-5ec3-43ee-97db-ae804db72fa8","Type":"ContainerStarted","Data":"10e45bd82fad6c0831372c4fc782ad45f66d640fe1c8eac562583e74be03193b"} Dec 02 23:11:11 crc kubenswrapper[4903]: I1202 23:11:11.587027 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5grnp" Dec 02 23:11:11 crc kubenswrapper[4903]: I1202 23:11:11.587101 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5grnp" Dec 02 23:11:12 crc kubenswrapper[4903]: I1202 23:11:12.667608 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5grnp" podUID="9a294326-5ec3-43ee-97db-ae804db72fa8" containerName="registry-server" probeResult="failure" output=< Dec 02 23:11:12 crc kubenswrapper[4903]: timeout: failed to connect service ":50051" within 1s Dec 02 23:11:12 crc kubenswrapper[4903]: > Dec 02 23:11:14 crc kubenswrapper[4903]: I1202 23:11:14.996202 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5grnp" podStartSLOduration=11.583883574 podStartE2EDuration="13.996180218s" podCreationTimestamp="2025-12-02 23:11:01 +0000 UTC" firstStartedPulling="2025-12-02 23:11:02.801822714 +0000 UTC m=+801.510377007" lastFinishedPulling="2025-12-02 23:11:05.214119328 +0000 UTC m=+803.922673651" observedRunningTime="2025-12-02 23:11:05.866508999 +0000 UTC m=+804.575063392" watchObservedRunningTime="2025-12-02 23:11:14.996180218 +0000 UTC m=+813.704734501" Dec 02 23:11:14 crc kubenswrapper[4903]: I1202 23:11:14.996738 4903 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["metallb-system/metallb-operator-controller-manager-7b49745895-c8xsg"] Dec 02 23:11:14 crc kubenswrapper[4903]: E1202 23:11:14.996969 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fb28505-94cf-4bc3-add6-f11756acc2b6" containerName="util" Dec 02 23:11:14 crc kubenswrapper[4903]: I1202 23:11:14.996987 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fb28505-94cf-4bc3-add6-f11756acc2b6" containerName="util" Dec 02 23:11:14 crc kubenswrapper[4903]: E1202 23:11:14.996996 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fb28505-94cf-4bc3-add6-f11756acc2b6" containerName="extract" Dec 02 23:11:14 crc kubenswrapper[4903]: I1202 23:11:14.997002 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fb28505-94cf-4bc3-add6-f11756acc2b6" containerName="extract" Dec 02 23:11:14 crc kubenswrapper[4903]: E1202 23:11:14.997031 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fb28505-94cf-4bc3-add6-f11756acc2b6" containerName="pull" Dec 02 23:11:14 crc kubenswrapper[4903]: I1202 23:11:14.997039 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fb28505-94cf-4bc3-add6-f11756acc2b6" containerName="pull" Dec 02 23:11:14 crc kubenswrapper[4903]: I1202 23:11:14.997152 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fb28505-94cf-4bc3-add6-f11756acc2b6" containerName="extract" Dec 02 23:11:14 crc kubenswrapper[4903]: I1202 23:11:14.997585 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7b49745895-c8xsg" Dec 02 23:11:15 crc kubenswrapper[4903]: I1202 23:11:14.999963 4903 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 02 23:11:15 crc kubenswrapper[4903]: I1202 23:11:15.000134 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 02 23:11:15 crc kubenswrapper[4903]: I1202 23:11:15.000442 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 02 23:11:15 crc kubenswrapper[4903]: I1202 23:11:15.000565 4903 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 02 23:11:15 crc kubenswrapper[4903]: I1202 23:11:15.000766 4903 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-g2jmj" Dec 02 23:11:15 crc kubenswrapper[4903]: I1202 23:11:15.006424 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7b49745895-c8xsg"] Dec 02 23:11:15 crc kubenswrapper[4903]: I1202 23:11:15.157752 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1548db98-cd62-4c58-88ac-4f4de9512edb-webhook-cert\") pod \"metallb-operator-controller-manager-7b49745895-c8xsg\" (UID: \"1548db98-cd62-4c58-88ac-4f4de9512edb\") " pod="metallb-system/metallb-operator-controller-manager-7b49745895-c8xsg" Dec 02 23:11:15 crc kubenswrapper[4903]: I1202 23:11:15.158083 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpgtv\" (UniqueName: \"kubernetes.io/projected/1548db98-cd62-4c58-88ac-4f4de9512edb-kube-api-access-rpgtv\") pod \"metallb-operator-controller-manager-7b49745895-c8xsg\" (UID: 
\"1548db98-cd62-4c58-88ac-4f4de9512edb\") " pod="metallb-system/metallb-operator-controller-manager-7b49745895-c8xsg" Dec 02 23:11:15 crc kubenswrapper[4903]: I1202 23:11:15.158316 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1548db98-cd62-4c58-88ac-4f4de9512edb-apiservice-cert\") pod \"metallb-operator-controller-manager-7b49745895-c8xsg\" (UID: \"1548db98-cd62-4c58-88ac-4f4de9512edb\") " pod="metallb-system/metallb-operator-controller-manager-7b49745895-c8xsg" Dec 02 23:11:15 crc kubenswrapper[4903]: I1202 23:11:15.243321 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-64ddb78498-frglc"] Dec 02 23:11:15 crc kubenswrapper[4903]: I1202 23:11:15.244246 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-64ddb78498-frglc" Dec 02 23:11:15 crc kubenswrapper[4903]: I1202 23:11:15.246098 4903 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 02 23:11:15 crc kubenswrapper[4903]: I1202 23:11:15.246330 4903 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-88bl8" Dec 02 23:11:15 crc kubenswrapper[4903]: I1202 23:11:15.248107 4903 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 02 23:11:15 crc kubenswrapper[4903]: I1202 23:11:15.259315 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1548db98-cd62-4c58-88ac-4f4de9512edb-apiservice-cert\") pod \"metallb-operator-controller-manager-7b49745895-c8xsg\" (UID: \"1548db98-cd62-4c58-88ac-4f4de9512edb\") " pod="metallb-system/metallb-operator-controller-manager-7b49745895-c8xsg" Dec 02 23:11:15 crc kubenswrapper[4903]: I1202 23:11:15.260152 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1548db98-cd62-4c58-88ac-4f4de9512edb-webhook-cert\") pod \"metallb-operator-controller-manager-7b49745895-c8xsg\" (UID: \"1548db98-cd62-4c58-88ac-4f4de9512edb\") " pod="metallb-system/metallb-operator-controller-manager-7b49745895-c8xsg" Dec 02 23:11:15 crc kubenswrapper[4903]: I1202 23:11:15.260186 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpgtv\" (UniqueName: \"kubernetes.io/projected/1548db98-cd62-4c58-88ac-4f4de9512edb-kube-api-access-rpgtv\") pod \"metallb-operator-controller-manager-7b49745895-c8xsg\" (UID: \"1548db98-cd62-4c58-88ac-4f4de9512edb\") " pod="metallb-system/metallb-operator-controller-manager-7b49745895-c8xsg" Dec 02 23:11:15 crc kubenswrapper[4903]: I1202 23:11:15.263274 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-64ddb78498-frglc"] Dec 02 23:11:15 crc kubenswrapper[4903]: I1202 23:11:15.265054 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1548db98-cd62-4c58-88ac-4f4de9512edb-webhook-cert\") pod \"metallb-operator-controller-manager-7b49745895-c8xsg\" (UID: \"1548db98-cd62-4c58-88ac-4f4de9512edb\") " pod="metallb-system/metallb-operator-controller-manager-7b49745895-c8xsg" Dec 02 23:11:15 crc kubenswrapper[4903]: I1202 23:11:15.265184 4903 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1548db98-cd62-4c58-88ac-4f4de9512edb-apiservice-cert\") pod \"metallb-operator-controller-manager-7b49745895-c8xsg\" (UID: \"1548db98-cd62-4c58-88ac-4f4de9512edb\") " pod="metallb-system/metallb-operator-controller-manager-7b49745895-c8xsg" Dec 02 23:11:15 crc kubenswrapper[4903]: I1202 23:11:15.318608 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpgtv\" (UniqueName: \"kubernetes.io/projected/1548db98-cd62-4c58-88ac-4f4de9512edb-kube-api-access-rpgtv\") pod \"metallb-operator-controller-manager-7b49745895-c8xsg\" (UID: \"1548db98-cd62-4c58-88ac-4f4de9512edb\") " pod="metallb-system/metallb-operator-controller-manager-7b49745895-c8xsg" Dec 02 23:11:15 crc kubenswrapper[4903]: I1202 23:11:15.361613 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2f9aa142-f989-4748-946c-7629a225d6a4-apiservice-cert\") pod \"metallb-operator-webhook-server-64ddb78498-frglc\" (UID: \"2f9aa142-f989-4748-946c-7629a225d6a4\") " pod="metallb-system/metallb-operator-webhook-server-64ddb78498-frglc" Dec 02 23:11:15 crc kubenswrapper[4903]: I1202 23:11:15.361673 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pjs7\" (UniqueName: \"kubernetes.io/projected/2f9aa142-f989-4748-946c-7629a225d6a4-kube-api-access-7pjs7\") pod \"metallb-operator-webhook-server-64ddb78498-frglc\" (UID: \"2f9aa142-f989-4748-946c-7629a225d6a4\") " pod="metallb-system/metallb-operator-webhook-server-64ddb78498-frglc" Dec 02 23:11:15 crc kubenswrapper[4903]: I1202 23:11:15.361701 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2f9aa142-f989-4748-946c-7629a225d6a4-webhook-cert\") pod \"metallb-operator-webhook-server-64ddb78498-frglc\" (UID: \"2f9aa142-f989-4748-946c-7629a225d6a4\") " pod="metallb-system/metallb-operator-webhook-server-64ddb78498-frglc" Dec 02 23:11:15 crc kubenswrapper[4903]: I1202 23:11:15.462716 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2f9aa142-f989-4748-946c-7629a225d6a4-webhook-cert\") pod \"metallb-operator-webhook-server-64ddb78498-frglc\" (UID: \"2f9aa142-f989-4748-946c-7629a225d6a4\") " pod="metallb-system/metallb-operator-webhook-server-64ddb78498-frglc" Dec 02 23:11:15 crc kubenswrapper[4903]: I1202 23:11:15.462910 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2f9aa142-f989-4748-946c-7629a225d6a4-apiservice-cert\") pod \"metallb-operator-webhook-server-64ddb78498-frglc\" (UID: \"2f9aa142-f989-4748-946c-7629a225d6a4\") " pod="metallb-system/metallb-operator-webhook-server-64ddb78498-frglc" Dec 02 23:11:15 crc kubenswrapper[4903]: I1202 23:11:15.462959 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pjs7\" (UniqueName: \"kubernetes.io/projected/2f9aa142-f989-4748-946c-7629a225d6a4-kube-api-access-7pjs7\") pod \"metallb-operator-webhook-server-64ddb78498-frglc\" (UID: \"2f9aa142-f989-4748-946c-7629a225d6a4\") " pod="metallb-system/metallb-operator-webhook-server-64ddb78498-frglc" Dec 02 23:11:15 crc kubenswrapper[4903]: I1202 23:11:15.467000 4903 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2f9aa142-f989-4748-946c-7629a225d6a4-webhook-cert\") pod \"metallb-operator-webhook-server-64ddb78498-frglc\" (UID: \"2f9aa142-f989-4748-946c-7629a225d6a4\") " pod="metallb-system/metallb-operator-webhook-server-64ddb78498-frglc" Dec 02 23:11:15 crc kubenswrapper[4903]: I1202 23:11:15.467345 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2f9aa142-f989-4748-946c-7629a225d6a4-apiservice-cert\") pod \"metallb-operator-webhook-server-64ddb78498-frglc\" (UID: \"2f9aa142-f989-4748-946c-7629a225d6a4\") " pod="metallb-system/metallb-operator-webhook-server-64ddb78498-frglc" Dec 02 23:11:15 crc kubenswrapper[4903]: I1202 23:11:15.489451 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pjs7\" (UniqueName: \"kubernetes.io/projected/2f9aa142-f989-4748-946c-7629a225d6a4-kube-api-access-7pjs7\") pod \"metallb-operator-webhook-server-64ddb78498-frglc\" (UID: \"2f9aa142-f989-4748-946c-7629a225d6a4\") " pod="metallb-system/metallb-operator-webhook-server-64ddb78498-frglc" Dec 02 23:11:15 crc kubenswrapper[4903]: I1202 23:11:15.597882 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-64ddb78498-frglc" Dec 02 23:11:15 crc kubenswrapper[4903]: I1202 23:11:15.616163 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7b49745895-c8xsg" Dec 02 23:11:16 crc kubenswrapper[4903]: I1202 23:11:16.046848 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-64ddb78498-frglc"] Dec 02 23:11:16 crc kubenswrapper[4903]: W1202 23:11:16.051703 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f9aa142_f989_4748_946c_7629a225d6a4.slice/crio-f231881cd4ff3fb6b102b74bb751d7152fd2e36667bccbbd88f5d83f3daae8f9 WatchSource:0}: Error finding container f231881cd4ff3fb6b102b74bb751d7152fd2e36667bccbbd88f5d83f3daae8f9: Status 404 returned error can't find the container with id f231881cd4ff3fb6b102b74bb751d7152fd2e36667bccbbd88f5d83f3daae8f9 Dec 02 23:11:16 crc kubenswrapper[4903]: I1202 23:11:16.053270 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7b49745895-c8xsg"] Dec 02 23:11:16 crc kubenswrapper[4903]: W1202 23:11:16.055025 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1548db98_cd62_4c58_88ac_4f4de9512edb.slice/crio-b98bd93bf8d5ab9b3312474e028c47659bbffd58da6304e9dd96105d2edc445b WatchSource:0}: Error finding container b98bd93bf8d5ab9b3312474e028c47659bbffd58da6304e9dd96105d2edc445b: Status 404 returned error can't find the container with id b98bd93bf8d5ab9b3312474e028c47659bbffd58da6304e9dd96105d2edc445b Dec 02 23:11:16 crc kubenswrapper[4903]: I1202 23:11:16.902218 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7b49745895-c8xsg" event={"ID":"1548db98-cd62-4c58-88ac-4f4de9512edb","Type":"ContainerStarted","Data":"b98bd93bf8d5ab9b3312474e028c47659bbffd58da6304e9dd96105d2edc445b"} Dec 02 23:11:16 crc kubenswrapper[4903]: I1202 23:11:16.903306 4903 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="metallb-system/metallb-operator-webhook-server-64ddb78498-frglc" event={"ID":"2f9aa142-f989-4748-946c-7629a225d6a4","Type":"ContainerStarted","Data":"f231881cd4ff3fb6b102b74bb751d7152fd2e36667bccbbd88f5d83f3daae8f9"} Dec 02 23:11:21 crc kubenswrapper[4903]: I1202 23:11:21.628001 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5grnp" Dec 02 23:11:21 crc kubenswrapper[4903]: I1202 23:11:21.687860 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5grnp" Dec 02 23:11:21 crc kubenswrapper[4903]: I1202 23:11:21.730524 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5grnp"] Dec 02 23:11:21 crc kubenswrapper[4903]: I1202 23:11:21.933310 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7b49745895-c8xsg" event={"ID":"1548db98-cd62-4c58-88ac-4f4de9512edb","Type":"ContainerStarted","Data":"ce60dba637c2097d948ee056d80f8790d0f132aaa847a2675e7073dfbeaf8aa4"} Dec 02 23:11:21 crc kubenswrapper[4903]: I1202 23:11:21.933841 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7b49745895-c8xsg" Dec 02 23:11:21 crc kubenswrapper[4903]: I1202 23:11:21.960634 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7b49745895-c8xsg" podStartSLOduration=2.265973613 podStartE2EDuration="7.960617145s" podCreationTimestamp="2025-12-02 23:11:14 +0000 UTC" firstStartedPulling="2025-12-02 23:11:16.059882375 +0000 UTC m=+814.768436658" lastFinishedPulling="2025-12-02 23:11:21.754525907 +0000 UTC m=+820.463080190" observedRunningTime="2025-12-02 23:11:21.953026819 +0000 UTC m=+820.661581112" watchObservedRunningTime="2025-12-02 23:11:21.960617145 +0000 UTC m=+820.669171438" Dec 02 23:11:22 crc kubenswrapper[4903]: I1202 23:11:22.946822 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-64ddb78498-frglc" event={"ID":"2f9aa142-f989-4748-946c-7629a225d6a4","Type":"ContainerStarted","Data":"c3702f1fe6dfb3aee6788fc2225c76008efbe9bcd6ca8b0b4e6df3c2fd53b5ff"} Dec 02 23:11:22 crc kubenswrapper[4903]: I1202 23:11:22.948201 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5grnp" podUID="9a294326-5ec3-43ee-97db-ae804db72fa8" containerName="registry-server" containerID="cri-o://10e45bd82fad6c0831372c4fc782ad45f66d640fe1c8eac562583e74be03193b" gracePeriod=2 Dec 02 23:11:22 crc kubenswrapper[4903]: I1202 23:11:22.983096 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-64ddb78498-frglc" podStartSLOduration=2.266884632 podStartE2EDuration="7.983075882s" podCreationTimestamp="2025-12-02 23:11:15 +0000 UTC" firstStartedPulling="2025-12-02 23:11:16.05560483 +0000 UTC m=+814.764159113" lastFinishedPulling="2025-12-02 23:11:21.77179608 +0000 UTC m=+820.480350363" observedRunningTime="2025-12-02 23:11:22.977838774 +0000 UTC m=+821.686393067" watchObservedRunningTime="2025-12-02 23:11:22.983075882 +0000 UTC m=+821.691630165" Dec 02 23:11:23 crc kubenswrapper[4903]: I1202 23:11:23.372701 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5grnp" Dec 02 23:11:23 crc kubenswrapper[4903]: I1202 23:11:23.476639 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjkkp\" (UniqueName: \"kubernetes.io/projected/9a294326-5ec3-43ee-97db-ae804db72fa8-kube-api-access-qjkkp\") pod \"9a294326-5ec3-43ee-97db-ae804db72fa8\" (UID: \"9a294326-5ec3-43ee-97db-ae804db72fa8\") " Dec 02 23:11:23 crc kubenswrapper[4903]: I1202 23:11:23.477786 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a294326-5ec3-43ee-97db-ae804db72fa8-catalog-content\") pod \"9a294326-5ec3-43ee-97db-ae804db72fa8\" (UID: \"9a294326-5ec3-43ee-97db-ae804db72fa8\") " Dec 02 23:11:23 crc kubenswrapper[4903]: I1202 23:11:23.477847 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a294326-5ec3-43ee-97db-ae804db72fa8-utilities\") pod \"9a294326-5ec3-43ee-97db-ae804db72fa8\" (UID: \"9a294326-5ec3-43ee-97db-ae804db72fa8\") " Dec 02 23:11:23 crc kubenswrapper[4903]: I1202 23:11:23.479195 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a294326-5ec3-43ee-97db-ae804db72fa8-utilities" (OuterVolumeSpecName: "utilities") pod "9a294326-5ec3-43ee-97db-ae804db72fa8" (UID: "9a294326-5ec3-43ee-97db-ae804db72fa8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:11:23 crc kubenswrapper[4903]: I1202 23:11:23.497866 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a294326-5ec3-43ee-97db-ae804db72fa8-kube-api-access-qjkkp" (OuterVolumeSpecName: "kube-api-access-qjkkp") pod "9a294326-5ec3-43ee-97db-ae804db72fa8" (UID: "9a294326-5ec3-43ee-97db-ae804db72fa8"). InnerVolumeSpecName "kube-api-access-qjkkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:11:23 crc kubenswrapper[4903]: I1202 23:11:23.579991 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a294326-5ec3-43ee-97db-ae804db72fa8-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:11:23 crc kubenswrapper[4903]: I1202 23:11:23.580035 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjkkp\" (UniqueName: \"kubernetes.io/projected/9a294326-5ec3-43ee-97db-ae804db72fa8-kube-api-access-qjkkp\") on node \"crc\" DevicePath \"\"" Dec 02 23:11:23 crc kubenswrapper[4903]: I1202 23:11:23.582207 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a294326-5ec3-43ee-97db-ae804db72fa8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9a294326-5ec3-43ee-97db-ae804db72fa8" (UID: "9a294326-5ec3-43ee-97db-ae804db72fa8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:11:23 crc kubenswrapper[4903]: I1202 23:11:23.681861 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a294326-5ec3-43ee-97db-ae804db72fa8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:11:23 crc kubenswrapper[4903]: I1202 23:11:23.955752 4903 generic.go:334] "Generic (PLEG): container finished" podID="9a294326-5ec3-43ee-97db-ae804db72fa8" containerID="10e45bd82fad6c0831372c4fc782ad45f66d640fe1c8eac562583e74be03193b" exitCode=0 Dec 02 23:11:23 crc kubenswrapper[4903]: I1202 23:11:23.955856 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5grnp" Dec 02 23:11:23 crc kubenswrapper[4903]: I1202 23:11:23.955860 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5grnp" event={"ID":"9a294326-5ec3-43ee-97db-ae804db72fa8","Type":"ContainerDied","Data":"10e45bd82fad6c0831372c4fc782ad45f66d640fe1c8eac562583e74be03193b"} Dec 02 23:11:23 crc kubenswrapper[4903]: I1202 23:11:23.955937 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5grnp" event={"ID":"9a294326-5ec3-43ee-97db-ae804db72fa8","Type":"ContainerDied","Data":"6f7bd129079a620b525fbd37544f585c8e67918addd21b2fff2fefcfaf5ddd3e"} Dec 02 23:11:23 crc kubenswrapper[4903]: I1202 23:11:23.955974 4903 scope.go:117] "RemoveContainer" containerID="10e45bd82fad6c0831372c4fc782ad45f66d640fe1c8eac562583e74be03193b" Dec 02 23:11:23 crc kubenswrapper[4903]: I1202 23:11:23.955982 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-64ddb78498-frglc" Dec 02 23:11:23 crc kubenswrapper[4903]: I1202 23:11:23.979855 4903 scope.go:117] "RemoveContainer" containerID="28276fe3aa816b7333ace2a76af5f6c7c67cd536ce25b3fe5c52b2cdfbbf3ac0" Dec 02 23:11:23 crc kubenswrapper[4903]: I1202 23:11:23.979876 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5grnp"] Dec 02 23:11:23 crc kubenswrapper[4903]: I1202 23:11:23.987677 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5grnp"] Dec 02 23:11:24 crc kubenswrapper[4903]: I1202 23:11:24.005299 4903 scope.go:117] "RemoveContainer" containerID="f5ab0787451f192120773f34e3badee061f8dee00e2fc0cc92ab2ae30d375d86" Dec 02 23:11:24 crc kubenswrapper[4903]: I1202 23:11:24.021740 4903 scope.go:117] "RemoveContainer" containerID="10e45bd82fad6c0831372c4fc782ad45f66d640fe1c8eac562583e74be03193b" Dec 02 23:11:24 crc kubenswrapper[4903]: E1202 23:11:24.022181 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10e45bd82fad6c0831372c4fc782ad45f66d640fe1c8eac562583e74be03193b\": container with ID starting with 10e45bd82fad6c0831372c4fc782ad45f66d640fe1c8eac562583e74be03193b not found: ID does not exist" containerID="10e45bd82fad6c0831372c4fc782ad45f66d640fe1c8eac562583e74be03193b" Dec 02 23:11:24 crc kubenswrapper[4903]: I1202 23:11:24.022216 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10e45bd82fad6c0831372c4fc782ad45f66d640fe1c8eac562583e74be03193b"} err="failed to get container status \"10e45bd82fad6c0831372c4fc782ad45f66d640fe1c8eac562583e74be03193b\": rpc error: code = NotFound desc = could not find container 
\"10e45bd82fad6c0831372c4fc782ad45f66d640fe1c8eac562583e74be03193b\": container with ID starting with 10e45bd82fad6c0831372c4fc782ad45f66d640fe1c8eac562583e74be03193b not found: ID does not exist" Dec 02 23:11:24 crc kubenswrapper[4903]: I1202 23:11:24.022243 4903 scope.go:117] "RemoveContainer" containerID="28276fe3aa816b7333ace2a76af5f6c7c67cd536ce25b3fe5c52b2cdfbbf3ac0" Dec 02 23:11:24 crc kubenswrapper[4903]: E1202 23:11:24.022643 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28276fe3aa816b7333ace2a76af5f6c7c67cd536ce25b3fe5c52b2cdfbbf3ac0\": container with ID starting with 28276fe3aa816b7333ace2a76af5f6c7c67cd536ce25b3fe5c52b2cdfbbf3ac0 not found: ID does not exist" containerID="28276fe3aa816b7333ace2a76af5f6c7c67cd536ce25b3fe5c52b2cdfbbf3ac0" Dec 02 23:11:24 crc kubenswrapper[4903]: I1202 23:11:24.022739 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28276fe3aa816b7333ace2a76af5f6c7c67cd536ce25b3fe5c52b2cdfbbf3ac0"} err="failed to get container status \"28276fe3aa816b7333ace2a76af5f6c7c67cd536ce25b3fe5c52b2cdfbbf3ac0\": rpc error: code = NotFound desc = could not find container \"28276fe3aa816b7333ace2a76af5f6c7c67cd536ce25b3fe5c52b2cdfbbf3ac0\": container with ID starting with 28276fe3aa816b7333ace2a76af5f6c7c67cd536ce25b3fe5c52b2cdfbbf3ac0 not found: ID does not exist" Dec 02 23:11:24 crc kubenswrapper[4903]: I1202 23:11:24.022772 4903 scope.go:117] "RemoveContainer" containerID="f5ab0787451f192120773f34e3badee061f8dee00e2fc0cc92ab2ae30d375d86" Dec 02 23:11:24 crc kubenswrapper[4903]: E1202 23:11:24.023146 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5ab0787451f192120773f34e3badee061f8dee00e2fc0cc92ab2ae30d375d86\": container with ID starting with f5ab0787451f192120773f34e3badee061f8dee00e2fc0cc92ab2ae30d375d86 not found: ID does not exist" containerID="f5ab0787451f192120773f34e3badee061f8dee00e2fc0cc92ab2ae30d375d86" Dec 02 23:11:24 crc kubenswrapper[4903]: I1202 23:11:24.023177 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5ab0787451f192120773f34e3badee061f8dee00e2fc0cc92ab2ae30d375d86"} err="failed to get container status \"f5ab0787451f192120773f34e3badee061f8dee00e2fc0cc92ab2ae30d375d86\": rpc error: code = NotFound desc = could not find container \"f5ab0787451f192120773f34e3badee061f8dee00e2fc0cc92ab2ae30d375d86\": container with ID starting with f5ab0787451f192120773f34e3badee061f8dee00e2fc0cc92ab2ae30d375d86 not found: ID does not exist" Dec 02 23:11:25 crc kubenswrapper[4903]: I1202 23:11:25.629482 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a294326-5ec3-43ee-97db-ae804db72fa8" path="/var/lib/kubelet/pods/9a294326-5ec3-43ee-97db-ae804db72fa8/volumes" Dec 02 23:11:35 crc kubenswrapper[4903]: I1202 23:11:35.607093 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-64ddb78498-frglc" Dec 02 23:11:55 crc kubenswrapper[4903]: I1202 23:11:55.621120 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7b49745895-c8xsg" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.529267 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-6zx67"] Dec 02 23:11:56 crc 
kubenswrapper[4903]: E1202 23:11:56.529931 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a294326-5ec3-43ee-97db-ae804db72fa8" containerName="registry-server" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.530074 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a294326-5ec3-43ee-97db-ae804db72fa8" containerName="registry-server" Dec 02 23:11:56 crc kubenswrapper[4903]: E1202 23:11:56.530314 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a294326-5ec3-43ee-97db-ae804db72fa8" containerName="extract-utilities" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.530461 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a294326-5ec3-43ee-97db-ae804db72fa8" containerName="extract-utilities" Dec 02 23:11:56 crc kubenswrapper[4903]: E1202 23:11:56.530604 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a294326-5ec3-43ee-97db-ae804db72fa8" containerName="extract-content" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.530776 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a294326-5ec3-43ee-97db-ae804db72fa8" containerName="extract-content" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.531093 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a294326-5ec3-43ee-97db-ae804db72fa8" containerName="registry-server" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.531899 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-6zx67" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.537670 4903 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.537862 4903 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-rj5xv" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.539406 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-6zx67"] Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.550894 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-gh95g"] Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.562068 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-gh95g" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.573579 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.573915 4903 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.584395 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/93baaa1e-7108-463e-82c3-71abc3a34678-metrics\") pod \"frr-k8s-gh95g\" (UID: \"93baaa1e-7108-463e-82c3-71abc3a34678\") " pod="metallb-system/frr-k8s-gh95g" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.584475 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/93baaa1e-7108-463e-82c3-71abc3a34678-reloader\") pod \"frr-k8s-gh95g\" (UID: \"93baaa1e-7108-463e-82c3-71abc3a34678\") " pod="metallb-system/frr-k8s-gh95g" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.584505 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/93baaa1e-7108-463e-82c3-71abc3a34678-frr-conf\") pod \"frr-k8s-gh95g\" (UID: \"93baaa1e-7108-463e-82c3-71abc3a34678\") " pod="metallb-system/frr-k8s-gh95g" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.584566 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8787de7-b1d1-41fc-bda7-628c8916c8c7-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-6zx67\" (UID: \"b8787de7-b1d1-41fc-bda7-628c8916c8c7\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-6zx67" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.584589 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/93baaa1e-7108-463e-82c3-71abc3a34678-frr-sockets\") pod \"frr-k8s-gh95g\" (UID: \"93baaa1e-7108-463e-82c3-71abc3a34678\") " pod="metallb-system/frr-k8s-gh95g" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.584617 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlg68\" (UniqueName: \"kubernetes.io/projected/b8787de7-b1d1-41fc-bda7-628c8916c8c7-kube-api-access-tlg68\") pod \"frr-k8s-webhook-server-7fcb986d4-6zx67\" (UID: \"b8787de7-b1d1-41fc-bda7-628c8916c8c7\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-6zx67" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.584692 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blg9q\" (UniqueName: \"kubernetes.io/projected/93baaa1e-7108-463e-82c3-71abc3a34678-kube-api-access-blg9q\") pod \"frr-k8s-gh95g\" (UID: \"93baaa1e-7108-463e-82c3-71abc3a34678\") " pod="metallb-system/frr-k8s-gh95g" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.584846 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/93baaa1e-7108-463e-82c3-71abc3a34678-frr-startup\") pod \"frr-k8s-gh95g\" (UID: \"93baaa1e-7108-463e-82c3-71abc3a34678\") " pod="metallb-system/frr-k8s-gh95g" Dec 02 23:11:56 crc kubenswrapper[4903]: 
I1202 23:11:56.584953 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93baaa1e-7108-463e-82c3-71abc3a34678-metrics-certs\") pod \"frr-k8s-gh95g\" (UID: \"93baaa1e-7108-463e-82c3-71abc3a34678\") " pod="metallb-system/frr-k8s-gh95g" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.632559 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-pm26w"] Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.633467 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-pm26w" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.638786 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.638963 4903 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.639074 4903 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-l2xzw" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.639192 4903 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.655007 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-b7sxc"] Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.659328 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-b7sxc" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.663458 4903 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.687088 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3503b383-bf2b-4c83-8a43-3323f7330880-cert\") pod \"controller-f8648f98b-b7sxc\" (UID: \"3503b383-bf2b-4c83-8a43-3323f7330880\") " pod="metallb-system/controller-f8648f98b-b7sxc" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.687149 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8656t\" (UniqueName: \"kubernetes.io/projected/c48c624c-4ecb-47d7-affb-bf5527eec659-kube-api-access-8656t\") pod \"speaker-pm26w\" (UID: \"c48c624c-4ecb-47d7-affb-bf5527eec659\") " pod="metallb-system/speaker-pm26w" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.687171 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c48c624c-4ecb-47d7-affb-bf5527eec659-metrics-certs\") pod \"speaker-pm26w\" (UID: \"c48c624c-4ecb-47d7-affb-bf5527eec659\") " pod="metallb-system/speaker-pm26w" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.687197 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3503b383-bf2b-4c83-8a43-3323f7330880-metrics-certs\") pod \"controller-f8648f98b-b7sxc\" (UID: \"3503b383-bf2b-4c83-8a43-3323f7330880\") " pod="metallb-system/controller-f8648f98b-b7sxc" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.687219 4903 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c48c624c-4ecb-47d7-affb-bf5527eec659-metallb-excludel2\") pod \"speaker-pm26w\" (UID: \"c48c624c-4ecb-47d7-affb-bf5527eec659\") " pod="metallb-system/speaker-pm26w" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.687248 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c48c624c-4ecb-47d7-affb-bf5527eec659-memberlist\") pod \"speaker-pm26w\" (UID: \"c48c624c-4ecb-47d7-affb-bf5527eec659\") " pod="metallb-system/speaker-pm26w" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.687271 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8787de7-b1d1-41fc-bda7-628c8916c8c7-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-6zx67\" (UID: \"b8787de7-b1d1-41fc-bda7-628c8916c8c7\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-6zx67" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.687294 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/93baaa1e-7108-463e-82c3-71abc3a34678-frr-sockets\") pod \"frr-k8s-gh95g\" (UID: \"93baaa1e-7108-463e-82c3-71abc3a34678\") " pod="metallb-system/frr-k8s-gh95g" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.687322 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlg68\" (UniqueName: \"kubernetes.io/projected/b8787de7-b1d1-41fc-bda7-628c8916c8c7-kube-api-access-tlg68\") pod \"frr-k8s-webhook-server-7fcb986d4-6zx67\" (UID: \"b8787de7-b1d1-41fc-bda7-628c8916c8c7\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-6zx67" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.687352 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/93baaa1e-7108-463e-82c3-71abc3a34678-frr-startup\") pod \"frr-k8s-gh95g\" (UID: \"93baaa1e-7108-463e-82c3-71abc3a34678\") " pod="metallb-system/frr-k8s-gh95g" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.687372 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blg9q\" (UniqueName: \"kubernetes.io/projected/93baaa1e-7108-463e-82c3-71abc3a34678-kube-api-access-blg9q\") pod \"frr-k8s-gh95g\" (UID: \"93baaa1e-7108-463e-82c3-71abc3a34678\") " pod="metallb-system/frr-k8s-gh95g" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.687390 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93baaa1e-7108-463e-82c3-71abc3a34678-metrics-certs\") pod \"frr-k8s-gh95g\" (UID: \"93baaa1e-7108-463e-82c3-71abc3a34678\") " pod="metallb-system/frr-k8s-gh95g" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.687427 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pmtd\" (UniqueName: \"kubernetes.io/projected/3503b383-bf2b-4c83-8a43-3323f7330880-kube-api-access-9pmtd\") pod \"controller-f8648f98b-b7sxc\" (UID: \"3503b383-bf2b-4c83-8a43-3323f7330880\") " pod="metallb-system/controller-f8648f98b-b7sxc" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.687447 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: 
\"kubernetes.io/empty-dir/93baaa1e-7108-463e-82c3-71abc3a34678-metrics\") pod \"frr-k8s-gh95g\" (UID: \"93baaa1e-7108-463e-82c3-71abc3a34678\") " pod="metallb-system/frr-k8s-gh95g" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.687469 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/93baaa1e-7108-463e-82c3-71abc3a34678-reloader\") pod \"frr-k8s-gh95g\" (UID: \"93baaa1e-7108-463e-82c3-71abc3a34678\") " pod="metallb-system/frr-k8s-gh95g" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.687489 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/93baaa1e-7108-463e-82c3-71abc3a34678-frr-conf\") pod \"frr-k8s-gh95g\" (UID: \"93baaa1e-7108-463e-82c3-71abc3a34678\") " pod="metallb-system/frr-k8s-gh95g" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.693894 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-b7sxc"] Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.694264 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/93baaa1e-7108-463e-82c3-71abc3a34678-frr-conf\") pod \"frr-k8s-gh95g\" (UID: \"93baaa1e-7108-463e-82c3-71abc3a34678\") " pod="metallb-system/frr-k8s-gh95g" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.694540 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/93baaa1e-7108-463e-82c3-71abc3a34678-metrics\") pod \"frr-k8s-gh95g\" (UID: \"93baaa1e-7108-463e-82c3-71abc3a34678\") " pod="metallb-system/frr-k8s-gh95g" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.694747 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/93baaa1e-7108-463e-82c3-71abc3a34678-reloader\") pod \"frr-k8s-gh95g\" (UID: \"93baaa1e-7108-463e-82c3-71abc3a34678\") " pod="metallb-system/frr-k8s-gh95g" Dec 02 23:11:56 crc kubenswrapper[4903]: E1202 23:11:56.694884 4903 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Dec 02 23:11:56 crc kubenswrapper[4903]: E1202 23:11:56.694970 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8787de7-b1d1-41fc-bda7-628c8916c8c7-cert podName:b8787de7-b1d1-41fc-bda7-628c8916c8c7 nodeName:}" failed. No retries permitted until 2025-12-02 23:11:57.194946676 +0000 UTC m=+855.903500959 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b8787de7-b1d1-41fc-bda7-628c8916c8c7-cert") pod "frr-k8s-webhook-server-7fcb986d4-6zx67" (UID: "b8787de7-b1d1-41fc-bda7-628c8916c8c7") : secret "frr-k8s-webhook-server-cert" not found Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.695087 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/93baaa1e-7108-463e-82c3-71abc3a34678-frr-sockets\") pod \"frr-k8s-gh95g\" (UID: \"93baaa1e-7108-463e-82c3-71abc3a34678\") " pod="metallb-system/frr-k8s-gh95g" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.697412 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/93baaa1e-7108-463e-82c3-71abc3a34678-frr-startup\") pod \"frr-k8s-gh95g\" (UID: \"93baaa1e-7108-463e-82c3-71abc3a34678\") " pod="metallb-system/frr-k8s-gh95g" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.710679 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlg68\" (UniqueName: \"kubernetes.io/projected/b8787de7-b1d1-41fc-bda7-628c8916c8c7-kube-api-access-tlg68\") pod \"frr-k8s-webhook-server-7fcb986d4-6zx67\" (UID: \"b8787de7-b1d1-41fc-bda7-628c8916c8c7\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-6zx67" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.711343 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blg9q\" (UniqueName: \"kubernetes.io/projected/93baaa1e-7108-463e-82c3-71abc3a34678-kube-api-access-blg9q\") pod \"frr-k8s-gh95g\" (UID: \"93baaa1e-7108-463e-82c3-71abc3a34678\") " pod="metallb-system/frr-k8s-gh95g" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.725375 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93baaa1e-7108-463e-82c3-71abc3a34678-metrics-certs\") pod \"frr-k8s-gh95g\" (UID: \"93baaa1e-7108-463e-82c3-71abc3a34678\") " pod="metallb-system/frr-k8s-gh95g" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.788952 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c48c624c-4ecb-47d7-affb-bf5527eec659-metallb-excludel2\") pod \"speaker-pm26w\" (UID: \"c48c624c-4ecb-47d7-affb-bf5527eec659\") " pod="metallb-system/speaker-pm26w" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.789015 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c48c624c-4ecb-47d7-affb-bf5527eec659-memberlist\") pod \"speaker-pm26w\" (UID: \"c48c624c-4ecb-47d7-affb-bf5527eec659\") " pod="metallb-system/speaker-pm26w" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.789091 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pmtd\" (UniqueName: \"kubernetes.io/projected/3503b383-bf2b-4c83-8a43-3323f7330880-kube-api-access-9pmtd\") pod \"controller-f8648f98b-b7sxc\" (UID: \"3503b383-bf2b-4c83-8a43-3323f7330880\") " pod="metallb-system/controller-f8648f98b-b7sxc" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.789119 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3503b383-bf2b-4c83-8a43-3323f7330880-cert\") pod \"controller-f8648f98b-b7sxc\" (UID: \"3503b383-bf2b-4c83-8a43-3323f7330880\") 
" pod="metallb-system/controller-f8648f98b-b7sxc" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.789136 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8656t\" (UniqueName: \"kubernetes.io/projected/c48c624c-4ecb-47d7-affb-bf5527eec659-kube-api-access-8656t\") pod \"speaker-pm26w\" (UID: \"c48c624c-4ecb-47d7-affb-bf5527eec659\") " pod="metallb-system/speaker-pm26w" Dec 02 23:11:56 crc kubenswrapper[4903]: E1202 23:11:56.789144 4903 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.789160 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c48c624c-4ecb-47d7-affb-bf5527eec659-metrics-certs\") pod \"speaker-pm26w\" (UID: \"c48c624c-4ecb-47d7-affb-bf5527eec659\") " pod="metallb-system/speaker-pm26w" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.789192 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3503b383-bf2b-4c83-8a43-3323f7330880-metrics-certs\") pod \"controller-f8648f98b-b7sxc\" (UID: \"3503b383-bf2b-4c83-8a43-3323f7330880\") " pod="metallb-system/controller-f8648f98b-b7sxc" Dec 02 23:11:56 crc kubenswrapper[4903]: E1202 23:11:56.789212 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c48c624c-4ecb-47d7-affb-bf5527eec659-memberlist podName:c48c624c-4ecb-47d7-affb-bf5527eec659 nodeName:}" failed. No retries permitted until 2025-12-02 23:11:57.289192864 +0000 UTC m=+855.997747147 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/c48c624c-4ecb-47d7-affb-bf5527eec659-memberlist") pod "speaker-pm26w" (UID: "c48c624c-4ecb-47d7-affb-bf5527eec659") : secret "metallb-memberlist" not found Dec 02 23:11:56 crc kubenswrapper[4903]: E1202 23:11:56.789281 4903 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Dec 02 23:11:56 crc kubenswrapper[4903]: E1202 23:11:56.789323 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3503b383-bf2b-4c83-8a43-3323f7330880-metrics-certs podName:3503b383-bf2b-4c83-8a43-3323f7330880 nodeName:}" failed. No retries permitted until 2025-12-02 23:11:57.289308527 +0000 UTC m=+855.997862810 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3503b383-bf2b-4c83-8a43-3323f7330880-metrics-certs") pod "controller-f8648f98b-b7sxc" (UID: "3503b383-bf2b-4c83-8a43-3323f7330880") : secret "controller-certs-secret" not found Dec 02 23:11:56 crc kubenswrapper[4903]: E1202 23:11:56.789369 4903 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Dec 02 23:11:56 crc kubenswrapper[4903]: E1202 23:11:56.789402 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c48c624c-4ecb-47d7-affb-bf5527eec659-metrics-certs podName:c48c624c-4ecb-47d7-affb-bf5527eec659 nodeName:}" failed. No retries permitted until 2025-12-02 23:11:57.289391199 +0000 UTC m=+855.997945482 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c48c624c-4ecb-47d7-affb-bf5527eec659-metrics-certs") pod "speaker-pm26w" (UID: "c48c624c-4ecb-47d7-affb-bf5527eec659") : secret "speaker-certs-secret" not found Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.789584 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c48c624c-4ecb-47d7-affb-bf5527eec659-metallb-excludel2\") pod \"speaker-pm26w\" (UID: \"c48c624c-4ecb-47d7-affb-bf5527eec659\") " pod="metallb-system/speaker-pm26w" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.790896 4903 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.804922 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pmtd\" (UniqueName: \"kubernetes.io/projected/3503b383-bf2b-4c83-8a43-3323f7330880-kube-api-access-9pmtd\") pod \"controller-f8648f98b-b7sxc\" (UID: \"3503b383-bf2b-4c83-8a43-3323f7330880\") " pod="metallb-system/controller-f8648f98b-b7sxc" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.805002 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3503b383-bf2b-4c83-8a43-3323f7330880-cert\") pod \"controller-f8648f98b-b7sxc\" (UID: \"3503b383-bf2b-4c83-8a43-3323f7330880\") " pod="metallb-system/controller-f8648f98b-b7sxc" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.818837 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8656t\" (UniqueName: \"kubernetes.io/projected/c48c624c-4ecb-47d7-affb-bf5527eec659-kube-api-access-8656t\") pod \"speaker-pm26w\" (UID: \"c48c624c-4ecb-47d7-affb-bf5527eec659\") " pod="metallb-system/speaker-pm26w" Dec 02 23:11:56 crc kubenswrapper[4903]: I1202 23:11:56.909442 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-gh95g" Dec 02 23:11:57 crc kubenswrapper[4903]: I1202 23:11:57.195867 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8787de7-b1d1-41fc-bda7-628c8916c8c7-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-6zx67\" (UID: \"b8787de7-b1d1-41fc-bda7-628c8916c8c7\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-6zx67" Dec 02 23:11:57 crc kubenswrapper[4903]: I1202 23:11:57.201588 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8787de7-b1d1-41fc-bda7-628c8916c8c7-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-6zx67\" (UID: \"b8787de7-b1d1-41fc-bda7-628c8916c8c7\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-6zx67" Dec 02 23:11:57 crc kubenswrapper[4903]: I1202 23:11:57.230800 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gh95g" event={"ID":"93baaa1e-7108-463e-82c3-71abc3a34678","Type":"ContainerStarted","Data":"a1881f87d6fe84df112139102b0290c64b7bcd3ae8d509f6e5b08dfb4eff1ab8"} Dec 02 23:11:57 crc kubenswrapper[4903]: I1202 23:11:57.297558 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c48c624c-4ecb-47d7-affb-bf5527eec659-metrics-certs\") pod \"speaker-pm26w\" (UID: \"c48c624c-4ecb-47d7-affb-bf5527eec659\") " pod="metallb-system/speaker-pm26w" Dec 02 23:11:57 crc kubenswrapper[4903]: I1202 23:11:57.297637 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3503b383-bf2b-4c83-8a43-3323f7330880-metrics-certs\") pod \"controller-f8648f98b-b7sxc\" (UID: \"3503b383-bf2b-4c83-8a43-3323f7330880\") " pod="metallb-system/controller-f8648f98b-b7sxc" Dec 02 23:11:57 crc kubenswrapper[4903]: I1202 23:11:57.297708 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c48c624c-4ecb-47d7-affb-bf5527eec659-memberlist\") pod \"speaker-pm26w\" (UID: \"c48c624c-4ecb-47d7-affb-bf5527eec659\") " pod="metallb-system/speaker-pm26w" Dec 02 23:11:57 crc kubenswrapper[4903]: E1202 23:11:57.297848 4903 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 02 23:11:57 crc kubenswrapper[4903]: E1202 23:11:57.297930 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c48c624c-4ecb-47d7-affb-bf5527eec659-memberlist podName:c48c624c-4ecb-47d7-affb-bf5527eec659 nodeName:}" failed. No retries permitted until 2025-12-02 23:11:58.297888361 +0000 UTC m=+857.006442644 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/c48c624c-4ecb-47d7-affb-bf5527eec659-memberlist") pod "speaker-pm26w" (UID: "c48c624c-4ecb-47d7-affb-bf5527eec659") : secret "metallb-memberlist" not found Dec 02 23:11:57 crc kubenswrapper[4903]: I1202 23:11:57.301083 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c48c624c-4ecb-47d7-affb-bf5527eec659-metrics-certs\") pod \"speaker-pm26w\" (UID: \"c48c624c-4ecb-47d7-affb-bf5527eec659\") " pod="metallb-system/speaker-pm26w" Dec 02 23:11:57 crc kubenswrapper[4903]: I1202 23:11:57.301986 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3503b383-bf2b-4c83-8a43-3323f7330880-metrics-certs\") pod \"controller-f8648f98b-b7sxc\" (UID: \"3503b383-bf2b-4c83-8a43-3323f7330880\") " pod="metallb-system/controller-f8648f98b-b7sxc" Dec 02 23:11:57 crc kubenswrapper[4903]: I1202 23:11:57.302756 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-b7sxc" Dec 02 23:11:57 crc kubenswrapper[4903]: I1202 23:11:57.500684 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-6zx67" Dec 02 23:11:57 crc kubenswrapper[4903]: I1202 23:11:57.722659 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-6zx67"] Dec 02 23:11:57 crc kubenswrapper[4903]: W1202 23:11:57.729453 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8787de7_b1d1_41fc_bda7_628c8916c8c7.slice/crio-86482c20d4b9ece44d28db9a43bfc7d69ae21f4757a4e8721043a95ff88ae9b6 WatchSource:0}: Error finding container 86482c20d4b9ece44d28db9a43bfc7d69ae21f4757a4e8721043a95ff88ae9b6: Status 404 returned error can't find the container with id 86482c20d4b9ece44d28db9a43bfc7d69ae21f4757a4e8721043a95ff88ae9b6 Dec 02 23:11:57 crc kubenswrapper[4903]: I1202 23:11:57.788264 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-b7sxc"] Dec 02 23:11:57 crc kubenswrapper[4903]: W1202 23:11:57.789997 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3503b383_bf2b_4c83_8a43_3323f7330880.slice/crio-5d5d8a5aa18f23049cf74b853bc2e7cdcb4a513465ecf63a72a477c44806b766 WatchSource:0}: Error finding container 5d5d8a5aa18f23049cf74b853bc2e7cdcb4a513465ecf63a72a477c44806b766: Status 404 returned error can't find the container with id 5d5d8a5aa18f23049cf74b853bc2e7cdcb4a513465ecf63a72a477c44806b766 Dec 02 23:11:58 crc kubenswrapper[4903]: I1202 23:11:58.237202 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-b7sxc" event={"ID":"3503b383-bf2b-4c83-8a43-3323f7330880","Type":"ContainerStarted","Data":"b40e81226f18763b31298c4057b69469a6d13f80fb2f14ec0924f87fc8466062"} Dec 02 23:11:58 crc kubenswrapper[4903]: I1202 23:11:58.237285 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-b7sxc" Dec 02 23:11:58 crc kubenswrapper[4903]: I1202 23:11:58.237299 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-b7sxc" 
event={"ID":"3503b383-bf2b-4c83-8a43-3323f7330880","Type":"ContainerStarted","Data":"79970111c7754a210e90480a67943cedbc6e4ae44640945d0c414cb1db001995"} Dec 02 23:11:58 crc kubenswrapper[4903]: I1202 23:11:58.237310 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-b7sxc" event={"ID":"3503b383-bf2b-4c83-8a43-3323f7330880","Type":"ContainerStarted","Data":"5d5d8a5aa18f23049cf74b853bc2e7cdcb4a513465ecf63a72a477c44806b766"} Dec 02 23:11:58 crc kubenswrapper[4903]: I1202 23:11:58.238622 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-6zx67" event={"ID":"b8787de7-b1d1-41fc-bda7-628c8916c8c7","Type":"ContainerStarted","Data":"86482c20d4b9ece44d28db9a43bfc7d69ae21f4757a4e8721043a95ff88ae9b6"} Dec 02 23:11:58 crc kubenswrapper[4903]: I1202 23:11:58.252729 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-b7sxc" podStartSLOduration=2.252704667 podStartE2EDuration="2.252704667s" podCreationTimestamp="2025-12-02 23:11:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:11:58.251732403 +0000 UTC m=+856.960286686" watchObservedRunningTime="2025-12-02 23:11:58.252704667 +0000 UTC m=+856.961258970" Dec 02 23:11:58 crc kubenswrapper[4903]: I1202 23:11:58.309851 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c48c624c-4ecb-47d7-affb-bf5527eec659-memberlist\") pod \"speaker-pm26w\" (UID: \"c48c624c-4ecb-47d7-affb-bf5527eec659\") " pod="metallb-system/speaker-pm26w" Dec 02 23:11:58 crc kubenswrapper[4903]: I1202 23:11:58.315136 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c48c624c-4ecb-47d7-affb-bf5527eec659-memberlist\") pod \"speaker-pm26w\" (UID: \"c48c624c-4ecb-47d7-affb-bf5527eec659\") " pod="metallb-system/speaker-pm26w" Dec 02 23:11:58 crc kubenswrapper[4903]: I1202 23:11:58.458832 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-pm26w" Dec 02 23:11:58 crc kubenswrapper[4903]: W1202 23:11:58.487460 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc48c624c_4ecb_47d7_affb_bf5527eec659.slice/crio-8238f7eafdcdb8b28446dd83f0b8f4da291ef4a730b16a44d4e8ed29243be93d WatchSource:0}: Error finding container 8238f7eafdcdb8b28446dd83f0b8f4da291ef4a730b16a44d4e8ed29243be93d: Status 404 returned error can't find the container with id 8238f7eafdcdb8b28446dd83f0b8f4da291ef4a730b16a44d4e8ed29243be93d Dec 02 23:11:59 crc kubenswrapper[4903]: I1202 23:11:59.250700 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-pm26w" event={"ID":"c48c624c-4ecb-47d7-affb-bf5527eec659","Type":"ContainerStarted","Data":"c51039631227003abd315f1c451c4a37c34406c68beed3e4ae989291bb71bc72"} Dec 02 23:11:59 crc kubenswrapper[4903]: I1202 23:11:59.250752 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-pm26w" event={"ID":"c48c624c-4ecb-47d7-affb-bf5527eec659","Type":"ContainerStarted","Data":"9236a9387adaff817ade42c4ff46763ee41c691201edb51352695aa4620a34c2"} Dec 02 23:11:59 crc kubenswrapper[4903]: I1202 23:11:59.250767 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-pm26w" event={"ID":"c48c624c-4ecb-47d7-affb-bf5527eec659","Type":"ContainerStarted","Data":"8238f7eafdcdb8b28446dd83f0b8f4da291ef4a730b16a44d4e8ed29243be93d"} Dec 02 23:11:59 crc kubenswrapper[4903]: I1202 23:11:59.250961 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-pm26w" Dec 02 23:11:59 crc kubenswrapper[4903]: I1202 23:11:59.275281 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-pm26w" podStartSLOduration=3.275263497 podStartE2EDuration="3.275263497s" podCreationTimestamp="2025-12-02 23:11:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:11:59.273593257 +0000 UTC m=+857.982147550" watchObservedRunningTime="2025-12-02 23:11:59.275263497 +0000 UTC m=+857.983817780" Dec 02 23:11:59 crc kubenswrapper[4903]: I1202 23:11:59.440299 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mkmgs"] Dec 02 23:11:59 crc kubenswrapper[4903]: I1202 23:11:59.442098 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mkmgs" Dec 02 23:11:59 crc kubenswrapper[4903]: I1202 23:11:59.447297 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mkmgs"] Dec 02 23:11:59 crc kubenswrapper[4903]: I1202 23:11:59.630449 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq95l\" (UniqueName: \"kubernetes.io/projected/047ea85e-ddd3-4848-a2b6-ec6b187fbae4-kube-api-access-jq95l\") pod \"certified-operators-mkmgs\" (UID: \"047ea85e-ddd3-4848-a2b6-ec6b187fbae4\") " pod="openshift-marketplace/certified-operators-mkmgs" Dec 02 23:11:59 crc kubenswrapper[4903]: I1202 23:11:59.630784 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/047ea85e-ddd3-4848-a2b6-ec6b187fbae4-utilities\") pod \"certified-operators-mkmgs\" (UID: \"047ea85e-ddd3-4848-a2b6-ec6b187fbae4\") " pod="openshift-marketplace/certified-operators-mkmgs" Dec 02 23:11:59 crc kubenswrapper[4903]: I1202 23:11:59.630904 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/047ea85e-ddd3-4848-a2b6-ec6b187fbae4-catalog-content\") pod \"certified-operators-mkmgs\" (UID: \"047ea85e-ddd3-4848-a2b6-ec6b187fbae4\") " pod="openshift-marketplace/certified-operators-mkmgs" Dec 02 23:11:59 crc kubenswrapper[4903]: I1202 23:11:59.731353 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jq95l\" (UniqueName: \"kubernetes.io/projected/047ea85e-ddd3-4848-a2b6-ec6b187fbae4-kube-api-access-jq95l\") pod \"certified-operators-mkmgs\" (UID: \"047ea85e-ddd3-4848-a2b6-ec6b187fbae4\") " pod="openshift-marketplace/certified-operators-mkmgs" Dec 02 23:11:59 crc kubenswrapper[4903]: I1202 23:11:59.731414 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/047ea85e-ddd3-4848-a2b6-ec6b187fbae4-utilities\") pod \"certified-operators-mkmgs\" (UID: \"047ea85e-ddd3-4848-a2b6-ec6b187fbae4\") " pod="openshift-marketplace/certified-operators-mkmgs" Dec 02 23:11:59 crc kubenswrapper[4903]: I1202 23:11:59.731434 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/047ea85e-ddd3-4848-a2b6-ec6b187fbae4-catalog-content\") pod \"certified-operators-mkmgs\" (UID: \"047ea85e-ddd3-4848-a2b6-ec6b187fbae4\") " pod="openshift-marketplace/certified-operators-mkmgs" Dec 02 23:11:59 crc kubenswrapper[4903]: I1202 23:11:59.731826 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/047ea85e-ddd3-4848-a2b6-ec6b187fbae4-catalog-content\") pod \"certified-operators-mkmgs\" (UID: \"047ea85e-ddd3-4848-a2b6-ec6b187fbae4\") " pod="openshift-marketplace/certified-operators-mkmgs" Dec 02 23:11:59 crc kubenswrapper[4903]: I1202 23:11:59.732181 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/047ea85e-ddd3-4848-a2b6-ec6b187fbae4-utilities\") pod \"certified-operators-mkmgs\" (UID: \"047ea85e-ddd3-4848-a2b6-ec6b187fbae4\") " pod="openshift-marketplace/certified-operators-mkmgs" Dec 02 23:11:59 crc kubenswrapper[4903]: I1202 23:11:59.756487 4903 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jq95l\" (UniqueName: \"kubernetes.io/projected/047ea85e-ddd3-4848-a2b6-ec6b187fbae4-kube-api-access-jq95l\") pod \"certified-operators-mkmgs\" (UID: \"047ea85e-ddd3-4848-a2b6-ec6b187fbae4\") " pod="openshift-marketplace/certified-operators-mkmgs" Dec 02 23:11:59 crc kubenswrapper[4903]: I1202 23:11:59.773233 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mkmgs" Dec 02 23:12:00 crc kubenswrapper[4903]: I1202 23:12:00.260104 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mkmgs"] Dec 02 23:12:00 crc kubenswrapper[4903]: W1202 23:12:00.277949 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod047ea85e_ddd3_4848_a2b6_ec6b187fbae4.slice/crio-8c88e80eda0c7b809f0f2fb3945736c05b6be04fe4fe2aa381f0a73d2fa238c5 WatchSource:0}: Error finding container 8c88e80eda0c7b809f0f2fb3945736c05b6be04fe4fe2aa381f0a73d2fa238c5: Status 404 returned error can't find the container with id 8c88e80eda0c7b809f0f2fb3945736c05b6be04fe4fe2aa381f0a73d2fa238c5 Dec 02 23:12:01 crc kubenswrapper[4903]: I1202 23:12:01.270502 4903 generic.go:334] "Generic (PLEG): container finished" podID="047ea85e-ddd3-4848-a2b6-ec6b187fbae4" containerID="a9a6f26d268f58acb1e50cf32d09cafbfd36c8106c640380b4744fcbcfbdecb7" exitCode=0 Dec 02 23:12:01 crc kubenswrapper[4903]: I1202 23:12:01.271095 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mkmgs" event={"ID":"047ea85e-ddd3-4848-a2b6-ec6b187fbae4","Type":"ContainerDied","Data":"a9a6f26d268f58acb1e50cf32d09cafbfd36c8106c640380b4744fcbcfbdecb7"} Dec 02 23:12:01 crc kubenswrapper[4903]: I1202 23:12:01.271127 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mkmgs" event={"ID":"047ea85e-ddd3-4848-a2b6-ec6b187fbae4","Type":"ContainerStarted","Data":"8c88e80eda0c7b809f0f2fb3945736c05b6be04fe4fe2aa381f0a73d2fa238c5"} Dec 02 23:12:05 crc kubenswrapper[4903]: I1202 23:12:05.297204 4903 generic.go:334] "Generic (PLEG): container finished" podID="93baaa1e-7108-463e-82c3-71abc3a34678" containerID="88c2e4bf262eb16d56b2803db4de08fe936fc07915cada3093f6a0549d8098c2" exitCode=0 Dec 02 23:12:05 crc kubenswrapper[4903]: I1202 23:12:05.297278 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gh95g" event={"ID":"93baaa1e-7108-463e-82c3-71abc3a34678","Type":"ContainerDied","Data":"88c2e4bf262eb16d56b2803db4de08fe936fc07915cada3093f6a0549d8098c2"} Dec 02 23:12:05 crc kubenswrapper[4903]: I1202 23:12:05.300517 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-6zx67" event={"ID":"b8787de7-b1d1-41fc-bda7-628c8916c8c7","Type":"ContainerStarted","Data":"5af410453e79c75230b5bbd622a8576be7d976643ab65aaf338ea4f158ee16ae"} Dec 02 23:12:05 crc kubenswrapper[4903]: I1202 23:12:05.300669 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-6zx67" Dec 02 23:12:05 crc kubenswrapper[4903]: I1202 23:12:05.345394 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-6zx67" podStartSLOduration=2.141237472 podStartE2EDuration="9.345367735s" podCreationTimestamp="2025-12-02 23:11:56 +0000 UTC" firstStartedPulling="2025-12-02 
23:11:57.733053404 +0000 UTC m=+856.441607687" lastFinishedPulling="2025-12-02 23:12:04.937183667 +0000 UTC m=+863.645737950" observedRunningTime="2025-12-02 23:12:05.341119102 +0000 UTC m=+864.049673395" watchObservedRunningTime="2025-12-02 23:12:05.345367735 +0000 UTC m=+864.053922028" Dec 02 23:12:06 crc kubenswrapper[4903]: I1202 23:12:06.308518 4903 generic.go:334] "Generic (PLEG): container finished" podID="047ea85e-ddd3-4848-a2b6-ec6b187fbae4" containerID="945733542f6ab5e813658861289e6e469c3e4bd8abbc039a42e3d9986743d7f6" exitCode=0 Dec 02 23:12:06 crc kubenswrapper[4903]: I1202 23:12:06.308584 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mkmgs" event={"ID":"047ea85e-ddd3-4848-a2b6-ec6b187fbae4","Type":"ContainerDied","Data":"945733542f6ab5e813658861289e6e469c3e4bd8abbc039a42e3d9986743d7f6"} Dec 02 23:12:06 crc kubenswrapper[4903]: I1202 23:12:06.311890 4903 generic.go:334] "Generic (PLEG): container finished" podID="93baaa1e-7108-463e-82c3-71abc3a34678" containerID="b18d06d011a9a45944859d9ccf459761dec31d37e85d56270cce648e6e37dc62" exitCode=0 Dec 02 23:12:06 crc kubenswrapper[4903]: I1202 23:12:06.311965 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gh95g" event={"ID":"93baaa1e-7108-463e-82c3-71abc3a34678","Type":"ContainerDied","Data":"b18d06d011a9a45944859d9ccf459761dec31d37e85d56270cce648e6e37dc62"} Dec 02 23:12:07 crc kubenswrapper[4903]: I1202 23:12:07.307740 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-b7sxc" Dec 02 23:12:07 crc kubenswrapper[4903]: I1202 23:12:07.320017 4903 generic.go:334] "Generic (PLEG): container finished" podID="93baaa1e-7108-463e-82c3-71abc3a34678" containerID="a7b27e8fa546dc30098d28f20ac480cb18b8fd9f2e1f69f7c267d996f42539f1" exitCode=0 Dec 02 23:12:07 crc kubenswrapper[4903]: I1202 23:12:07.320084 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gh95g" event={"ID":"93baaa1e-7108-463e-82c3-71abc3a34678","Type":"ContainerDied","Data":"a7b27e8fa546dc30098d28f20ac480cb18b8fd9f2e1f69f7c267d996f42539f1"} Dec 02 23:12:07 crc kubenswrapper[4903]: I1202 23:12:07.322674 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mkmgs" event={"ID":"047ea85e-ddd3-4848-a2b6-ec6b187fbae4","Type":"ContainerStarted","Data":"851dddb599e7631c38784ef19a218da5965aeae037e25a9442c7bc3b54e68f01"} Dec 02 23:12:07 crc kubenswrapper[4903]: I1202 23:12:07.351633 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mkmgs" podStartSLOduration=6.413790616 podStartE2EDuration="8.351606212s" podCreationTimestamp="2025-12-02 23:11:59 +0000 UTC" firstStartedPulling="2025-12-02 23:12:04.820766361 +0000 UTC m=+863.529320644" lastFinishedPulling="2025-12-02 23:12:06.758581947 +0000 UTC m=+865.467136240" observedRunningTime="2025-12-02 23:12:07.349001799 +0000 UTC m=+866.057556082" watchObservedRunningTime="2025-12-02 23:12:07.351606212 +0000 UTC m=+866.060160495" Dec 02 23:12:08 crc kubenswrapper[4903]: I1202 23:12:08.335903 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gh95g" event={"ID":"93baaa1e-7108-463e-82c3-71abc3a34678","Type":"ContainerStarted","Data":"ea382d49e29a22fa4ac3ad8197d1b8e055e189417dd3492ee80dca53ec4c9047"} Dec 02 23:12:08 crc kubenswrapper[4903]: I1202 23:12:08.336232 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-gh95g" event={"ID":"93baaa1e-7108-463e-82c3-71abc3a34678","Type":"ContainerStarted","Data":"00dd5199cd35ec45a0cff6462bd10b49e150387a5b7da0c6b72fbc30dee3044a"} Dec 02 23:12:08 crc kubenswrapper[4903]: I1202 23:12:08.336253 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gh95g" event={"ID":"93baaa1e-7108-463e-82c3-71abc3a34678","Type":"ContainerStarted","Data":"fd8479691309fea9dc82580c4181925e2fee3726079fceaf99e07ec6e4d729ba"} Dec 02 23:12:08 crc kubenswrapper[4903]: I1202 23:12:08.336270 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gh95g" event={"ID":"93baaa1e-7108-463e-82c3-71abc3a34678","Type":"ContainerStarted","Data":"84fa8b6995715d763e0415c7e4a811b65f8e9fef65bc28b84487d0cc229d4aea"} Dec 02 23:12:08 crc kubenswrapper[4903]: I1202 23:12:08.336287 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gh95g" event={"ID":"93baaa1e-7108-463e-82c3-71abc3a34678","Type":"ContainerStarted","Data":"ca7fb53e7073cf2ba52d58b1e389ae3c544cca937e95f0301cd52d13822a9b82"} Dec 02 23:12:08 crc kubenswrapper[4903]: I1202 23:12:08.462676 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-pm26w" Dec 02 23:12:09 crc kubenswrapper[4903]: I1202 23:12:09.349304 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gh95g" event={"ID":"93baaa1e-7108-463e-82c3-71abc3a34678","Type":"ContainerStarted","Data":"de18abcc2dbfe4c979d5bfd2491dc912ec906d93099b5047b078b40802f8a0e7"} Dec 02 23:12:09 crc kubenswrapper[4903]: I1202 23:12:09.350019 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-gh95g" Dec 02 23:12:09 crc kubenswrapper[4903]: I1202 23:12:09.377164 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-gh95g" podStartSLOduration=5.491006829 podStartE2EDuration="13.377149397s" podCreationTimestamp="2025-12-02 23:11:56 +0000 UTC" firstStartedPulling="2025-12-02 23:11:57.014186085 +0000 UTC m=+855.722740368" lastFinishedPulling="2025-12-02 23:12:04.900328653 +0000 UTC m=+863.608882936" observedRunningTime="2025-12-02 23:12:09.37188219 +0000 UTC m=+868.080436473" watchObservedRunningTime="2025-12-02 23:12:09.377149397 +0000 UTC m=+868.085703680" Dec 02 23:12:09 crc kubenswrapper[4903]: I1202 23:12:09.774310 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mkmgs" Dec 02 23:12:09 crc kubenswrapper[4903]: I1202 23:12:09.774410 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mkmgs" Dec 02 23:12:09 crc kubenswrapper[4903]: I1202 23:12:09.835528 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mkmgs" Dec 02 23:12:11 crc kubenswrapper[4903]: I1202 23:12:11.484484 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-dbl6r"] Dec 02 23:12:11 crc kubenswrapper[4903]: I1202 23:12:11.487754 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-dbl6r" Dec 02 23:12:11 crc kubenswrapper[4903]: I1202 23:12:11.501042 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-b9gmn" Dec 02 23:12:11 crc kubenswrapper[4903]: I1202 23:12:11.501218 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 02 23:12:11 crc kubenswrapper[4903]: I1202 23:12:11.505062 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 02 23:12:11 crc kubenswrapper[4903]: I1202 23:12:11.532680 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dbl6r"] Dec 02 23:12:11 crc kubenswrapper[4903]: I1202 23:12:11.600168 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-556qn\" (UniqueName: \"kubernetes.io/projected/f2db399f-e724-4954-9b41-3a5d9a138ea1-kube-api-access-556qn\") pod \"openstack-operator-index-dbl6r\" (UID: \"f2db399f-e724-4954-9b41-3a5d9a138ea1\") " pod="openstack-operators/openstack-operator-index-dbl6r" Dec 02 23:12:11 crc kubenswrapper[4903]: I1202 23:12:11.701095 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-556qn\" (UniqueName: \"kubernetes.io/projected/f2db399f-e724-4954-9b41-3a5d9a138ea1-kube-api-access-556qn\") pod \"openstack-operator-index-dbl6r\" (UID: \"f2db399f-e724-4954-9b41-3a5d9a138ea1\") " pod="openstack-operators/openstack-operator-index-dbl6r" Dec 02 23:12:11 crc kubenswrapper[4903]: I1202 23:12:11.733284 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-556qn\" (UniqueName: \"kubernetes.io/projected/f2db399f-e724-4954-9b41-3a5d9a138ea1-kube-api-access-556qn\") pod \"openstack-operator-index-dbl6r\" (UID: \"f2db399f-e724-4954-9b41-3a5d9a138ea1\") " pod="openstack-operators/openstack-operator-index-dbl6r" Dec 02 23:12:11 crc kubenswrapper[4903]: I1202 23:12:11.828611 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-dbl6r" Dec 02 23:12:11 crc kubenswrapper[4903]: I1202 23:12:11.910203 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-gh95g" Dec 02 23:12:11 crc kubenswrapper[4903]: I1202 23:12:11.960220 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-gh95g" Dec 02 23:12:12 crc kubenswrapper[4903]: I1202 23:12:12.045878 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dbl6r"] Dec 02 23:12:12 crc kubenswrapper[4903]: W1202 23:12:12.054466 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2db399f_e724_4954_9b41_3a5d9a138ea1.slice/crio-3666ea12d91c1bcafc32f64fd5b1f72102f8ec4c7e5816c4c3057c84575b7289 WatchSource:0}: Error finding container 3666ea12d91c1bcafc32f64fd5b1f72102f8ec4c7e5816c4c3057c84575b7289: Status 404 returned error can't find the container with id 3666ea12d91c1bcafc32f64fd5b1f72102f8ec4c7e5816c4c3057c84575b7289 Dec 02 23:12:12 crc kubenswrapper[4903]: I1202 23:12:12.372218 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dbl6r" event={"ID":"f2db399f-e724-4954-9b41-3a5d9a138ea1","Type":"ContainerStarted","Data":"3666ea12d91c1bcafc32f64fd5b1f72102f8ec4c7e5816c4c3057c84575b7289"} Dec 02 23:12:14 crc kubenswrapper[4903]: I1202 23:12:14.855568 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-dbl6r"] Dec 02 23:12:15 crc kubenswrapper[4903]: I1202 23:12:15.394292 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dbl6r" event={"ID":"f2db399f-e724-4954-9b41-3a5d9a138ea1","Type":"ContainerStarted","Data":"4a2d745758e8874a39890fcb66a1d35c1c9db678208c820036f127d871e08eb6"} Dec 02 23:12:15 crc kubenswrapper[4903]: I1202 23:12:15.394476 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-dbl6r" podUID="f2db399f-e724-4954-9b41-3a5d9a138ea1" containerName="registry-server" containerID="cri-o://4a2d745758e8874a39890fcb66a1d35c1c9db678208c820036f127d871e08eb6" gracePeriod=2 Dec 02 23:12:15 crc kubenswrapper[4903]: I1202 23:12:15.420125 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-dbl6r" podStartSLOduration=1.746467198 podStartE2EDuration="4.420108614s" podCreationTimestamp="2025-12-02 23:12:11 +0000 UTC" firstStartedPulling="2025-12-02 23:12:12.056183823 +0000 UTC m=+870.764738106" lastFinishedPulling="2025-12-02 23:12:14.729825239 +0000 UTC m=+873.438379522" observedRunningTime="2025-12-02 23:12:15.419542911 +0000 UTC m=+874.128097194" watchObservedRunningTime="2025-12-02 23:12:15.420108614 +0000 UTC m=+874.128662897" Dec 02 23:12:15 crc kubenswrapper[4903]: I1202 23:12:15.461243 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-scz9b"] Dec 02 23:12:15 crc kubenswrapper[4903]: I1202 23:12:15.462008 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-scz9b" Dec 02 23:12:15 crc kubenswrapper[4903]: I1202 23:12:15.484855 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-scz9b"] Dec 02 23:12:15 crc kubenswrapper[4903]: I1202 23:12:15.562458 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzpls\" (UniqueName: \"kubernetes.io/projected/1e4a5768-36c7-4a71-8bf1-57f9ff69b940-kube-api-access-dzpls\") pod \"openstack-operator-index-scz9b\" (UID: \"1e4a5768-36c7-4a71-8bf1-57f9ff69b940\") " pod="openstack-operators/openstack-operator-index-scz9b" Dec 02 23:12:15 crc kubenswrapper[4903]: I1202 23:12:15.664493 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzpls\" (UniqueName: \"kubernetes.io/projected/1e4a5768-36c7-4a71-8bf1-57f9ff69b940-kube-api-access-dzpls\") pod \"openstack-operator-index-scz9b\" (UID: \"1e4a5768-36c7-4a71-8bf1-57f9ff69b940\") " pod="openstack-operators/openstack-operator-index-scz9b" Dec 02 23:12:15 crc kubenswrapper[4903]: I1202 23:12:15.687507 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzpls\" (UniqueName: \"kubernetes.io/projected/1e4a5768-36c7-4a71-8bf1-57f9ff69b940-kube-api-access-dzpls\") pod \"openstack-operator-index-scz9b\" (UID: \"1e4a5768-36c7-4a71-8bf1-57f9ff69b940\") " pod="openstack-operators/openstack-operator-index-scz9b" Dec 02 23:12:15 crc kubenswrapper[4903]: I1202 23:12:15.774688 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-scz9b" Dec 02 23:12:16 crc kubenswrapper[4903]: I1202 23:12:16.009799 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-scz9b"] Dec 02 23:12:16 crc kubenswrapper[4903]: W1202 23:12:16.027052 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e4a5768_36c7_4a71_8bf1_57f9ff69b940.slice/crio-2fb113d743898e300fff41e99de380c9d02900f805831aa71241b1f925c11cc9 WatchSource:0}: Error finding container 2fb113d743898e300fff41e99de380c9d02900f805831aa71241b1f925c11cc9: Status 404 returned error can't find the container with id 2fb113d743898e300fff41e99de380c9d02900f805831aa71241b1f925c11cc9 Dec 02 23:12:16 crc kubenswrapper[4903]: I1202 23:12:16.405017 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-scz9b" event={"ID":"1e4a5768-36c7-4a71-8bf1-57f9ff69b940","Type":"ContainerStarted","Data":"2fb113d743898e300fff41e99de380c9d02900f805831aa71241b1f925c11cc9"} Dec 02 23:12:16 crc kubenswrapper[4903]: I1202 23:12:16.407637 4903 generic.go:334] "Generic (PLEG): container finished" podID="f2db399f-e724-4954-9b41-3a5d9a138ea1" containerID="4a2d745758e8874a39890fcb66a1d35c1c9db678208c820036f127d871e08eb6" exitCode=0 Dec 02 23:12:16 crc kubenswrapper[4903]: I1202 23:12:16.407720 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dbl6r" event={"ID":"f2db399f-e724-4954-9b41-3a5d9a138ea1","Type":"ContainerDied","Data":"4a2d745758e8874a39890fcb66a1d35c1c9db678208c820036f127d871e08eb6"} Dec 02 23:12:16 crc kubenswrapper[4903]: I1202 23:12:16.990633 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-dbl6r" Dec 02 23:12:17 crc kubenswrapper[4903]: I1202 23:12:17.184100 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-556qn\" (UniqueName: \"kubernetes.io/projected/f2db399f-e724-4954-9b41-3a5d9a138ea1-kube-api-access-556qn\") pod \"f2db399f-e724-4954-9b41-3a5d9a138ea1\" (UID: \"f2db399f-e724-4954-9b41-3a5d9a138ea1\") " Dec 02 23:12:17 crc kubenswrapper[4903]: I1202 23:12:17.193986 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2db399f-e724-4954-9b41-3a5d9a138ea1-kube-api-access-556qn" (OuterVolumeSpecName: "kube-api-access-556qn") pod "f2db399f-e724-4954-9b41-3a5d9a138ea1" (UID: "f2db399f-e724-4954-9b41-3a5d9a138ea1"). InnerVolumeSpecName "kube-api-access-556qn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:12:17 crc kubenswrapper[4903]: I1202 23:12:17.285841 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-556qn\" (UniqueName: \"kubernetes.io/projected/f2db399f-e724-4954-9b41-3a5d9a138ea1-kube-api-access-556qn\") on node \"crc\" DevicePath \"\"" Dec 02 23:12:17 crc kubenswrapper[4903]: I1202 23:12:17.417285 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dbl6r" event={"ID":"f2db399f-e724-4954-9b41-3a5d9a138ea1","Type":"ContainerDied","Data":"3666ea12d91c1bcafc32f64fd5b1f72102f8ec4c7e5816c4c3057c84575b7289"} Dec 02 23:12:17 crc kubenswrapper[4903]: I1202 23:12:17.417330 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-dbl6r" Dec 02 23:12:17 crc kubenswrapper[4903]: I1202 23:12:17.417369 4903 scope.go:117] "RemoveContainer" containerID="4a2d745758e8874a39890fcb66a1d35c1c9db678208c820036f127d871e08eb6" Dec 02 23:12:17 crc kubenswrapper[4903]: I1202 23:12:17.423129 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-scz9b" event={"ID":"1e4a5768-36c7-4a71-8bf1-57f9ff69b940","Type":"ContainerStarted","Data":"c6e62947326d8950e3eb401e1b2e3810bb2337beea253e9e091c1c805b40b17a"} Dec 02 23:12:17 crc kubenswrapper[4903]: I1202 23:12:17.452409 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-scz9b" podStartSLOduration=1.7122957699999999 podStartE2EDuration="2.452379193s" podCreationTimestamp="2025-12-02 23:12:15 +0000 UTC" firstStartedPulling="2025-12-02 23:12:16.032878778 +0000 UTC m=+874.741433061" lastFinishedPulling="2025-12-02 23:12:16.772962201 +0000 UTC m=+875.481516484" observedRunningTime="2025-12-02 23:12:17.444125863 +0000 UTC m=+876.152680186" watchObservedRunningTime="2025-12-02 23:12:17.452379193 +0000 UTC m=+876.160933516" Dec 02 23:12:17 crc kubenswrapper[4903]: I1202 23:12:17.474092 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-dbl6r"] Dec 02 23:12:17 crc kubenswrapper[4903]: I1202 23:12:17.484777 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-dbl6r"] Dec 02 23:12:17 crc kubenswrapper[4903]: I1202 23:12:17.504900 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-6zx67" Dec 02 23:12:17 crc kubenswrapper[4903]: I1202 23:12:17.621969 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f2db399f-e724-4954-9b41-3a5d9a138ea1" path="/var/lib/kubelet/pods/f2db399f-e724-4954-9b41-3a5d9a138ea1/volumes" Dec 02 23:12:19 crc kubenswrapper[4903]: I1202 23:12:19.834810 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mkmgs" Dec 02 23:12:23 crc kubenswrapper[4903]: I1202 23:12:23.110292 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mkmgs"] Dec 02 23:12:23 crc kubenswrapper[4903]: I1202 23:12:23.111007 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mkmgs" podUID="047ea85e-ddd3-4848-a2b6-ec6b187fbae4" containerName="registry-server" containerID="cri-o://851dddb599e7631c38784ef19a218da5965aeae037e25a9442c7bc3b54e68f01" gracePeriod=2 Dec 02 23:12:24 crc kubenswrapper[4903]: I1202 23:12:24.476881 4903 generic.go:334] "Generic (PLEG): container finished" podID="047ea85e-ddd3-4848-a2b6-ec6b187fbae4" containerID="851dddb599e7631c38784ef19a218da5965aeae037e25a9442c7bc3b54e68f01" exitCode=0 Dec 02 23:12:24 crc kubenswrapper[4903]: I1202 23:12:24.476967 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mkmgs" event={"ID":"047ea85e-ddd3-4848-a2b6-ec6b187fbae4","Type":"ContainerDied","Data":"851dddb599e7631c38784ef19a218da5965aeae037e25a9442c7bc3b54e68f01"} Dec 02 23:12:24 crc kubenswrapper[4903]: I1202 23:12:24.657695 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mkmgs" Dec 02 23:12:24 crc kubenswrapper[4903]: I1202 23:12:24.789764 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/047ea85e-ddd3-4848-a2b6-ec6b187fbae4-utilities\") pod \"047ea85e-ddd3-4848-a2b6-ec6b187fbae4\" (UID: \"047ea85e-ddd3-4848-a2b6-ec6b187fbae4\") " Dec 02 23:12:24 crc kubenswrapper[4903]: I1202 23:12:24.789918 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/047ea85e-ddd3-4848-a2b6-ec6b187fbae4-catalog-content\") pod \"047ea85e-ddd3-4848-a2b6-ec6b187fbae4\" (UID: \"047ea85e-ddd3-4848-a2b6-ec6b187fbae4\") " Dec 02 23:12:24 crc kubenswrapper[4903]: I1202 23:12:24.790083 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jq95l\" (UniqueName: \"kubernetes.io/projected/047ea85e-ddd3-4848-a2b6-ec6b187fbae4-kube-api-access-jq95l\") pod \"047ea85e-ddd3-4848-a2b6-ec6b187fbae4\" (UID: \"047ea85e-ddd3-4848-a2b6-ec6b187fbae4\") " Dec 02 23:12:24 crc kubenswrapper[4903]: I1202 23:12:24.794638 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/047ea85e-ddd3-4848-a2b6-ec6b187fbae4-utilities" (OuterVolumeSpecName: "utilities") pod "047ea85e-ddd3-4848-a2b6-ec6b187fbae4" (UID: "047ea85e-ddd3-4848-a2b6-ec6b187fbae4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:12:24 crc kubenswrapper[4903]: I1202 23:12:24.801481 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/047ea85e-ddd3-4848-a2b6-ec6b187fbae4-kube-api-access-jq95l" (OuterVolumeSpecName: "kube-api-access-jq95l") pod "047ea85e-ddd3-4848-a2b6-ec6b187fbae4" (UID: "047ea85e-ddd3-4848-a2b6-ec6b187fbae4"). InnerVolumeSpecName "kube-api-access-jq95l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:12:24 crc kubenswrapper[4903]: I1202 23:12:24.870237 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/047ea85e-ddd3-4848-a2b6-ec6b187fbae4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "047ea85e-ddd3-4848-a2b6-ec6b187fbae4" (UID: "047ea85e-ddd3-4848-a2b6-ec6b187fbae4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:12:24 crc kubenswrapper[4903]: I1202 23:12:24.891814 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jq95l\" (UniqueName: \"kubernetes.io/projected/047ea85e-ddd3-4848-a2b6-ec6b187fbae4-kube-api-access-jq95l\") on node \"crc\" DevicePath \"\"" Dec 02 23:12:24 crc kubenswrapper[4903]: I1202 23:12:24.891917 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/047ea85e-ddd3-4848-a2b6-ec6b187fbae4-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:12:24 crc kubenswrapper[4903]: I1202 23:12:24.891938 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/047ea85e-ddd3-4848-a2b6-ec6b187fbae4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:12:25 crc kubenswrapper[4903]: I1202 23:12:25.488491 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mkmgs" event={"ID":"047ea85e-ddd3-4848-a2b6-ec6b187fbae4","Type":"ContainerDied","Data":"8c88e80eda0c7b809f0f2fb3945736c05b6be04fe4fe2aa381f0a73d2fa238c5"} Dec 02 23:12:25 crc kubenswrapper[4903]: I1202 23:12:25.488577 4903 scope.go:117] "RemoveContainer" containerID="851dddb599e7631c38784ef19a218da5965aeae037e25a9442c7bc3b54e68f01" Dec 02 23:12:25 crc kubenswrapper[4903]: I1202 23:12:25.488585 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mkmgs" Dec 02 23:12:25 crc kubenswrapper[4903]: I1202 23:12:25.530000 4903 scope.go:117] "RemoveContainer" containerID="945733542f6ab5e813658861289e6e469c3e4bd8abbc039a42e3d9986743d7f6" Dec 02 23:12:25 crc kubenswrapper[4903]: I1202 23:12:25.559574 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mkmgs"] Dec 02 23:12:25 crc kubenswrapper[4903]: I1202 23:12:25.568291 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mkmgs"] Dec 02 23:12:25 crc kubenswrapper[4903]: I1202 23:12:25.569582 4903 scope.go:117] "RemoveContainer" containerID="a9a6f26d268f58acb1e50cf32d09cafbfd36c8106c640380b4744fcbcfbdecb7" Dec 02 23:12:25 crc kubenswrapper[4903]: I1202 23:12:25.628095 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="047ea85e-ddd3-4848-a2b6-ec6b187fbae4" path="/var/lib/kubelet/pods/047ea85e-ddd3-4848-a2b6-ec6b187fbae4/volumes" Dec 02 23:12:25 crc kubenswrapper[4903]: I1202 23:12:25.775524 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-scz9b" Dec 02 23:12:25 crc kubenswrapper[4903]: I1202 23:12:25.775615 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-scz9b" Dec 02 23:12:25 crc kubenswrapper[4903]: I1202 23:12:25.806997 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-scz9b" Dec 02 23:12:26 crc kubenswrapper[4903]: I1202 23:12:26.535094 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-scz9b" Dec 02 23:12:26 crc kubenswrapper[4903]: I1202 23:12:26.920925 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-gh95g" Dec 02 23:12:32 crc kubenswrapper[4903]: I1202 23:12:32.467255 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-g7pmq"] Dec 02 23:12:32 crc kubenswrapper[4903]: E1202 23:12:32.467831 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2db399f-e724-4954-9b41-3a5d9a138ea1" containerName="registry-server" Dec 02 23:12:32 crc kubenswrapper[4903]: I1202 23:12:32.467848 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2db399f-e724-4954-9b41-3a5d9a138ea1" containerName="registry-server" Dec 02 23:12:32 crc kubenswrapper[4903]: E1202 23:12:32.467868 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="047ea85e-ddd3-4848-a2b6-ec6b187fbae4" containerName="extract-utilities" Dec 02 23:12:32 crc kubenswrapper[4903]: I1202 23:12:32.467878 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="047ea85e-ddd3-4848-a2b6-ec6b187fbae4" containerName="extract-utilities" Dec 02 23:12:32 crc kubenswrapper[4903]: E1202 23:12:32.467888 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="047ea85e-ddd3-4848-a2b6-ec6b187fbae4" containerName="extract-content" Dec 02 23:12:32 crc kubenswrapper[4903]: I1202 23:12:32.467897 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="047ea85e-ddd3-4848-a2b6-ec6b187fbae4" containerName="extract-content" Dec 02 23:12:32 crc kubenswrapper[4903]: E1202 23:12:32.467910 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="047ea85e-ddd3-4848-a2b6-ec6b187fbae4" containerName="registry-server" Dec 02 
Dec 02 23:12:32 crc kubenswrapper[4903]: I1202 23:12:32.467918 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="047ea85e-ddd3-4848-a2b6-ec6b187fbae4" containerName="registry-server" Dec 02 23:12:32 crc kubenswrapper[4903]: I1202 23:12:32.468051 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="047ea85e-ddd3-4848-a2b6-ec6b187fbae4" containerName="registry-server" Dec 02 23:12:32 crc kubenswrapper[4903]: I1202 23:12:32.468065 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2db399f-e724-4954-9b41-3a5d9a138ea1" containerName="registry-server" Dec 02 23:12:32 crc kubenswrapper[4903]: I1202 23:12:32.469025 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g7pmq" Dec 02 23:12:32 crc kubenswrapper[4903]: I1202 23:12:32.483171 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g7pmq"] Dec 02 23:12:32 crc kubenswrapper[4903]: I1202 23:12:32.611342 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z2js\" (UniqueName: \"kubernetes.io/projected/c5575774-80b5-4783-ba77-6374a11c7944-kube-api-access-4z2js\") pod \"redhat-marketplace-g7pmq\" (UID: \"c5575774-80b5-4783-ba77-6374a11c7944\") " pod="openshift-marketplace/redhat-marketplace-g7pmq" Dec 02 23:12:32 crc kubenswrapper[4903]: I1202 23:12:32.611405 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5575774-80b5-4783-ba77-6374a11c7944-catalog-content\") pod \"redhat-marketplace-g7pmq\" (UID: \"c5575774-80b5-4783-ba77-6374a11c7944\") " pod="openshift-marketplace/redhat-marketplace-g7pmq" Dec 02 23:12:32 crc kubenswrapper[4903]: I1202 23:12:32.611431 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5575774-80b5-4783-ba77-6374a11c7944-utilities\") pod \"redhat-marketplace-g7pmq\" (UID: \"c5575774-80b5-4783-ba77-6374a11c7944\") " pod="openshift-marketplace/redhat-marketplace-g7pmq" Dec 02 23:12:32 crc kubenswrapper[4903]: I1202 23:12:32.713086 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z2js\" (UniqueName: \"kubernetes.io/projected/c5575774-80b5-4783-ba77-6374a11c7944-kube-api-access-4z2js\") pod \"redhat-marketplace-g7pmq\" (UID: \"c5575774-80b5-4783-ba77-6374a11c7944\") " pod="openshift-marketplace/redhat-marketplace-g7pmq" Dec 02 23:12:32 crc kubenswrapper[4903]: I1202 23:12:32.713138 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5575774-80b5-4783-ba77-6374a11c7944-catalog-content\") pod \"redhat-marketplace-g7pmq\" (UID: \"c5575774-80b5-4783-ba77-6374a11c7944\") " pod="openshift-marketplace/redhat-marketplace-g7pmq" Dec 02 23:12:32 crc kubenswrapper[4903]: I1202 23:12:32.713157 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5575774-80b5-4783-ba77-6374a11c7944-utilities\") pod \"redhat-marketplace-g7pmq\" (UID: \"c5575774-80b5-4783-ba77-6374a11c7944\") " pod="openshift-marketplace/redhat-marketplace-g7pmq" Dec 02 23:12:32 crc kubenswrapper[4903]: I1202 23:12:32.713716 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/c5575774-80b5-4783-ba77-6374a11c7944-utilities\") pod \"redhat-marketplace-g7pmq\" (UID: \"c5575774-80b5-4783-ba77-6374a11c7944\") " pod="openshift-marketplace/redhat-marketplace-g7pmq" Dec 02 23:12:32 crc kubenswrapper[4903]: I1202 23:12:32.714118 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5575774-80b5-4783-ba77-6374a11c7944-catalog-content\") pod \"redhat-marketplace-g7pmq\" (UID: \"c5575774-80b5-4783-ba77-6374a11c7944\") " pod="openshift-marketplace/redhat-marketplace-g7pmq" Dec 02 23:12:32 crc kubenswrapper[4903]: I1202 23:12:32.736139 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z2js\" (UniqueName: \"kubernetes.io/projected/c5575774-80b5-4783-ba77-6374a11c7944-kube-api-access-4z2js\") pod \"redhat-marketplace-g7pmq\" (UID: \"c5575774-80b5-4783-ba77-6374a11c7944\") " pod="openshift-marketplace/redhat-marketplace-g7pmq" Dec 02 23:12:32 crc kubenswrapper[4903]: I1202 23:12:32.786575 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g7pmq" Dec 02 23:12:33 crc kubenswrapper[4903]: W1202 23:12:33.205076 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5575774_80b5_4783_ba77_6374a11c7944.slice/crio-85585cba5ffbbdb29f1c46a4303768303615e4a2ec65cafa67d0e471aba9ed62 WatchSource:0}: Error finding container 85585cba5ffbbdb29f1c46a4303768303615e4a2ec65cafa67d0e471aba9ed62: Status 404 returned error can't find the container with id 85585cba5ffbbdb29f1c46a4303768303615e4a2ec65cafa67d0e471aba9ed62 Dec 02 23:12:33 crc kubenswrapper[4903]: I1202 23:12:33.205203 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g7pmq"] Dec 02 23:12:33 crc kubenswrapper[4903]: I1202 23:12:33.504037 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb"] Dec 02 23:12:33 crc kubenswrapper[4903]: I1202 23:12:33.505113 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb" Dec 02 23:12:33 crc kubenswrapper[4903]: I1202 23:12:33.507026 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-bkccr" Dec 02 23:12:33 crc kubenswrapper[4903]: I1202 23:12:33.522227 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb"] Dec 02 23:12:33 crc kubenswrapper[4903]: I1202 23:12:33.544339 4903 generic.go:334] "Generic (PLEG): container finished" podID="c5575774-80b5-4783-ba77-6374a11c7944" containerID="14766c5cdfa950876a4f9c4c65f33f602772484518b0498b89dc86f38a53d718" exitCode=0 Dec 02 23:12:33 crc kubenswrapper[4903]: I1202 23:12:33.544380 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g7pmq" event={"ID":"c5575774-80b5-4783-ba77-6374a11c7944","Type":"ContainerDied","Data":"14766c5cdfa950876a4f9c4c65f33f602772484518b0498b89dc86f38a53d718"} Dec 02 23:12:33 crc kubenswrapper[4903]: I1202 23:12:33.544401 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g7pmq" event={"ID":"c5575774-80b5-4783-ba77-6374a11c7944","Type":"ContainerStarted","Data":"85585cba5ffbbdb29f1c46a4303768303615e4a2ec65cafa67d0e471aba9ed62"} Dec 02 23:12:33 crc kubenswrapper[4903]: I1202 23:12:33.624103 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0515602c-3cfc-4fe8-99ec-8a3d18e8f88d-util\") pod \"d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb\" (UID: \"0515602c-3cfc-4fe8-99ec-8a3d18e8f88d\") " pod="openstack-operators/d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb" Dec 02 23:12:33 crc kubenswrapper[4903]: I1202 23:12:33.624141 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0515602c-3cfc-4fe8-99ec-8a3d18e8f88d-bundle\") pod \"d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb\" (UID: \"0515602c-3cfc-4fe8-99ec-8a3d18e8f88d\") " pod="openstack-operators/d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb" Dec 02 23:12:33 crc kubenswrapper[4903]: I1202 23:12:33.624198 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjg7t\" (UniqueName: \"kubernetes.io/projected/0515602c-3cfc-4fe8-99ec-8a3d18e8f88d-kube-api-access-tjg7t\") pod \"d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb\" (UID: \"0515602c-3cfc-4fe8-99ec-8a3d18e8f88d\") " pod="openstack-operators/d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb" Dec 02 23:12:33 crc kubenswrapper[4903]: I1202 23:12:33.725949 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0515602c-3cfc-4fe8-99ec-8a3d18e8f88d-util\") pod \"d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb\" (UID: \"0515602c-3cfc-4fe8-99ec-8a3d18e8f88d\") " pod="openstack-operators/d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb" Dec 02 23:12:33 crc kubenswrapper[4903]: I1202 23:12:33.726027 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0515602c-3cfc-4fe8-99ec-8a3d18e8f88d-bundle\") pod 
\"d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb\" (UID: \"0515602c-3cfc-4fe8-99ec-8a3d18e8f88d\") " pod="openstack-operators/d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb" Dec 02 23:12:33 crc kubenswrapper[4903]: I1202 23:12:33.726163 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjg7t\" (UniqueName: \"kubernetes.io/projected/0515602c-3cfc-4fe8-99ec-8a3d18e8f88d-kube-api-access-tjg7t\") pod \"d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb\" (UID: \"0515602c-3cfc-4fe8-99ec-8a3d18e8f88d\") " pod="openstack-operators/d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb" Dec 02 23:12:33 crc kubenswrapper[4903]: I1202 23:12:33.726924 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0515602c-3cfc-4fe8-99ec-8a3d18e8f88d-util\") pod \"d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb\" (UID: \"0515602c-3cfc-4fe8-99ec-8a3d18e8f88d\") " pod="openstack-operators/d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb" Dec 02 23:12:33 crc kubenswrapper[4903]: I1202 23:12:33.726980 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0515602c-3cfc-4fe8-99ec-8a3d18e8f88d-bundle\") pod \"d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb\" (UID: \"0515602c-3cfc-4fe8-99ec-8a3d18e8f88d\") " pod="openstack-operators/d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb" Dec 02 23:12:33 crc kubenswrapper[4903]: I1202 23:12:33.752197 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjg7t\" (UniqueName: \"kubernetes.io/projected/0515602c-3cfc-4fe8-99ec-8a3d18e8f88d-kube-api-access-tjg7t\") pod \"d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb\" (UID: \"0515602c-3cfc-4fe8-99ec-8a3d18e8f88d\") " pod="openstack-operators/d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb" Dec 02 23:12:33 crc kubenswrapper[4903]: I1202 23:12:33.821478 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb" Dec 02 23:12:34 crc kubenswrapper[4903]: I1202 23:12:34.083915 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb"] Dec 02 23:12:34 crc kubenswrapper[4903]: I1202 23:12:34.549997 4903 generic.go:334] "Generic (PLEG): container finished" podID="0515602c-3cfc-4fe8-99ec-8a3d18e8f88d" containerID="76e053c9a00b55b193cb505c904359b8a9385460519bdceabe5f0dd0cc932711" exitCode=0 Dec 02 23:12:34 crc kubenswrapper[4903]: I1202 23:12:34.550096 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb" event={"ID":"0515602c-3cfc-4fe8-99ec-8a3d18e8f88d","Type":"ContainerDied","Data":"76e053c9a00b55b193cb505c904359b8a9385460519bdceabe5f0dd0cc932711"} Dec 02 23:12:34 crc kubenswrapper[4903]: I1202 23:12:34.550253 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb" event={"ID":"0515602c-3cfc-4fe8-99ec-8a3d18e8f88d","Type":"ContainerStarted","Data":"8f99f6f61d9ec9d02d623679b0cba0066a0bf00bdf14f3b066f9ee577a12c232"} Dec 02 23:12:34 crc kubenswrapper[4903]: I1202 23:12:34.552555 4903 generic.go:334] "Generic (PLEG): container finished" podID="c5575774-80b5-4783-ba77-6374a11c7944" containerID="bc0a845a0b1a0efddd676e89403deb3ed49f74a5ec764b902525e7b0f427149f" exitCode=0 Dec 02 23:12:34 crc kubenswrapper[4903]: I1202 23:12:34.552575 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g7pmq" event={"ID":"c5575774-80b5-4783-ba77-6374a11c7944","Type":"ContainerDied","Data":"bc0a845a0b1a0efddd676e89403deb3ed49f74a5ec764b902525e7b0f427149f"} Dec 02 23:12:35 crc kubenswrapper[4903]: I1202 23:12:35.560599 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g7pmq" event={"ID":"c5575774-80b5-4783-ba77-6374a11c7944","Type":"ContainerStarted","Data":"993d44269a3a21582fe408533087b5f9c2a35bd1f82d2ef37b1a771bf1890cf4"} Dec 02 23:12:35 crc kubenswrapper[4903]: I1202 23:12:35.562521 4903 generic.go:334] "Generic (PLEG): container finished" podID="0515602c-3cfc-4fe8-99ec-8a3d18e8f88d" containerID="f227856ac3f79e4be2d40907b3e7477de22f7b4be6144779750a6f9202efaf37" exitCode=0 Dec 02 23:12:35 crc kubenswrapper[4903]: I1202 23:12:35.562552 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb" event={"ID":"0515602c-3cfc-4fe8-99ec-8a3d18e8f88d","Type":"ContainerDied","Data":"f227856ac3f79e4be2d40907b3e7477de22f7b4be6144779750a6f9202efaf37"} Dec 02 23:12:35 crc kubenswrapper[4903]: I1202 23:12:35.610509 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-g7pmq" podStartSLOduration=2.08377129 podStartE2EDuration="3.610491538s" podCreationTimestamp="2025-12-02 23:12:32 +0000 UTC" firstStartedPulling="2025-12-02 23:12:33.546150871 +0000 UTC m=+892.254705154" lastFinishedPulling="2025-12-02 23:12:35.072871079 +0000 UTC m=+893.781425402" observedRunningTime="2025-12-02 23:12:35.582941689 +0000 UTC m=+894.291495992" watchObservedRunningTime="2025-12-02 23:12:35.610491538 +0000 UTC m=+894.319045821" Dec 02 23:12:36 crc kubenswrapper[4903]: I1202 23:12:36.573164 4903 generic.go:334] "Generic (PLEG): container 
finished" podID="0515602c-3cfc-4fe8-99ec-8a3d18e8f88d" containerID="7a868e11152c8007c460a286749628b16fb743209369d8de802067d9509a062c" exitCode=0 Dec 02 23:12:36 crc kubenswrapper[4903]: I1202 23:12:36.573365 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb" event={"ID":"0515602c-3cfc-4fe8-99ec-8a3d18e8f88d","Type":"ContainerDied","Data":"7a868e11152c8007c460a286749628b16fb743209369d8de802067d9509a062c"} Dec 02 23:12:37 crc kubenswrapper[4903]: I1202 23:12:37.976366 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb" Dec 02 23:12:38 crc kubenswrapper[4903]: I1202 23:12:38.088086 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjg7t\" (UniqueName: \"kubernetes.io/projected/0515602c-3cfc-4fe8-99ec-8a3d18e8f88d-kube-api-access-tjg7t\") pod \"0515602c-3cfc-4fe8-99ec-8a3d18e8f88d\" (UID: \"0515602c-3cfc-4fe8-99ec-8a3d18e8f88d\") " Dec 02 23:12:38 crc kubenswrapper[4903]: I1202 23:12:38.088148 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0515602c-3cfc-4fe8-99ec-8a3d18e8f88d-util\") pod \"0515602c-3cfc-4fe8-99ec-8a3d18e8f88d\" (UID: \"0515602c-3cfc-4fe8-99ec-8a3d18e8f88d\") " Dec 02 23:12:38 crc kubenswrapper[4903]: I1202 23:12:38.088187 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0515602c-3cfc-4fe8-99ec-8a3d18e8f88d-bundle\") pod \"0515602c-3cfc-4fe8-99ec-8a3d18e8f88d\" (UID: \"0515602c-3cfc-4fe8-99ec-8a3d18e8f88d\") " Dec 02 23:12:38 crc kubenswrapper[4903]: I1202 23:12:38.088962 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0515602c-3cfc-4fe8-99ec-8a3d18e8f88d-bundle" (OuterVolumeSpecName: "bundle") pod "0515602c-3cfc-4fe8-99ec-8a3d18e8f88d" (UID: "0515602c-3cfc-4fe8-99ec-8a3d18e8f88d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:12:38 crc kubenswrapper[4903]: I1202 23:12:38.093738 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0515602c-3cfc-4fe8-99ec-8a3d18e8f88d-kube-api-access-tjg7t" (OuterVolumeSpecName: "kube-api-access-tjg7t") pod "0515602c-3cfc-4fe8-99ec-8a3d18e8f88d" (UID: "0515602c-3cfc-4fe8-99ec-8a3d18e8f88d"). InnerVolumeSpecName "kube-api-access-tjg7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:12:38 crc kubenswrapper[4903]: I1202 23:12:38.122264 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0515602c-3cfc-4fe8-99ec-8a3d18e8f88d-util" (OuterVolumeSpecName: "util") pod "0515602c-3cfc-4fe8-99ec-8a3d18e8f88d" (UID: "0515602c-3cfc-4fe8-99ec-8a3d18e8f88d"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:12:38 crc kubenswrapper[4903]: I1202 23:12:38.189083 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjg7t\" (UniqueName: \"kubernetes.io/projected/0515602c-3cfc-4fe8-99ec-8a3d18e8f88d-kube-api-access-tjg7t\") on node \"crc\" DevicePath \"\"" Dec 02 23:12:38 crc kubenswrapper[4903]: I1202 23:12:38.189112 4903 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0515602c-3cfc-4fe8-99ec-8a3d18e8f88d-util\") on node \"crc\" DevicePath \"\"" Dec 02 23:12:38 crc kubenswrapper[4903]: I1202 23:12:38.189121 4903 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0515602c-3cfc-4fe8-99ec-8a3d18e8f88d-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:12:38 crc kubenswrapper[4903]: I1202 23:12:38.593233 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb" event={"ID":"0515602c-3cfc-4fe8-99ec-8a3d18e8f88d","Type":"ContainerDied","Data":"8f99f6f61d9ec9d02d623679b0cba0066a0bf00bdf14f3b066f9ee577a12c232"} Dec 02 23:12:38 crc kubenswrapper[4903]: I1202 23:12:38.593322 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f99f6f61d9ec9d02d623679b0cba0066a0bf00bdf14f3b066f9ee577a12c232" Dec 02 23:12:38 crc kubenswrapper[4903]: I1202 23:12:38.593282 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb" Dec 02 23:12:41 crc kubenswrapper[4903]: I1202 23:12:41.285720 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-756b77799f-tcscw"] Dec 02 23:12:41 crc kubenswrapper[4903]: E1202 23:12:41.286310 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0515602c-3cfc-4fe8-99ec-8a3d18e8f88d" containerName="util" Dec 02 23:12:41 crc kubenswrapper[4903]: I1202 23:12:41.286325 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="0515602c-3cfc-4fe8-99ec-8a3d18e8f88d" containerName="util" Dec 02 23:12:41 crc kubenswrapper[4903]: E1202 23:12:41.286347 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0515602c-3cfc-4fe8-99ec-8a3d18e8f88d" containerName="pull" Dec 02 23:12:41 crc kubenswrapper[4903]: I1202 23:12:41.286355 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="0515602c-3cfc-4fe8-99ec-8a3d18e8f88d" containerName="pull" Dec 02 23:12:41 crc kubenswrapper[4903]: E1202 23:12:41.286367 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0515602c-3cfc-4fe8-99ec-8a3d18e8f88d" containerName="extract" Dec 02 23:12:41 crc kubenswrapper[4903]: I1202 23:12:41.286375 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="0515602c-3cfc-4fe8-99ec-8a3d18e8f88d" containerName="extract" Dec 02 23:12:41 crc kubenswrapper[4903]: I1202 23:12:41.286500 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="0515602c-3cfc-4fe8-99ec-8a3d18e8f88d" containerName="extract" Dec 02 23:12:41 crc kubenswrapper[4903]: I1202 23:12:41.287025 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-756b77799f-tcscw" Dec 02 23:12:41 crc kubenswrapper[4903]: I1202 23:12:41.289687 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-2mqmw" Dec 02 23:12:41 crc kubenswrapper[4903]: I1202 23:12:41.331976 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-756b77799f-tcscw"] Dec 02 23:12:41 crc kubenswrapper[4903]: I1202 23:12:41.436107 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cx9p\" (UniqueName: \"kubernetes.io/projected/de3babfe-054a-424f-8b40-e4e43d5f3e5b-kube-api-access-4cx9p\") pod \"openstack-operator-controller-operator-756b77799f-tcscw\" (UID: \"de3babfe-054a-424f-8b40-e4e43d5f3e5b\") " pod="openstack-operators/openstack-operator-controller-operator-756b77799f-tcscw" Dec 02 23:12:41 crc kubenswrapper[4903]: I1202 23:12:41.537398 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cx9p\" (UniqueName: \"kubernetes.io/projected/de3babfe-054a-424f-8b40-e4e43d5f3e5b-kube-api-access-4cx9p\") pod \"openstack-operator-controller-operator-756b77799f-tcscw\" (UID: \"de3babfe-054a-424f-8b40-e4e43d5f3e5b\") " pod="openstack-operators/openstack-operator-controller-operator-756b77799f-tcscw" Dec 02 23:12:41 crc kubenswrapper[4903]: I1202 23:12:41.556467 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cx9p\" (UniqueName: \"kubernetes.io/projected/de3babfe-054a-424f-8b40-e4e43d5f3e5b-kube-api-access-4cx9p\") pod \"openstack-operator-controller-operator-756b77799f-tcscw\" (UID: \"de3babfe-054a-424f-8b40-e4e43d5f3e5b\") " pod="openstack-operators/openstack-operator-controller-operator-756b77799f-tcscw" Dec 02 23:12:41 crc kubenswrapper[4903]: I1202 23:12:41.627676 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-2mqmw" Dec 02 23:12:41 crc kubenswrapper[4903]: I1202 23:12:41.632965 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-756b77799f-tcscw" Dec 02 23:12:41 crc kubenswrapper[4903]: I1202 23:12:41.936775 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-756b77799f-tcscw"] Dec 02 23:12:41 crc kubenswrapper[4903]: W1202 23:12:41.945335 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde3babfe_054a_424f_8b40_e4e43d5f3e5b.slice/crio-bc14455e7a7b102a696e572004f2e73d0ba75104953097aa46d2ed984fc1c12a WatchSource:0}: Error finding container bc14455e7a7b102a696e572004f2e73d0ba75104953097aa46d2ed984fc1c12a: Status 404 returned error can't find the container with id bc14455e7a7b102a696e572004f2e73d0ba75104953097aa46d2ed984fc1c12a Dec 02 23:12:42 crc kubenswrapper[4903]: I1202 23:12:42.617790 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-756b77799f-tcscw" event={"ID":"de3babfe-054a-424f-8b40-e4e43d5f3e5b","Type":"ContainerStarted","Data":"bc14455e7a7b102a696e572004f2e73d0ba75104953097aa46d2ed984fc1c12a"} Dec 02 23:12:42 crc kubenswrapper[4903]: I1202 23:12:42.786801 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-g7pmq" Dec 02 23:12:42 crc kubenswrapper[4903]: I1202 23:12:42.787440 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-g7pmq" Dec 02 23:12:42 crc kubenswrapper[4903]: I1202 23:12:42.840437 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-g7pmq" Dec 02 23:12:43 crc kubenswrapper[4903]: I1202 23:12:43.673275 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-g7pmq" Dec 02 23:12:45 crc kubenswrapper[4903]: I1202 23:12:45.058134 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vlwgx"] Dec 02 23:12:45 crc kubenswrapper[4903]: I1202 23:12:45.059605 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vlwgx" Dec 02 23:12:45 crc kubenswrapper[4903]: I1202 23:12:45.072435 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vlwgx"] Dec 02 23:12:45 crc kubenswrapper[4903]: I1202 23:12:45.126464 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7c9cd1b-9302-45f8-b143-9b8766a23dec-catalog-content\") pod \"community-operators-vlwgx\" (UID: \"a7c9cd1b-9302-45f8-b143-9b8766a23dec\") " pod="openshift-marketplace/community-operators-vlwgx" Dec 02 23:12:45 crc kubenswrapper[4903]: I1202 23:12:45.126617 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7c9cd1b-9302-45f8-b143-9b8766a23dec-utilities\") pod \"community-operators-vlwgx\" (UID: \"a7c9cd1b-9302-45f8-b143-9b8766a23dec\") " pod="openshift-marketplace/community-operators-vlwgx" Dec 02 23:12:45 crc kubenswrapper[4903]: I1202 23:12:45.126772 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2ljp\" (UniqueName: \"kubernetes.io/projected/a7c9cd1b-9302-45f8-b143-9b8766a23dec-kube-api-access-v2ljp\") pod \"community-operators-vlwgx\" (UID: \"a7c9cd1b-9302-45f8-b143-9b8766a23dec\") " pod="openshift-marketplace/community-operators-vlwgx" Dec 02 23:12:45 crc kubenswrapper[4903]: I1202 23:12:45.228687 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7c9cd1b-9302-45f8-b143-9b8766a23dec-utilities\") pod \"community-operators-vlwgx\" (UID: \"a7c9cd1b-9302-45f8-b143-9b8766a23dec\") " pod="openshift-marketplace/community-operators-vlwgx" Dec 02 23:12:45 crc kubenswrapper[4903]: I1202 23:12:45.228780 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2ljp\" (UniqueName: \"kubernetes.io/projected/a7c9cd1b-9302-45f8-b143-9b8766a23dec-kube-api-access-v2ljp\") pod \"community-operators-vlwgx\" (UID: \"a7c9cd1b-9302-45f8-b143-9b8766a23dec\") " pod="openshift-marketplace/community-operators-vlwgx" Dec 02 23:12:45 crc kubenswrapper[4903]: I1202 23:12:45.228873 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7c9cd1b-9302-45f8-b143-9b8766a23dec-catalog-content\") pod \"community-operators-vlwgx\" (UID: \"a7c9cd1b-9302-45f8-b143-9b8766a23dec\") " pod="openshift-marketplace/community-operators-vlwgx" Dec 02 23:12:45 crc kubenswrapper[4903]: I1202 23:12:45.229188 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7c9cd1b-9302-45f8-b143-9b8766a23dec-utilities\") pod \"community-operators-vlwgx\" (UID: \"a7c9cd1b-9302-45f8-b143-9b8766a23dec\") " pod="openshift-marketplace/community-operators-vlwgx" Dec 02 23:12:45 crc kubenswrapper[4903]: I1202 23:12:45.229595 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7c9cd1b-9302-45f8-b143-9b8766a23dec-catalog-content\") pod \"community-operators-vlwgx\" (UID: \"a7c9cd1b-9302-45f8-b143-9b8766a23dec\") " pod="openshift-marketplace/community-operators-vlwgx" Dec 02 23:12:45 crc kubenswrapper[4903]: I1202 23:12:45.248429 4903 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-v2ljp\" (UniqueName: \"kubernetes.io/projected/a7c9cd1b-9302-45f8-b143-9b8766a23dec-kube-api-access-v2ljp\") pod \"community-operators-vlwgx\" (UID: \"a7c9cd1b-9302-45f8-b143-9b8766a23dec\") " pod="openshift-marketplace/community-operators-vlwgx" Dec 02 23:12:45 crc kubenswrapper[4903]: I1202 23:12:45.384998 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vlwgx" Dec 02 23:12:46 crc kubenswrapper[4903]: I1202 23:12:46.252351 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g7pmq"] Dec 02 23:12:46 crc kubenswrapper[4903]: I1202 23:12:46.645317 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-g7pmq" podUID="c5575774-80b5-4783-ba77-6374a11c7944" containerName="registry-server" containerID="cri-o://993d44269a3a21582fe408533087b5f9c2a35bd1f82d2ef37b1a771bf1890cf4" gracePeriod=2 Dec 02 23:12:47 crc kubenswrapper[4903]: I1202 23:12:47.655975 4903 generic.go:334] "Generic (PLEG): container finished" podID="c5575774-80b5-4783-ba77-6374a11c7944" containerID="993d44269a3a21582fe408533087b5f9c2a35bd1f82d2ef37b1a771bf1890cf4" exitCode=0 Dec 02 23:12:47 crc kubenswrapper[4903]: I1202 23:12:47.656003 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g7pmq" event={"ID":"c5575774-80b5-4783-ba77-6374a11c7944","Type":"ContainerDied","Data":"993d44269a3a21582fe408533087b5f9c2a35bd1f82d2ef37b1a771bf1890cf4"} Dec 02 23:12:47 crc kubenswrapper[4903]: I1202 23:12:47.656299 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g7pmq" event={"ID":"c5575774-80b5-4783-ba77-6374a11c7944","Type":"ContainerDied","Data":"85585cba5ffbbdb29f1c46a4303768303615e4a2ec65cafa67d0e471aba9ed62"} Dec 02 23:12:47 crc kubenswrapper[4903]: I1202 23:12:47.656320 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85585cba5ffbbdb29f1c46a4303768303615e4a2ec65cafa67d0e471aba9ed62" Dec 02 23:12:47 crc kubenswrapper[4903]: I1202 23:12:47.680292 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g7pmq" Dec 02 23:12:47 crc kubenswrapper[4903]: I1202 23:12:47.766400 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5575774-80b5-4783-ba77-6374a11c7944-utilities\") pod \"c5575774-80b5-4783-ba77-6374a11c7944\" (UID: \"c5575774-80b5-4783-ba77-6374a11c7944\") " Dec 02 23:12:47 crc kubenswrapper[4903]: I1202 23:12:47.766554 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4z2js\" (UniqueName: \"kubernetes.io/projected/c5575774-80b5-4783-ba77-6374a11c7944-kube-api-access-4z2js\") pod \"c5575774-80b5-4783-ba77-6374a11c7944\" (UID: \"c5575774-80b5-4783-ba77-6374a11c7944\") " Dec 02 23:12:47 crc kubenswrapper[4903]: I1202 23:12:47.766687 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5575774-80b5-4783-ba77-6374a11c7944-catalog-content\") pod \"c5575774-80b5-4783-ba77-6374a11c7944\" (UID: \"c5575774-80b5-4783-ba77-6374a11c7944\") " Dec 02 23:12:47 crc kubenswrapper[4903]: I1202 23:12:47.768441 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5575774-80b5-4783-ba77-6374a11c7944-utilities" (OuterVolumeSpecName: "utilities") pod "c5575774-80b5-4783-ba77-6374a11c7944" (UID: "c5575774-80b5-4783-ba77-6374a11c7944"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:12:47 crc kubenswrapper[4903]: I1202 23:12:47.778067 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5575774-80b5-4783-ba77-6374a11c7944-kube-api-access-4z2js" (OuterVolumeSpecName: "kube-api-access-4z2js") pod "c5575774-80b5-4783-ba77-6374a11c7944" (UID: "c5575774-80b5-4783-ba77-6374a11c7944"). InnerVolumeSpecName "kube-api-access-4z2js". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:12:47 crc kubenswrapper[4903]: I1202 23:12:47.789958 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5575774-80b5-4783-ba77-6374a11c7944-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c5575774-80b5-4783-ba77-6374a11c7944" (UID: "c5575774-80b5-4783-ba77-6374a11c7944"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:12:47 crc kubenswrapper[4903]: I1202 23:12:47.868039 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5575774-80b5-4783-ba77-6374a11c7944-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:12:47 crc kubenswrapper[4903]: I1202 23:12:47.868357 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4z2js\" (UniqueName: \"kubernetes.io/projected/c5575774-80b5-4783-ba77-6374a11c7944-kube-api-access-4z2js\") on node \"crc\" DevicePath \"\"" Dec 02 23:12:47 crc kubenswrapper[4903]: I1202 23:12:47.868369 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5575774-80b5-4783-ba77-6374a11c7944-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:12:48 crc kubenswrapper[4903]: I1202 23:12:48.052337 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vlwgx"] Dec 02 23:12:48 crc kubenswrapper[4903]: I1202 23:12:48.665042 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-756b77799f-tcscw" event={"ID":"de3babfe-054a-424f-8b40-e4e43d5f3e5b","Type":"ContainerStarted","Data":"27f17a6251e6472b7c3f7d3802d55da998e1e970a6fff2f8f65dd5bdc23f2318"} Dec 02 23:12:48 crc kubenswrapper[4903]: I1202 23:12:48.667143 4903 generic.go:334] "Generic (PLEG): container finished" podID="a7c9cd1b-9302-45f8-b143-9b8766a23dec" containerID="46f04c12f3f536cca3af18dd2a21972311556465ff202bd1f8d295d073b6aa8f" exitCode=0 Dec 02 23:12:48 crc kubenswrapper[4903]: I1202 23:12:48.667196 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vlwgx" event={"ID":"a7c9cd1b-9302-45f8-b143-9b8766a23dec","Type":"ContainerDied","Data":"46f04c12f3f536cca3af18dd2a21972311556465ff202bd1f8d295d073b6aa8f"} Dec 02 23:12:48 crc kubenswrapper[4903]: I1202 23:12:48.667230 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vlwgx" event={"ID":"a7c9cd1b-9302-45f8-b143-9b8766a23dec","Type":"ContainerStarted","Data":"c9b7ab95ba8589a760270e216ba3a78b82d71b10b6e33bc280da7b887cec9d51"} Dec 02 23:12:48 crc kubenswrapper[4903]: I1202 23:12:48.667276 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g7pmq" Dec 02 23:12:48 crc kubenswrapper[4903]: I1202 23:12:48.731899 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-756b77799f-tcscw" podStartSLOduration=2.037227314 podStartE2EDuration="7.731873309s" podCreationTimestamp="2025-12-02 23:12:41 +0000 UTC" firstStartedPulling="2025-12-02 23:12:41.948548589 +0000 UTC m=+900.657102872" lastFinishedPulling="2025-12-02 23:12:47.643194574 +0000 UTC m=+906.351748867" observedRunningTime="2025-12-02 23:12:48.718564676 +0000 UTC m=+907.427118999" watchObservedRunningTime="2025-12-02 23:12:48.731873309 +0000 UTC m=+907.440427622" Dec 02 23:12:48 crc kubenswrapper[4903]: I1202 23:12:48.745381 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g7pmq"] Dec 02 23:12:48 crc kubenswrapper[4903]: I1202 23:12:48.755001 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-g7pmq"] Dec 02 23:12:49 crc kubenswrapper[4903]: I1202 23:12:49.625371 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5575774-80b5-4783-ba77-6374a11c7944" path="/var/lib/kubelet/pods/c5575774-80b5-4783-ba77-6374a11c7944/volumes" Dec 02 23:12:49 crc kubenswrapper[4903]: I1202 23:12:49.677972 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vlwgx" event={"ID":"a7c9cd1b-9302-45f8-b143-9b8766a23dec","Type":"ContainerStarted","Data":"7bd93339abc34a32b35e45a92a0051e22d7197e73917df8a8ab7f2ff5f0a0581"} Dec 02 23:12:49 crc kubenswrapper[4903]: I1202 23:12:49.678017 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-756b77799f-tcscw" Dec 02 23:12:50 crc kubenswrapper[4903]: I1202 23:12:50.689446 4903 generic.go:334] "Generic (PLEG): container finished" podID="a7c9cd1b-9302-45f8-b143-9b8766a23dec" containerID="7bd93339abc34a32b35e45a92a0051e22d7197e73917df8a8ab7f2ff5f0a0581" exitCode=0 Dec 02 23:12:50 crc kubenswrapper[4903]: I1202 23:12:50.689520 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vlwgx" event={"ID":"a7c9cd1b-9302-45f8-b143-9b8766a23dec","Type":"ContainerDied","Data":"7bd93339abc34a32b35e45a92a0051e22d7197e73917df8a8ab7f2ff5f0a0581"} Dec 02 23:12:52 crc kubenswrapper[4903]: I1202 23:12:52.711459 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vlwgx" event={"ID":"a7c9cd1b-9302-45f8-b143-9b8766a23dec","Type":"ContainerStarted","Data":"8c5cfb83a5272ae46da9b4411f0d001ac159465e94983681887e5ed948143225"} Dec 02 23:12:52 crc kubenswrapper[4903]: I1202 23:12:52.732137 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vlwgx" podStartSLOduration=5.245462841 podStartE2EDuration="7.73212274s" podCreationTimestamp="2025-12-02 23:12:45 +0000 UTC" firstStartedPulling="2025-12-02 23:12:48.669205508 +0000 UTC m=+907.377759831" lastFinishedPulling="2025-12-02 23:12:51.155865447 +0000 UTC m=+909.864419730" observedRunningTime="2025-12-02 23:12:52.727907616 +0000 UTC m=+911.436461919" watchObservedRunningTime="2025-12-02 23:12:52.73212274 +0000 UTC m=+911.440677033" Dec 02 23:12:53 crc kubenswrapper[4903]: I1202 23:12:53.070483 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q 
Dec 02 23:12:53 crc kubenswrapper[4903]: I1202 23:12:53.070483 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:12:53 crc kubenswrapper[4903]: I1202 23:12:53.070547 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:12:55 crc kubenswrapper[4903]: I1202 23:12:55.386152 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vlwgx" Dec 02 23:12:55 crc kubenswrapper[4903]: I1202 23:12:55.386485 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vlwgx" Dec 02 23:12:55 crc kubenswrapper[4903]: I1202 23:12:55.471432 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vlwgx" Dec 02 23:13:01 crc kubenswrapper[4903]: I1202 23:13:01.638351 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-756b77799f-tcscw" Dec 02 23:13:05 crc kubenswrapper[4903]: I1202 23:13:05.431845 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vlwgx" Dec 02 23:13:05 crc kubenswrapper[4903]: I1202 23:13:05.480763 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vlwgx"] Dec 02 23:13:05 crc kubenswrapper[4903]: I1202 23:13:05.819718 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vlwgx" podUID="a7c9cd1b-9302-45f8-b143-9b8766a23dec" containerName="registry-server" containerID="cri-o://8c5cfb83a5272ae46da9b4411f0d001ac159465e94983681887e5ed948143225" gracePeriod=2 Dec 02 23:13:06 crc kubenswrapper[4903]: I1202 23:13:06.828249 4903 generic.go:334] "Generic (PLEG): container finished" podID="a7c9cd1b-9302-45f8-b143-9b8766a23dec" containerID="8c5cfb83a5272ae46da9b4411f0d001ac159465e94983681887e5ed948143225" exitCode=0 Dec 02 23:13:06 crc kubenswrapper[4903]: I1202 23:13:06.828299 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vlwgx" event={"ID":"a7c9cd1b-9302-45f8-b143-9b8766a23dec","Type":"ContainerDied","Data":"8c5cfb83a5272ae46da9b4411f0d001ac159465e94983681887e5ed948143225"} Dec 02 23:13:06 crc kubenswrapper[4903]: I1202 23:13:06.828336 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vlwgx" event={"ID":"a7c9cd1b-9302-45f8-b143-9b8766a23dec","Type":"ContainerDied","Data":"c9b7ab95ba8589a760270e216ba3a78b82d71b10b6e33bc280da7b887cec9d51"} Dec 02 23:13:06 crc kubenswrapper[4903]: I1202 23:13:06.828357 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9b7ab95ba8589a760270e216ba3a78b82d71b10b6e33bc280da7b887cec9d51" Dec 02 23:13:06 crc kubenswrapper[4903]: I1202 23:13:06.828669 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vlwgx" Dec 02 23:13:06 crc kubenswrapper[4903]: I1202 23:13:06.954686 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2ljp\" (UniqueName: \"kubernetes.io/projected/a7c9cd1b-9302-45f8-b143-9b8766a23dec-kube-api-access-v2ljp\") pod \"a7c9cd1b-9302-45f8-b143-9b8766a23dec\" (UID: \"a7c9cd1b-9302-45f8-b143-9b8766a23dec\") " Dec 02 23:13:06 crc kubenswrapper[4903]: I1202 23:13:06.954992 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7c9cd1b-9302-45f8-b143-9b8766a23dec-utilities\") pod \"a7c9cd1b-9302-45f8-b143-9b8766a23dec\" (UID: \"a7c9cd1b-9302-45f8-b143-9b8766a23dec\") " Dec 02 23:13:06 crc kubenswrapper[4903]: I1202 23:13:06.955045 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7c9cd1b-9302-45f8-b143-9b8766a23dec-catalog-content\") pod \"a7c9cd1b-9302-45f8-b143-9b8766a23dec\" (UID: \"a7c9cd1b-9302-45f8-b143-9b8766a23dec\") " Dec 02 23:13:06 crc kubenswrapper[4903]: I1202 23:13:06.955828 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7c9cd1b-9302-45f8-b143-9b8766a23dec-utilities" (OuterVolumeSpecName: "utilities") pod "a7c9cd1b-9302-45f8-b143-9b8766a23dec" (UID: "a7c9cd1b-9302-45f8-b143-9b8766a23dec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:13:06 crc kubenswrapper[4903]: I1202 23:13:06.960442 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7c9cd1b-9302-45f8-b143-9b8766a23dec-kube-api-access-v2ljp" (OuterVolumeSpecName: "kube-api-access-v2ljp") pod "a7c9cd1b-9302-45f8-b143-9b8766a23dec" (UID: "a7c9cd1b-9302-45f8-b143-9b8766a23dec"). InnerVolumeSpecName "kube-api-access-v2ljp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:13:07 crc kubenswrapper[4903]: I1202 23:13:07.026877 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7c9cd1b-9302-45f8-b143-9b8766a23dec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a7c9cd1b-9302-45f8-b143-9b8766a23dec" (UID: "a7c9cd1b-9302-45f8-b143-9b8766a23dec"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:13:07 crc kubenswrapper[4903]: I1202 23:13:07.056150 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7c9cd1b-9302-45f8-b143-9b8766a23dec-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:13:07 crc kubenswrapper[4903]: I1202 23:13:07.056197 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7c9cd1b-9302-45f8-b143-9b8766a23dec-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:13:07 crc kubenswrapper[4903]: I1202 23:13:07.056214 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2ljp\" (UniqueName: \"kubernetes.io/projected/a7c9cd1b-9302-45f8-b143-9b8766a23dec-kube-api-access-v2ljp\") on node \"crc\" DevicePath \"\"" Dec 02 23:13:07 crc kubenswrapper[4903]: I1202 23:13:07.832671 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vlwgx" Dec 02 23:13:07 crc kubenswrapper[4903]: I1202 23:13:07.853668 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vlwgx"] Dec 02 23:13:07 crc kubenswrapper[4903]: I1202 23:13:07.861146 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vlwgx"] Dec 02 23:13:09 crc kubenswrapper[4903]: I1202 23:13:09.621541 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7c9cd1b-9302-45f8-b143-9b8766a23dec" path="/var/lib/kubelet/pods/a7c9cd1b-9302-45f8-b143-9b8766a23dec/volumes" Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.718275 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-tsj9r"] Dec 02 23:13:20 crc kubenswrapper[4903]: E1202 23:13:20.719031 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7c9cd1b-9302-45f8-b143-9b8766a23dec" containerName="registry-server" Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.719043 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7c9cd1b-9302-45f8-b143-9b8766a23dec" containerName="registry-server" Dec 02 23:13:20 crc kubenswrapper[4903]: E1202 23:13:20.719051 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7c9cd1b-9302-45f8-b143-9b8766a23dec" containerName="extract-content" Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.719057 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7c9cd1b-9302-45f8-b143-9b8766a23dec" containerName="extract-content" Dec 02 23:13:20 crc kubenswrapper[4903]: E1202 23:13:20.719063 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5575774-80b5-4783-ba77-6374a11c7944" containerName="extract-utilities" Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.719069 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5575774-80b5-4783-ba77-6374a11c7944" containerName="extract-utilities" Dec 02 23:13:20 crc kubenswrapper[4903]: E1202 23:13:20.719091 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5575774-80b5-4783-ba77-6374a11c7944" containerName="registry-server" Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.719097 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5575774-80b5-4783-ba77-6374a11c7944" containerName="registry-server" Dec 02 23:13:20 crc kubenswrapper[4903]: E1202 23:13:20.719107 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7c9cd1b-9302-45f8-b143-9b8766a23dec" containerName="extract-utilities" Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.719112 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7c9cd1b-9302-45f8-b143-9b8766a23dec" containerName="extract-utilities" Dec 02 23:13:20 crc kubenswrapper[4903]: E1202 23:13:20.719130 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5575774-80b5-4783-ba77-6374a11c7944" containerName="extract-content" Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.719136 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5575774-80b5-4783-ba77-6374a11c7944" containerName="extract-content" Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.719293 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5575774-80b5-4783-ba77-6374a11c7944" containerName="registry-server" Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.719329 4903 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="a7c9cd1b-9302-45f8-b143-9b8766a23dec" containerName="registry-server" Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.720071 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-tsj9r" Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.721828 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-jbds5" Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.729254 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-j2zhw"] Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.730260 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-j2zhw" Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.736740 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-mtdvw" Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.740517 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-tsj9r"] Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.746251 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-ddkpk"] Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.747247 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-ddkpk" Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.749267 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-vvs4p" Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.750879 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-j2zhw"] Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.763181 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pbtl\" (UniqueName: \"kubernetes.io/projected/35bd5361-6683-4c7d-b26c-3cac8e7a5bf4-kube-api-access-4pbtl\") pod \"cinder-operator-controller-manager-859b6ccc6-j2zhw\" (UID: \"35bd5361-6683-4c7d-b26c-3cac8e7a5bf4\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-j2zhw" Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.763227 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzc4b\" (UniqueName: \"kubernetes.io/projected/8f5feda5-281a-4c4f-be95-7b96ecc273f9-kube-api-access-mzc4b\") pod \"designate-operator-controller-manager-78b4bc895b-ddkpk\" (UID: \"8f5feda5-281a-4c4f-be95-7b96ecc273f9\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-ddkpk" Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.763433 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqps9\" (UniqueName: \"kubernetes.io/projected/58ddb811-8791-4420-ae35-b3521289b565-kube-api-access-rqps9\") pod \"barbican-operator-controller-manager-7d9dfd778-tsj9r\" (UID: \"58ddb811-8791-4420-ae35-b3521289b565\") " 
pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-tsj9r" Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.789723 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-ddkpk"] Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.808708 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-78vhb"] Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.809716 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-78vhb" Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.809780 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-78vhb"] Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.813998 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-sbzqr" Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.820705 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-6dsjr"] Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.821629 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-6dsjr" Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.824552 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-c5sj6" Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.854991 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-6dsjr"] Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.865244 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pbtl\" (UniqueName: \"kubernetes.io/projected/35bd5361-6683-4c7d-b26c-3cac8e7a5bf4-kube-api-access-4pbtl\") pod \"cinder-operator-controller-manager-859b6ccc6-j2zhw\" (UID: \"35bd5361-6683-4c7d-b26c-3cac8e7a5bf4\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-j2zhw" Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.865292 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzc4b\" (UniqueName: \"kubernetes.io/projected/8f5feda5-281a-4c4f-be95-7b96ecc273f9-kube-api-access-mzc4b\") pod \"designate-operator-controller-manager-78b4bc895b-ddkpk\" (UID: \"8f5feda5-281a-4c4f-be95-7b96ecc273f9\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-ddkpk" Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.865369 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqps9\" (UniqueName: \"kubernetes.io/projected/58ddb811-8791-4420-ae35-b3521289b565-kube-api-access-rqps9\") pod \"barbican-operator-controller-manager-7d9dfd778-tsj9r\" (UID: \"58ddb811-8791-4420-ae35-b3521289b565\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-tsj9r" Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.902541 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqps9\" (UniqueName: \"kubernetes.io/projected/58ddb811-8791-4420-ae35-b3521289b565-kube-api-access-rqps9\") pod 
\"barbican-operator-controller-manager-7d9dfd778-tsj9r\" (UID: \"58ddb811-8791-4420-ae35-b3521289b565\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-tsj9r" Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.913056 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lwgx2"] Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.922318 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pbtl\" (UniqueName: \"kubernetes.io/projected/35bd5361-6683-4c7d-b26c-3cac8e7a5bf4-kube-api-access-4pbtl\") pod \"cinder-operator-controller-manager-859b6ccc6-j2zhw\" (UID: \"35bd5361-6683-4c7d-b26c-3cac8e7a5bf4\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-j2zhw" Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.929663 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzc4b\" (UniqueName: \"kubernetes.io/projected/8f5feda5-281a-4c4f-be95-7b96ecc273f9-kube-api-access-mzc4b\") pod \"designate-operator-controller-manager-78b4bc895b-ddkpk\" (UID: \"8f5feda5-281a-4c4f-be95-7b96ecc273f9\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-ddkpk" Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.948589 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lwgx2" Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.958367 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-c96qx" Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.988734 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4sfl\" (UniqueName: \"kubernetes.io/projected/d0be2ea9-978d-4c79-a623-3b752547d546-kube-api-access-x4sfl\") pod \"glance-operator-controller-manager-77987cd8cd-78vhb\" (UID: \"d0be2ea9-978d-4c79-a623-3b752547d546\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-78vhb" Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.991336 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-984p7\" (UniqueName: \"kubernetes.io/projected/5046b326-aad3-4aa9-ad84-96b3943a6147-kube-api-access-984p7\") pod \"heat-operator-controller-manager-5f64f6f8bb-6dsjr\" (UID: \"5046b326-aad3-4aa9-ad84-96b3943a6147\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-6dsjr" Dec 02 23:13:20 crc kubenswrapper[4903]: I1202 23:13:20.994831 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lwgx2"] Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.005341 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-dw6n2"] Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.006394 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-dw6n2" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.012378 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-zwr72" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.012435 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.013602 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-hj5mh"] Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.014695 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-hj5mh" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.016100 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-k9gxj" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.037086 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-tsj9r" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.045020 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-j2zhw" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.052705 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-s97rj"] Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.054262 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-s97rj" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.057152 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-4d5nj" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.057907 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-dw6n2"] Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.064708 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-hj5mh"] Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.064990 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-ddkpk" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.074632 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-s97rj"] Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.088767 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-xhm6n"] Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.092107 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xhm6n" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.104492 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-64x7h" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.104623 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-xhm6n"] Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.105351 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4sfl\" (UniqueName: \"kubernetes.io/projected/d0be2ea9-978d-4c79-a623-3b752547d546-kube-api-access-x4sfl\") pod \"glance-operator-controller-manager-77987cd8cd-78vhb\" (UID: \"d0be2ea9-978d-4c79-a623-3b752547d546\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-78vhb" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.105447 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsd5n\" (UniqueName: \"kubernetes.io/projected/5c4ccdc6-6205-4108-9146-75a7a963732e-kube-api-access-vsd5n\") pod \"keystone-operator-controller-manager-7765d96ddf-s97rj\" (UID: \"5c4ccdc6-6205-4108-9146-75a7a963732e\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-s97rj" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.105478 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-984p7\" (UniqueName: \"kubernetes.io/projected/5046b326-aad3-4aa9-ad84-96b3943a6147-kube-api-access-984p7\") pod \"heat-operator-controller-manager-5f64f6f8bb-6dsjr\" (UID: \"5046b326-aad3-4aa9-ad84-96b3943a6147\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-6dsjr" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.105556 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m2f8\" (UniqueName: \"kubernetes.io/projected/d3c55b89-b070-410d-8436-a101b0f313cf-kube-api-access-5m2f8\") pod \"horizon-operator-controller-manager-68c6d99b8f-lwgx2\" (UID: \"d3c55b89-b070-410d-8436-a101b0f313cf\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lwgx2" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.107706 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-6fhjd"] Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.108831 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-6fhjd" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.117841 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-w6g92" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.134404 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-6fhjd"] Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.141959 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-984p7\" (UniqueName: \"kubernetes.io/projected/5046b326-aad3-4aa9-ad84-96b3943a6147-kube-api-access-984p7\") pod \"heat-operator-controller-manager-5f64f6f8bb-6dsjr\" (UID: \"5046b326-aad3-4aa9-ad84-96b3943a6147\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-6dsjr" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.146457 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4sfl\" (UniqueName: \"kubernetes.io/projected/d0be2ea9-978d-4c79-a623-3b752547d546-kube-api-access-x4sfl\") pod \"glance-operator-controller-manager-77987cd8cd-78vhb\" (UID: \"d0be2ea9-978d-4c79-a623-3b752547d546\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-78vhb" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.149718 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-gm6wm"] Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.150874 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-gm6wm" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.154283 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-r92hr" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.159021 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-phs84"] Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.159913 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-phs84" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.164837 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-phs84"] Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.164955 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-n7cn9" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.166033 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-gm6wm"] Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.173047 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-6dsjr" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.201820 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-2rx8r"] Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.208164 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkhhq\" (UniqueName: \"kubernetes.io/projected/7c596dd6-5f26-4bb7-a771-8c1d57129209-kube-api-access-bkhhq\") pod \"manila-operator-controller-manager-7c79b5df47-xhm6n\" (UID: \"7c596dd6-5f26-4bb7-a771-8c1d57129209\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xhm6n" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.208219 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqqj7\" (UniqueName: \"kubernetes.io/projected/e3082dc8-ebbf-4a01-9120-5f1081af7801-kube-api-access-xqqj7\") pod \"ironic-operator-controller-manager-6c548fd776-hj5mh\" (UID: \"e3082dc8-ebbf-4a01-9120-5f1081af7801\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-hj5mh" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.210712 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsd5n\" (UniqueName: \"kubernetes.io/projected/5c4ccdc6-6205-4108-9146-75a7a963732e-kube-api-access-vsd5n\") pod \"keystone-operator-controller-manager-7765d96ddf-s97rj\" (UID: \"5c4ccdc6-6205-4108-9146-75a7a963732e\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-s97rj" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.210738 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9qwk\" (UniqueName: \"kubernetes.io/projected/723460ec-3116-468b-a628-1b03f5fd4239-kube-api-access-j9qwk\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-gm6wm\" (UID: \"723460ec-3116-468b-a628-1b03f5fd4239\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-gm6wm" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.210768 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvwpt\" (UniqueName: \"kubernetes.io/projected/e4de4a7c-49fd-48bc-8d5b-75727e7388de-kube-api-access-bvwpt\") pod \"infra-operator-controller-manager-57548d458d-dw6n2\" (UID: \"e4de4a7c-49fd-48bc-8d5b-75727e7388de\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-dw6n2" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.210783 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-887g7\" (UniqueName: \"kubernetes.io/projected/7367c4a1-c098-4811-80ba-455509d27216-kube-api-access-887g7\") pod \"mariadb-operator-controller-manager-56bbcc9d85-6fhjd\" (UID: \"7367c4a1-c098-4811-80ba-455509d27216\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-6fhjd" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.210811 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m2f8\" (UniqueName: \"kubernetes.io/projected/d3c55b89-b070-410d-8436-a101b0f313cf-kube-api-access-5m2f8\") pod \"horizon-operator-controller-manager-68c6d99b8f-lwgx2\" (UID: \"d3c55b89-b070-410d-8436-a101b0f313cf\") " 
pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lwgx2" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.210837 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4de4a7c-49fd-48bc-8d5b-75727e7388de-cert\") pod \"infra-operator-controller-manager-57548d458d-dw6n2\" (UID: \"e4de4a7c-49fd-48bc-8d5b-75727e7388de\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-dw6n2" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.211020 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clvb9\" (UniqueName: \"kubernetes.io/projected/d2216dc0-19da-4872-8e82-579f6bd60513-kube-api-access-clvb9\") pod \"nova-operator-controller-manager-697bc559fc-phs84\" (UID: \"d2216dc0-19da-4872-8e82-579f6bd60513\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-phs84" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.211134 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-2rx8r" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.212019 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-2rx8r"] Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.212602 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-8ps4x" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.233509 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsd5n\" (UniqueName: \"kubernetes.io/projected/5c4ccdc6-6205-4108-9146-75a7a963732e-kube-api-access-vsd5n\") pod \"keystone-operator-controller-manager-7765d96ddf-s97rj\" (UID: \"5c4ccdc6-6205-4108-9146-75a7a963732e\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-s97rj" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.238588 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m2f8\" (UniqueName: \"kubernetes.io/projected/d3c55b89-b070-410d-8436-a101b0f313cf-kube-api-access-5m2f8\") pod \"horizon-operator-controller-manager-68c6d99b8f-lwgx2\" (UID: \"d3c55b89-b070-410d-8436-a101b0f313cf\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lwgx2" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.253843 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr"] Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.256675 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.257344 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-htwmh"] Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.258921 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-htwmh" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.260553 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-j5jc6"] Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.261717 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-j5jc6" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.264135 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.264839 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr"] Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.265184 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-wp57v" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.266881 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-7ksbc" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.267841 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-dzqlg" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.284433 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-5pdxv"] Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.285492 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-5pdxv" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.292278 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-s4z6w" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.299151 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lwgx2" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.302679 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-j5jc6"] Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.309917 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-htwmh"] Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.311641 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvwpt\" (UniqueName: \"kubernetes.io/projected/e4de4a7c-49fd-48bc-8d5b-75727e7388de-kube-api-access-bvwpt\") pod \"infra-operator-controller-manager-57548d458d-dw6n2\" (UID: \"e4de4a7c-49fd-48bc-8d5b-75727e7388de\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-dw6n2" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.312884 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-887g7\" (UniqueName: \"kubernetes.io/projected/7367c4a1-c098-4811-80ba-455509d27216-kube-api-access-887g7\") pod \"mariadb-operator-controller-manager-56bbcc9d85-6fhjd\" (UID: \"7367c4a1-c098-4811-80ba-455509d27216\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-6fhjd" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.312916 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzkbp\" (UniqueName: \"kubernetes.io/projected/057a4ce0-614e-436a-aaf5-300d5ce6661c-kube-api-access-lzkbp\") pod \"swift-operator-controller-manager-5f8c65bbfc-5pdxv\" (UID: \"057a4ce0-614e-436a-aaf5-300d5ce6661c\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-5pdxv" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.312936 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmd9r\" (UniqueName: \"kubernetes.io/projected/5a01f2d2-8c90-4ccc-bf47-a4f973276988-kube-api-access-rmd9r\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr\" (UID: \"5a01f2d2-8c90-4ccc-bf47-a4f973276988\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.312964 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4de4a7c-49fd-48bc-8d5b-75727e7388de-cert\") pod \"infra-operator-controller-manager-57548d458d-dw6n2\" (UID: \"e4de4a7c-49fd-48bc-8d5b-75727e7388de\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-dw6n2" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.312992 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clvb9\" (UniqueName: \"kubernetes.io/projected/d2216dc0-19da-4872-8e82-579f6bd60513-kube-api-access-clvb9\") pod \"nova-operator-controller-manager-697bc559fc-phs84\" (UID: \"d2216dc0-19da-4872-8e82-579f6bd60513\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-phs84" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.313016 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkhhq\" (UniqueName: \"kubernetes.io/projected/7c596dd6-5f26-4bb7-a771-8c1d57129209-kube-api-access-bkhhq\") pod 
\"manila-operator-controller-manager-7c79b5df47-xhm6n\" (UID: \"7c596dd6-5f26-4bb7-a771-8c1d57129209\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xhm6n" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.313033 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8hzb\" (UniqueName: \"kubernetes.io/projected/926767ef-1626-42a1-bd04-6d3f06d89f08-kube-api-access-r8hzb\") pod \"ovn-operator-controller-manager-b6456fdb6-htwmh\" (UID: \"926767ef-1626-42a1-bd04-6d3f06d89f08\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-htwmh" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.313060 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch7cj\" (UniqueName: \"kubernetes.io/projected/246fe719-e899-408b-a962-702c5db22bfc-kube-api-access-ch7cj\") pod \"placement-operator-controller-manager-78f8948974-j5jc6\" (UID: \"246fe719-e899-408b-a962-702c5db22bfc\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-j5jc6" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.313082 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a01f2d2-8c90-4ccc-bf47-a4f973276988-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr\" (UID: \"5a01f2d2-8c90-4ccc-bf47-a4f973276988\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.313103 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqqj7\" (UniqueName: \"kubernetes.io/projected/e3082dc8-ebbf-4a01-9120-5f1081af7801-kube-api-access-xqqj7\") pod \"ironic-operator-controller-manager-6c548fd776-hj5mh\" (UID: \"e3082dc8-ebbf-4a01-9120-5f1081af7801\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-hj5mh" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.313135 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9qwk\" (UniqueName: \"kubernetes.io/projected/723460ec-3116-468b-a628-1b03f5fd4239-kube-api-access-j9qwk\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-gm6wm\" (UID: \"723460ec-3116-468b-a628-1b03f5fd4239\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-gm6wm" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.313160 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpl6x\" (UniqueName: \"kubernetes.io/projected/fc491fc5-9e88-4e1d-9848-ea8846acd82b-kube-api-access-jpl6x\") pod \"octavia-operator-controller-manager-998648c74-2rx8r\" (UID: \"fc491fc5-9e88-4e1d-9848-ea8846acd82b\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-2rx8r" Dec 02 23:13:21 crc kubenswrapper[4903]: E1202 23:13:21.313640 4903 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 02 23:13:21 crc kubenswrapper[4903]: E1202 23:13:21.313717 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4de4a7c-49fd-48bc-8d5b-75727e7388de-cert podName:e4de4a7c-49fd-48bc-8d5b-75727e7388de nodeName:}" failed. 
No retries permitted until 2025-12-02 23:13:21.813698989 +0000 UTC m=+940.522253272 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e4de4a7c-49fd-48bc-8d5b-75727e7388de-cert") pod "infra-operator-controller-manager-57548d458d-dw6n2" (UID: "e4de4a7c-49fd-48bc-8d5b-75727e7388de") : secret "infra-operator-webhook-server-cert" not found Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.325336 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-5pdxv"] Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.335781 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-k7zsl"] Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.338428 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-k7zsl" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.358361 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-k7zsl"] Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.360239 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-g42t2" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.379897 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvwpt\" (UniqueName: \"kubernetes.io/projected/e4de4a7c-49fd-48bc-8d5b-75727e7388de-kube-api-access-bvwpt\") pod \"infra-operator-controller-manager-57548d458d-dw6n2\" (UID: \"e4de4a7c-49fd-48bc-8d5b-75727e7388de\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-dw6n2" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.379942 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-887g7\" (UniqueName: \"kubernetes.io/projected/7367c4a1-c098-4811-80ba-455509d27216-kube-api-access-887g7\") pod \"mariadb-operator-controller-manager-56bbcc9d85-6fhjd\" (UID: \"7367c4a1-c098-4811-80ba-455509d27216\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-6fhjd" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.380396 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkhhq\" (UniqueName: \"kubernetes.io/projected/7c596dd6-5f26-4bb7-a771-8c1d57129209-kube-api-access-bkhhq\") pod \"manila-operator-controller-manager-7c79b5df47-xhm6n\" (UID: \"7c596dd6-5f26-4bb7-a771-8c1d57129209\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xhm6n" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.380937 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9qwk\" (UniqueName: \"kubernetes.io/projected/723460ec-3116-468b-a628-1b03f5fd4239-kube-api-access-j9qwk\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-gm6wm\" (UID: \"723460ec-3116-468b-a628-1b03f5fd4239\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-gm6wm" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.382719 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqqj7\" (UniqueName: \"kubernetes.io/projected/e3082dc8-ebbf-4a01-9120-5f1081af7801-kube-api-access-xqqj7\") pod \"ironic-operator-controller-manager-6c548fd776-hj5mh\" 
(UID: \"e3082dc8-ebbf-4a01-9120-5f1081af7801\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-hj5mh" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.386017 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clvb9\" (UniqueName: \"kubernetes.io/projected/d2216dc0-19da-4872-8e82-579f6bd60513-kube-api-access-clvb9\") pod \"nova-operator-controller-manager-697bc559fc-phs84\" (UID: \"d2216dc0-19da-4872-8e82-579f6bd60513\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-phs84" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.422565 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch7cj\" (UniqueName: \"kubernetes.io/projected/246fe719-e899-408b-a962-702c5db22bfc-kube-api-access-ch7cj\") pod \"placement-operator-controller-manager-78f8948974-j5jc6\" (UID: \"246fe719-e899-408b-a962-702c5db22bfc\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-j5jc6" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.422727 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a01f2d2-8c90-4ccc-bf47-a4f973276988-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr\" (UID: \"5a01f2d2-8c90-4ccc-bf47-a4f973276988\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.422855 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpl6x\" (UniqueName: \"kubernetes.io/projected/fc491fc5-9e88-4e1d-9848-ea8846acd82b-kube-api-access-jpl6x\") pod \"octavia-operator-controller-manager-998648c74-2rx8r\" (UID: \"fc491fc5-9e88-4e1d-9848-ea8846acd82b\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-2rx8r" Dec 02 23:13:21 crc kubenswrapper[4903]: E1202 23:13:21.423025 4903 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 23:13:21 crc kubenswrapper[4903]: E1202 23:13:21.423100 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a01f2d2-8c90-4ccc-bf47-a4f973276988-cert podName:5a01f2d2-8c90-4ccc-bf47-a4f973276988 nodeName:}" failed. No retries permitted until 2025-12-02 23:13:21.923079401 +0000 UTC m=+940.631633684 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5a01f2d2-8c90-4ccc-bf47-a4f973276988-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr" (UID: "5a01f2d2-8c90-4ccc-bf47-a4f973276988") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.423143 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhj8g\" (UniqueName: \"kubernetes.io/projected/5430813d-ed61-496d-86b6-c9cc1d48aa1f-kube-api-access-bhj8g\") pod \"telemetry-operator-controller-manager-76cc84c6bb-k7zsl\" (UID: \"5430813d-ed61-496d-86b6-c9cc1d48aa1f\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-k7zsl" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.423237 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzkbp\" (UniqueName: \"kubernetes.io/projected/057a4ce0-614e-436a-aaf5-300d5ce6661c-kube-api-access-lzkbp\") pod \"swift-operator-controller-manager-5f8c65bbfc-5pdxv\" (UID: \"057a4ce0-614e-436a-aaf5-300d5ce6661c\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-5pdxv" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.423278 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmd9r\" (UniqueName: \"kubernetes.io/projected/5a01f2d2-8c90-4ccc-bf47-a4f973276988-kube-api-access-rmd9r\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr\" (UID: \"5a01f2d2-8c90-4ccc-bf47-a4f973276988\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.423453 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8hzb\" (UniqueName: \"kubernetes.io/projected/926767ef-1626-42a1-bd04-6d3f06d89f08-kube-api-access-r8hzb\") pod \"ovn-operator-controller-manager-b6456fdb6-htwmh\" (UID: \"926767ef-1626-42a1-bd04-6d3f06d89f08\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-htwmh" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.437178 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-wp9kf"] Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.437745 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-s97rj" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.444760 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-78vhb" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.449782 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xhm6n" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.471283 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmd9r\" (UniqueName: \"kubernetes.io/projected/5a01f2d2-8c90-4ccc-bf47-a4f973276988-kube-api-access-rmd9r\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr\" (UID: \"5a01f2d2-8c90-4ccc-bf47-a4f973276988\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.471722 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-6fhjd" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.477802 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch7cj\" (UniqueName: \"kubernetes.io/projected/246fe719-e899-408b-a962-702c5db22bfc-kube-api-access-ch7cj\") pod \"placement-operator-controller-manager-78f8948974-j5jc6\" (UID: \"246fe719-e899-408b-a962-702c5db22bfc\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-j5jc6" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.478493 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpl6x\" (UniqueName: \"kubernetes.io/projected/fc491fc5-9e88-4e1d-9848-ea8846acd82b-kube-api-access-jpl6x\") pod \"octavia-operator-controller-manager-998648c74-2rx8r\" (UID: \"fc491fc5-9e88-4e1d-9848-ea8846acd82b\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-2rx8r" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.479204 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzkbp\" (UniqueName: \"kubernetes.io/projected/057a4ce0-614e-436a-aaf5-300d5ce6661c-kube-api-access-lzkbp\") pod \"swift-operator-controller-manager-5f8c65bbfc-5pdxv\" (UID: \"057a4ce0-614e-436a-aaf5-300d5ce6661c\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-5pdxv" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.480860 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wp9kf" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.485842 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-9l4s7" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.486408 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-wp9kf"] Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.486864 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8hzb\" (UniqueName: \"kubernetes.io/projected/926767ef-1626-42a1-bd04-6d3f06d89f08-kube-api-access-r8hzb\") pod \"ovn-operator-controller-manager-b6456fdb6-htwmh\" (UID: \"926767ef-1626-42a1-bd04-6d3f06d89f08\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-htwmh" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.503719 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-gm6wm" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.525329 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcg77\" (UniqueName: \"kubernetes.io/projected/e6b63e17-4749-429b-8214-92fa7eecfd3c-kube-api-access-bcg77\") pod \"test-operator-controller-manager-5854674fcc-wp9kf\" (UID: \"e6b63e17-4749-429b-8214-92fa7eecfd3c\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-wp9kf" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.525414 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhj8g\" (UniqueName: \"kubernetes.io/projected/5430813d-ed61-496d-86b6-c9cc1d48aa1f-kube-api-access-bhj8g\") pod \"telemetry-operator-controller-manager-76cc84c6bb-k7zsl\" (UID: \"5430813d-ed61-496d-86b6-c9cc1d48aa1f\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-k7zsl" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.543609 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-phs84" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.552754 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-c95d55f7c-jb8p7"] Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.553866 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-c95d55f7c-jb8p7" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.556769 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-jrzxm" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.557116 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhj8g\" (UniqueName: \"kubernetes.io/projected/5430813d-ed61-496d-86b6-c9cc1d48aa1f-kube-api-access-bhj8g\") pod \"telemetry-operator-controller-manager-76cc84c6bb-k7zsl\" (UID: \"5430813d-ed61-496d-86b6-c9cc1d48aa1f\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-k7zsl" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.559588 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-c95d55f7c-jb8p7"] Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.566275 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-2rx8r" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.626524 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-962qp\" (UniqueName: \"kubernetes.io/projected/dbed5f2e-6049-4adc-a31c-bad1f30c7058-kube-api-access-962qp\") pod \"watcher-operator-controller-manager-c95d55f7c-jb8p7\" (UID: \"dbed5f2e-6049-4adc-a31c-bad1f30c7058\") " pod="openstack-operators/watcher-operator-controller-manager-c95d55f7c-jb8p7" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.626573 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcg77\" (UniqueName: \"kubernetes.io/projected/e6b63e17-4749-429b-8214-92fa7eecfd3c-kube-api-access-bcg77\") pod \"test-operator-controller-manager-5854674fcc-wp9kf\" (UID: \"e6b63e17-4749-429b-8214-92fa7eecfd3c\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-wp9kf" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.635352 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-hj5mh" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.635865 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-htwmh" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.649475 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcg77\" (UniqueName: \"kubernetes.io/projected/e6b63e17-4749-429b-8214-92fa7eecfd3c-kube-api-access-bcg77\") pod \"test-operator-controller-manager-5854674fcc-wp9kf\" (UID: \"e6b63e17-4749-429b-8214-92fa7eecfd3c\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-wp9kf" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.653740 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5b9bc7567f-6prdr"] Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.654461 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5b9bc7567f-6prdr"] Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.654529 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5b9bc7567f-6prdr" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.657941 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-cs56n" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.658078 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.658124 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.678202 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wccsg"] Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.679612 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wccsg" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.681365 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-j5jc6" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.689380 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-5gvg8" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.693529 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wccsg"] Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.697112 4903 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.717398 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-5pdxv" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.730583 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-metrics-certs\") pod \"openstack-operator-controller-manager-5b9bc7567f-6prdr\" (UID: \"61b2d273-f604-4fa0-baba-27dfbab9a350\") " pod="openstack-operators/openstack-operator-controller-manager-5b9bc7567f-6prdr" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.735506 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-tsj9r"] Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.731861 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljzsf\" (UniqueName: \"kubernetes.io/projected/9cc67e14-1cb4-497f-b0f8-010c2e6d5717-kube-api-access-ljzsf\") pod \"rabbitmq-cluster-operator-manager-668c99d594-wccsg\" (UID: \"9cc67e14-1cb4-497f-b0f8-010c2e6d5717\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wccsg" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.736949 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-962qp\" (UniqueName: \"kubernetes.io/projected/dbed5f2e-6049-4adc-a31c-bad1f30c7058-kube-api-access-962qp\") pod \"watcher-operator-controller-manager-c95d55f7c-jb8p7\" (UID: \"dbed5f2e-6049-4adc-a31c-bad1f30c7058\") " pod="openstack-operators/watcher-operator-controller-manager-c95d55f7c-jb8p7" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.737066 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpfcx\" (UniqueName: \"kubernetes.io/projected/61b2d273-f604-4fa0-baba-27dfbab9a350-kube-api-access-kpfcx\") pod \"openstack-operator-controller-manager-5b9bc7567f-6prdr\" (UID: \"61b2d273-f604-4fa0-baba-27dfbab9a350\") " pod="openstack-operators/openstack-operator-controller-manager-5b9bc7567f-6prdr" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.737158 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-webhook-certs\") pod \"openstack-operator-controller-manager-5b9bc7567f-6prdr\" (UID: 
\"61b2d273-f604-4fa0-baba-27dfbab9a350\") " pod="openstack-operators/openstack-operator-controller-manager-5b9bc7567f-6prdr" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.751274 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-k7zsl" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.756169 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-962qp\" (UniqueName: \"kubernetes.io/projected/dbed5f2e-6049-4adc-a31c-bad1f30c7058-kube-api-access-962qp\") pod \"watcher-operator-controller-manager-c95d55f7c-jb8p7\" (UID: \"dbed5f2e-6049-4adc-a31c-bad1f30c7058\") " pod="openstack-operators/watcher-operator-controller-manager-c95d55f7c-jb8p7" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.758217 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-6dsjr"] Dec 02 23:13:21 crc kubenswrapper[4903]: W1202 23:13:21.775728 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5046b326_aad3_4aa9_ad84_96b3943a6147.slice/crio-c2ee034b1cf63118d0554e15df29524e068e64ee8f130cc3d910ad0129f0cf71 WatchSource:0}: Error finding container c2ee034b1cf63118d0554e15df29524e068e64ee8f130cc3d910ad0129f0cf71: Status 404 returned error can't find the container with id c2ee034b1cf63118d0554e15df29524e068e64ee8f130cc3d910ad0129f0cf71 Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.775784 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-ddkpk"] Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.826833 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-j2zhw"] Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.830871 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wp9kf" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.838915 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpfcx\" (UniqueName: \"kubernetes.io/projected/61b2d273-f604-4fa0-baba-27dfbab9a350-kube-api-access-kpfcx\") pod \"openstack-operator-controller-manager-5b9bc7567f-6prdr\" (UID: \"61b2d273-f604-4fa0-baba-27dfbab9a350\") " pod="openstack-operators/openstack-operator-controller-manager-5b9bc7567f-6prdr" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.838966 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-webhook-certs\") pod \"openstack-operator-controller-manager-5b9bc7567f-6prdr\" (UID: \"61b2d273-f604-4fa0-baba-27dfbab9a350\") " pod="openstack-operators/openstack-operator-controller-manager-5b9bc7567f-6prdr" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.839043 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-metrics-certs\") pod \"openstack-operator-controller-manager-5b9bc7567f-6prdr\" (UID: \"61b2d273-f604-4fa0-baba-27dfbab9a350\") " pod="openstack-operators/openstack-operator-controller-manager-5b9bc7567f-6prdr" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.839066 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljzsf\" (UniqueName: \"kubernetes.io/projected/9cc67e14-1cb4-497f-b0f8-010c2e6d5717-kube-api-access-ljzsf\") pod \"rabbitmq-cluster-operator-manager-668c99d594-wccsg\" (UID: \"9cc67e14-1cb4-497f-b0f8-010c2e6d5717\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wccsg" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.839090 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4de4a7c-49fd-48bc-8d5b-75727e7388de-cert\") pod \"infra-operator-controller-manager-57548d458d-dw6n2\" (UID: \"e4de4a7c-49fd-48bc-8d5b-75727e7388de\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-dw6n2" Dec 02 23:13:21 crc kubenswrapper[4903]: E1202 23:13:21.839211 4903 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 02 23:13:21 crc kubenswrapper[4903]: E1202 23:13:21.839378 4903 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 23:13:21 crc kubenswrapper[4903]: E1202 23:13:21.839429 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-webhook-certs podName:61b2d273-f604-4fa0-baba-27dfbab9a350 nodeName:}" failed. No retries permitted until 2025-12-02 23:13:22.339412814 +0000 UTC m=+941.047967097 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-webhook-certs") pod "openstack-operator-controller-manager-5b9bc7567f-6prdr" (UID: "61b2d273-f604-4fa0-baba-27dfbab9a350") : secret "webhook-server-cert" not found Dec 02 23:13:21 crc kubenswrapper[4903]: E1202 23:13:21.839498 4903 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 23:13:21 crc kubenswrapper[4903]: E1202 23:13:21.839597 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-metrics-certs podName:61b2d273-f604-4fa0-baba-27dfbab9a350 nodeName:}" failed. No retries permitted until 2025-12-02 23:13:22.339570378 +0000 UTC m=+941.048124751 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-metrics-certs") pod "openstack-operator-controller-manager-5b9bc7567f-6prdr" (UID: "61b2d273-f604-4fa0-baba-27dfbab9a350") : secret "metrics-server-cert" not found Dec 02 23:13:21 crc kubenswrapper[4903]: E1202 23:13:21.839761 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4de4a7c-49fd-48bc-8d5b-75727e7388de-cert podName:e4de4a7c-49fd-48bc-8d5b-75727e7388de nodeName:}" failed. No retries permitted until 2025-12-02 23:13:22.839749892 +0000 UTC m=+941.548304175 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e4de4a7c-49fd-48bc-8d5b-75727e7388de-cert") pod "infra-operator-controller-manager-57548d458d-dw6n2" (UID: "e4de4a7c-49fd-48bc-8d5b-75727e7388de") : secret "infra-operator-webhook-server-cert" not found Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.856207 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpfcx\" (UniqueName: \"kubernetes.io/projected/61b2d273-f604-4fa0-baba-27dfbab9a350-kube-api-access-kpfcx\") pod \"openstack-operator-controller-manager-5b9bc7567f-6prdr\" (UID: \"61b2d273-f604-4fa0-baba-27dfbab9a350\") " pod="openstack-operators/openstack-operator-controller-manager-5b9bc7567f-6prdr" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.861486 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljzsf\" (UniqueName: \"kubernetes.io/projected/9cc67e14-1cb4-497f-b0f8-010c2e6d5717-kube-api-access-ljzsf\") pod \"rabbitmq-cluster-operator-manager-668c99d594-wccsg\" (UID: \"9cc67e14-1cb4-497f-b0f8-010c2e6d5717\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wccsg" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.884050 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-c95d55f7c-jb8p7" Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.941973 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a01f2d2-8c90-4ccc-bf47-a4f973276988-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr\" (UID: \"5a01f2d2-8c90-4ccc-bf47-a4f973276988\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr" Dec 02 23:13:21 crc kubenswrapper[4903]: E1202 23:13:21.942130 4903 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 23:13:21 crc kubenswrapper[4903]: E1202 23:13:21.942176 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a01f2d2-8c90-4ccc-bf47-a4f973276988-cert podName:5a01f2d2-8c90-4ccc-bf47-a4f973276988 nodeName:}" failed. No retries permitted until 2025-12-02 23:13:22.942162232 +0000 UTC m=+941.650716515 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5a01f2d2-8c90-4ccc-bf47-a4f973276988-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr" (UID: "5a01f2d2-8c90-4ccc-bf47-a4f973276988") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 23:13:21 crc kubenswrapper[4903]: I1202 23:13:21.981383 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lwgx2"] Dec 02 23:13:22 crc kubenswrapper[4903]: I1202 23:13:22.006707 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wccsg" Dec 02 23:13:22 crc kubenswrapper[4903]: I1202 23:13:22.023911 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lwgx2" event={"ID":"d3c55b89-b070-410d-8436-a101b0f313cf","Type":"ContainerStarted","Data":"2d57d69f8799cefac743aa941162b05cf22412f199395bd04e4656ef9fd8dc2b"} Dec 02 23:13:22 crc kubenswrapper[4903]: I1202 23:13:22.026807 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-6dsjr" event={"ID":"5046b326-aad3-4aa9-ad84-96b3943a6147","Type":"ContainerStarted","Data":"c2ee034b1cf63118d0554e15df29524e068e64ee8f130cc3d910ad0129f0cf71"} Dec 02 23:13:22 crc kubenswrapper[4903]: I1202 23:13:22.027808 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-j2zhw" event={"ID":"35bd5361-6683-4c7d-b26c-3cac8e7a5bf4","Type":"ContainerStarted","Data":"3508ffa2fa6cf42d6b11793d8bb781400ef3b35ff96186b4df789e20fa89552f"} Dec 02 23:13:22 crc kubenswrapper[4903]: I1202 23:13:22.028746 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-ddkpk" event={"ID":"8f5feda5-281a-4c4f-be95-7b96ecc273f9","Type":"ContainerStarted","Data":"4c3f0cc073ff19978049aad5a67a03899e00441d6a44cda934498f9a7559d149"} Dec 02 23:13:22 crc kubenswrapper[4903]: I1202 23:13:22.029642 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-tsj9r" event={"ID":"58ddb811-8791-4420-ae35-b3521289b565","Type":"ContainerStarted","Data":"6a2d1b659e203c2896a894081f9479464ec6e15a9333efd4cfac899675707fad"} Dec 02 23:13:22 crc kubenswrapper[4903]: I1202 23:13:22.069053 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-78vhb"] Dec 02 23:13:22 crc kubenswrapper[4903]: I1202 23:13:22.075982 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-s97rj"] Dec 02 23:13:22 crc kubenswrapper[4903]: I1202 23:13:22.235286 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-xhm6n"] Dec 02 23:13:22 crc kubenswrapper[4903]: I1202 23:13:22.252798 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-6fhjd"] Dec 02 23:13:22 crc kubenswrapper[4903]: W1202 23:13:22.277073 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod723460ec_3116_468b_a628_1b03f5fd4239.slice/crio-cb1e2d95d34c7e0f649de1246ca3a20feb8f8d0aaa6cead28ddc23490a70c95c WatchSource:0}: Error finding container cb1e2d95d34c7e0f649de1246ca3a20feb8f8d0aaa6cead28ddc23490a70c95c: Status 404 returned error can't find the container with id cb1e2d95d34c7e0f649de1246ca3a20feb8f8d0aaa6cead28ddc23490a70c95c Dec 02 23:13:22 crc kubenswrapper[4903]: I1202 23:13:22.280090 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-gm6wm"] Dec 02 23:13:22 crc kubenswrapper[4903]: I1202 23:13:22.350507 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-metrics-certs\") pod \"openstack-operator-controller-manager-5b9bc7567f-6prdr\" (UID: \"61b2d273-f604-4fa0-baba-27dfbab9a350\") " pod="openstack-operators/openstack-operator-controller-manager-5b9bc7567f-6prdr" Dec 02 23:13:22 crc kubenswrapper[4903]: I1202 23:13:22.350628 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-webhook-certs\") pod \"openstack-operator-controller-manager-5b9bc7567f-6prdr\" (UID: \"61b2d273-f604-4fa0-baba-27dfbab9a350\") " pod="openstack-operators/openstack-operator-controller-manager-5b9bc7567f-6prdr" Dec 02 23:13:22 crc kubenswrapper[4903]: E1202 23:13:22.350829 4903 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 23:13:22 crc kubenswrapper[4903]: E1202 23:13:22.350870 4903 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 23:13:22 crc kubenswrapper[4903]: E1202 23:13:22.350901 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-webhook-certs podName:61b2d273-f604-4fa0-baba-27dfbab9a350 nodeName:}" failed. No retries permitted until 2025-12-02 23:13:23.350883568 +0000 UTC m=+942.059437851 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-webhook-certs") pod "openstack-operator-controller-manager-5b9bc7567f-6prdr" (UID: "61b2d273-f604-4fa0-baba-27dfbab9a350") : secret "webhook-server-cert" not found Dec 02 23:13:22 crc kubenswrapper[4903]: E1202 23:13:22.350989 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-metrics-certs podName:61b2d273-f604-4fa0-baba-27dfbab9a350 nodeName:}" failed. No retries permitted until 2025-12-02 23:13:23.35096288 +0000 UTC m=+942.059517163 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-metrics-certs") pod "openstack-operator-controller-manager-5b9bc7567f-6prdr" (UID: "61b2d273-f604-4fa0-baba-27dfbab9a350") : secret "metrics-server-cert" not found Dec 02 23:13:22 crc kubenswrapper[4903]: I1202 23:13:22.382668 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-j5jc6"] Dec 02 23:13:22 crc kubenswrapper[4903]: I1202 23:13:22.387377 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-hj5mh"] Dec 02 23:13:22 crc kubenswrapper[4903]: I1202 23:13:22.394852 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-2rx8r"] Dec 02 23:13:22 crc kubenswrapper[4903]: I1202 23:13:22.400444 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-phs84"] Dec 02 23:13:22 crc kubenswrapper[4903]: E1202 23:13:22.412388 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-clvb9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
nova-operator-controller-manager-697bc559fc-phs84_openstack-operators(d2216dc0-19da-4872-8e82-579f6bd60513): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 23:13:22 crc kubenswrapper[4903]: E1202 23:13:22.414556 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-clvb9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-phs84_openstack-operators(d2216dc0-19da-4872-8e82-579f6bd60513): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 23:13:22 crc kubenswrapper[4903]: E1202 23:13:22.416039 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-phs84" podUID="d2216dc0-19da-4872-8e82-579f6bd60513" Dec 02 23:13:22 crc kubenswrapper[4903]: I1202 23:13:22.498780 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-k7zsl"] Dec 02 23:13:22 crc kubenswrapper[4903]: I1202 23:13:22.511990 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-htwmh"] Dec 02 23:13:22 crc kubenswrapper[4903]: W1202 23:13:22.514311 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod926767ef_1626_42a1_bd04_6d3f06d89f08.slice/crio-e48e415254b706568406a50586b572cff42b8024c351c4384021f636452613df WatchSource:0}: Error finding container e48e415254b706568406a50586b572cff42b8024c351c4384021f636452613df: Status 404 returned error can't find the container with id e48e415254b706568406a50586b572cff42b8024c351c4384021f636452613df Dec 02 23:13:22 crc kubenswrapper[4903]: I1202 23:13:22.517120 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-wp9kf"] Dec 02 23:13:22 crc kubenswrapper[4903]: E1202 23:13:22.520133 4903 kuberuntime_manager.go:1274] "Unhandled Error" 
err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r8hzb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-htwmh_openstack-operators(926767ef-1626-42a1-bd04-6d3f06d89f08): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 23:13:22 crc kubenswrapper[4903]: I1202 23:13:22.526784 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-c95d55f7c-jb8p7"] Dec 02 23:13:22 crc kubenswrapper[4903]: E1202 23:13:22.531828 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r8hzb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-htwmh_openstack-operators(926767ef-1626-42a1-bd04-6d3f06d89f08): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 23:13:22 crc kubenswrapper[4903]: E1202 23:13:22.533065 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-htwmh" podUID="926767ef-1626-42a1-bd04-6d3f06d89f08" Dec 02 23:13:22 crc kubenswrapper[4903]: E1202 23:13:22.535734 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bcg77,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-wp9kf_openstack-operators(e6b63e17-4749-429b-8214-92fa7eecfd3c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 23:13:22 crc kubenswrapper[4903]: E1202 23:13:22.535739 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bhj8g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-k7zsl_openstack-operators(5430813d-ed61-496d-86b6-c9cc1d48aa1f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 23:13:22 crc 
kubenswrapper[4903]: I1202 23:13:22.537385 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-5pdxv"] Dec 02 23:13:22 crc kubenswrapper[4903]: E1202 23:13:22.539671 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bcg77,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-wp9kf_openstack-operators(e6b63e17-4749-429b-8214-92fa7eecfd3c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 23:13:22 crc kubenswrapper[4903]: E1202 23:13:22.539813 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bhj8g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
telemetry-operator-controller-manager-76cc84c6bb-k7zsl_openstack-operators(5430813d-ed61-496d-86b6-c9cc1d48aa1f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 23:13:22 crc kubenswrapper[4903]: E1202 23:13:22.540801 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wp9kf" podUID="e6b63e17-4749-429b-8214-92fa7eecfd3c" Dec 02 23:13:22 crc kubenswrapper[4903]: E1202 23:13:22.540890 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-k7zsl" podUID="5430813d-ed61-496d-86b6-c9cc1d48aa1f" Dec 02 23:13:22 crc kubenswrapper[4903]: W1202 23:13:22.546990 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod057a4ce0_614e_436a_aaf5_300d5ce6661c.slice/crio-f9482135f0c9a0daec20af129dc7efa60412ecb369fd6cd9ff967795ce8c1dd5 WatchSource:0}: Error finding container f9482135f0c9a0daec20af129dc7efa60412ecb369fd6cd9ff967795ce8c1dd5: Status 404 returned error can't find the container with id f9482135f0c9a0daec20af129dc7efa60412ecb369fd6cd9ff967795ce8c1dd5 Dec 02 23:13:22 crc kubenswrapper[4903]: E1202 23:13:22.553952 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lzkbp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-5pdxv_openstack-operators(057a4ce0-614e-436a-aaf5-300d5ce6661c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 23:13:22 crc kubenswrapper[4903]: E1202 23:13:22.557149 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lzkbp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-5pdxv_openstack-operators(057a4ce0-614e-436a-aaf5-300d5ce6661c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 23:13:22 crc kubenswrapper[4903]: E1202 23:13:22.558344 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-5pdxv" podUID="057a4ce0-614e-436a-aaf5-300d5ce6661c" Dec 02 23:13:22 crc kubenswrapper[4903]: I1202 23:13:22.666286 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wccsg"] Dec 02 23:13:22 crc kubenswrapper[4903]: W1202 23:13:22.674266 4903 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cc67e14_1cb4_497f_b0f8_010c2e6d5717.slice/crio-56c3c92b7bdcbc99ba8c347bc02aa15fbe95bd1afceacbbefdde68d944a0de37 WatchSource:0}: Error finding container 56c3c92b7bdcbc99ba8c347bc02aa15fbe95bd1afceacbbefdde68d944a0de37: Status 404 returned error can't find the container with id 56c3c92b7bdcbc99ba8c347bc02aa15fbe95bd1afceacbbefdde68d944a0de37 Dec 02 23:13:22 crc kubenswrapper[4903]: E1202 23:13:22.677393 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ljzsf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-wccsg_openstack-operators(9cc67e14-1cb4-497f-b0f8-010c2e6d5717): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 23:13:22 crc kubenswrapper[4903]: E1202 23:13:22.678586 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wccsg" podUID="9cc67e14-1cb4-497f-b0f8-010c2e6d5717" Dec 02 23:13:22 crc kubenswrapper[4903]: I1202 23:13:22.859493 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4de4a7c-49fd-48bc-8d5b-75727e7388de-cert\") pod \"infra-operator-controller-manager-57548d458d-dw6n2\" (UID: \"e4de4a7c-49fd-48bc-8d5b-75727e7388de\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-dw6n2" Dec 02 23:13:22 crc kubenswrapper[4903]: E1202 23:13:22.859678 4903 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found 
Dec 02 23:13:22 crc kubenswrapper[4903]: E1202 23:13:22.859748 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4de4a7c-49fd-48bc-8d5b-75727e7388de-cert podName:e4de4a7c-49fd-48bc-8d5b-75727e7388de nodeName:}" failed. No retries permitted until 2025-12-02 23:13:24.859730858 +0000 UTC m=+943.568285131 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e4de4a7c-49fd-48bc-8d5b-75727e7388de-cert") pod "infra-operator-controller-manager-57548d458d-dw6n2" (UID: "e4de4a7c-49fd-48bc-8d5b-75727e7388de") : secret "infra-operator-webhook-server-cert" not found
Dec 02 23:13:22 crc kubenswrapper[4903]: I1202 23:13:22.960723 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a01f2d2-8c90-4ccc-bf47-a4f973276988-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr\" (UID: \"5a01f2d2-8c90-4ccc-bf47-a4f973276988\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr"
Dec 02 23:13:22 crc kubenswrapper[4903]: E1202 23:13:22.960893 4903 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 02 23:13:22 crc kubenswrapper[4903]: E1202 23:13:22.960957 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a01f2d2-8c90-4ccc-bf47-a4f973276988-cert podName:5a01f2d2-8c90-4ccc-bf47-a4f973276988 nodeName:}" failed. No retries permitted until 2025-12-02 23:13:24.960938228 +0000 UTC m=+943.669492501 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5a01f2d2-8c90-4ccc-bf47-a4f973276988-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr" (UID: "5a01f2d2-8c90-4ccc-bf47-a4f973276988") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 02 23:13:23 crc kubenswrapper[4903]: I1202 23:13:23.039727 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-j5jc6" event={"ID":"246fe719-e899-408b-a962-702c5db22bfc","Type":"ContainerStarted","Data":"03f76d1ef6cd5db5255f57a93c71b7d2fdd1ead92072ce5825cf7750fa7c4c28"}
Dec 02 23:13:23 crc kubenswrapper[4903]: I1202 23:13:23.041771 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wp9kf" event={"ID":"e6b63e17-4749-429b-8214-92fa7eecfd3c","Type":"ContainerStarted","Data":"6ad2eafcc8d6bf6b9ecb8607cfd1ff73f30132994b5f98be13fa426696be8d36"}
Dec 02 23:13:23 crc kubenswrapper[4903]: I1202 23:13:23.043154 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-6fhjd" event={"ID":"7367c4a1-c098-4811-80ba-455509d27216","Type":"ContainerStarted","Data":"77f24e9f0c3da4d488d96d16f4aa3c14cff633794cfe9c97f1045e1ca0b555af"}
Dec 02 23:13:23 crc kubenswrapper[4903]: E1202 23:13:23.044319 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wp9kf" podUID="e6b63e17-4749-429b-8214-92fa7eecfd3c"
Dec 02 23:13:23 crc kubenswrapper[4903]: I1202 23:13:23.044435 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-gm6wm" event={"ID":"723460ec-3116-468b-a628-1b03f5fd4239","Type":"ContainerStarted","Data":"cb1e2d95d34c7e0f649de1246ca3a20feb8f8d0aaa6cead28ddc23490a70c95c"}
Dec 02 23:13:23 crc kubenswrapper[4903]: I1202 23:13:23.045644 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-c95d55f7c-jb8p7" event={"ID":"dbed5f2e-6049-4adc-a31c-bad1f30c7058","Type":"ContainerStarted","Data":"d69cb3a16fe6a7c57b363d412fe2d5c302cbdcca37e304de8ad1f33fed8d6a4c"}
Dec 02 23:13:23 crc kubenswrapper[4903]: I1202 23:13:23.047251 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-78vhb" event={"ID":"d0be2ea9-978d-4c79-a623-3b752547d546","Type":"ContainerStarted","Data":"03e5427692c43069b6f750c486c5a94c1c2160f04fc9b532f1300b44ddca3f1f"}
Dec 02 23:13:23 crc kubenswrapper[4903]: I1202 23:13:23.048722 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-k7zsl" event={"ID":"5430813d-ed61-496d-86b6-c9cc1d48aa1f","Type":"ContainerStarted","Data":"a7e350b77f5f39e20ba32bab6cd7a62bc0bdc3adca1aff0df2034c9315d61990"}
Dec 02 23:13:23 crc kubenswrapper[4903]: I1202 23:13:23.050254 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-phs84" event={"ID":"d2216dc0-19da-4872-8e82-579f6bd60513","Type":"ContainerStarted","Data":"e537054658aefaf5a24bd939ae9a28537c5237d267b79eb76ff90112b61a86fb"}
Dec 02 23:13:23 crc kubenswrapper[4903]: E1202 23:13:23.050643 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-k7zsl" podUID="5430813d-ed61-496d-86b6-c9cc1d48aa1f"
Dec 02 23:13:23 crc kubenswrapper[4903]: E1202 23:13:23.051812 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-phs84" podUID="d2216dc0-19da-4872-8e82-579f6bd60513"
Dec 02 23:13:23 crc kubenswrapper[4903]: I1202 23:13:23.069502 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 23:13:23 crc kubenswrapper[4903]: I1202 23:13:23.069547 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 23:13:23 crc kubenswrapper[4903]: I1202 23:13:23.071066 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-s97rj" event={"ID":"5c4ccdc6-6205-4108-9146-75a7a963732e","Type":"ContainerStarted","Data":"3f6f1c34da5f8af374c8263a562d5d74bfc929b30e0d16251d8c681fa934cc13"}
Dec 02 23:13:23 crc kubenswrapper[4903]: I1202 23:13:23.072352 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-hj5mh" event={"ID":"e3082dc8-ebbf-4a01-9120-5f1081af7801","Type":"ContainerStarted","Data":"b66a49754f9d54a41aa4067225e74e48922e92b91a2ea209ef083c6cf68aa41f"}
Dec 02 23:13:23 crc kubenswrapper[4903]: I1202 23:13:23.073518 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-5pdxv" event={"ID":"057a4ce0-614e-436a-aaf5-300d5ce6661c","Type":"ContainerStarted","Data":"f9482135f0c9a0daec20af129dc7efa60412ecb369fd6cd9ff967795ce8c1dd5"}
Dec 02 23:13:23 crc kubenswrapper[4903]: I1202 23:13:23.075259 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xhm6n" event={"ID":"7c596dd6-5f26-4bb7-a771-8c1d57129209","Type":"ContainerStarted","Data":"1e102d5ba6455534130b24713f5bbaa4df0ef7598615d4f78b585f5383194eb0"}
Dec 02 23:13:23 crc kubenswrapper[4903]: E1202 23:13:23.076158 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-5pdxv" podUID="057a4ce0-614e-436a-aaf5-300d5ce6661c"
Dec 02 23:13:23 crc kubenswrapper[4903]: I1202 23:13:23.076881 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-htwmh" event={"ID":"926767ef-1626-42a1-bd04-6d3f06d89f08","Type":"ContainerStarted","Data":"e48e415254b706568406a50586b572cff42b8024c351c4384021f636452613df"}
Dec 02 23:13:23 crc kubenswrapper[4903]: I1202 23:13:23.078167 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-2rx8r" event={"ID":"fc491fc5-9e88-4e1d-9848-ea8846acd82b","Type":"ContainerStarted","Data":"75c9474d205d5b8d2efd97b5b47433e3cc822656c186f07c418d790dd0c4a848"}
Dec 02 23:13:23 crc kubenswrapper[4903]: I1202 23:13:23.079310 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wccsg" event={"ID":"9cc67e14-1cb4-497f-b0f8-010c2e6d5717","Type":"ContainerStarted","Data":"56c3c92b7bdcbc99ba8c347bc02aa15fbe95bd1afceacbbefdde68d944a0de37"}
Dec 02 23:13:23 crc kubenswrapper[4903]: E1202 23:13:23.079409 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-htwmh" podUID="926767ef-1626-42a1-bd04-6d3f06d89f08"
Dec 02 23:13:23 crc kubenswrapper[4903]: E1202 23:13:23.080694 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wccsg" podUID="9cc67e14-1cb4-497f-b0f8-010c2e6d5717"
Dec 02 23:13:23 crc kubenswrapper[4903]: I1202 23:13:23.369614 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-webhook-certs\") pod \"openstack-operator-controller-manager-5b9bc7567f-6prdr\" (UID: \"61b2d273-f604-4fa0-baba-27dfbab9a350\") " pod="openstack-operators/openstack-operator-controller-manager-5b9bc7567f-6prdr"
Dec 02 23:13:23 crc kubenswrapper[4903]: I1202 23:13:23.369732 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-metrics-certs\") pod \"openstack-operator-controller-manager-5b9bc7567f-6prdr\" (UID: \"61b2d273-f604-4fa0-baba-27dfbab9a350\") " pod="openstack-operators/openstack-operator-controller-manager-5b9bc7567f-6prdr"
Dec 02 23:13:23 crc kubenswrapper[4903]: E1202 23:13:23.369857 4903 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Dec 02 23:13:23 crc kubenswrapper[4903]: E1202 23:13:23.369904 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-metrics-certs podName:61b2d273-f604-4fa0-baba-27dfbab9a350 nodeName:}" failed. No retries permitted until 2025-12-02 23:13:25.36989057 +0000 UTC m=+944.078444853 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-metrics-certs") pod "openstack-operator-controller-manager-5b9bc7567f-6prdr" (UID: "61b2d273-f604-4fa0-baba-27dfbab9a350") : secret "metrics-server-cert" not found
Dec 02 23:13:23 crc kubenswrapper[4903]: E1202 23:13:23.370192 4903 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Dec 02 23:13:23 crc kubenswrapper[4903]: E1202 23:13:23.370263 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-webhook-certs podName:61b2d273-f604-4fa0-baba-27dfbab9a350 nodeName:}" failed. No retries permitted until 2025-12-02 23:13:25.370244728 +0000 UTC m=+944.078799011 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-webhook-certs") pod "openstack-operator-controller-manager-5b9bc7567f-6prdr" (UID: "61b2d273-f604-4fa0-baba-27dfbab9a350") : secret "webhook-server-cert" not found
Dec 02 23:13:24 crc kubenswrapper[4903]: E1202 23:13:24.088411 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wccsg" podUID="9cc67e14-1cb4-497f-b0f8-010c2e6d5717"
Dec 02 23:13:24 crc kubenswrapper[4903]: E1202 23:13:24.088575 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-k7zsl" podUID="5430813d-ed61-496d-86b6-c9cc1d48aa1f"
Dec 02 23:13:24 crc kubenswrapper[4903]: E1202 23:13:24.089144 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-5pdxv" podUID="057a4ce0-614e-436a-aaf5-300d5ce6661c"
Dec 02 23:13:24 crc kubenswrapper[4903]: E1202 23:13:24.089191 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-htwmh" podUID="926767ef-1626-42a1-bd04-6d3f06d89f08"
Dec 02 23:13:24 crc kubenswrapper[4903]: E1202 23:13:24.090536 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wp9kf" podUID="e6b63e17-4749-429b-8214-92fa7eecfd3c"
Dec 02 23:13:24 crc kubenswrapper[4903]: E1202 23:13:24.107442 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-phs84" podUID="d2216dc0-19da-4872-8e82-579f6bd60513"
Dec 02 23:13:24 crc kubenswrapper[4903]: I1202 23:13:24.894865 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4de4a7c-49fd-48bc-8d5b-75727e7388de-cert\") pod \"infra-operator-controller-manager-57548d458d-dw6n2\" (UID: \"e4de4a7c-49fd-48bc-8d5b-75727e7388de\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-dw6n2"
Dec 02 23:13:24 crc kubenswrapper[4903]: E1202 23:13:24.895038 4903 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Dec 02 23:13:24 crc kubenswrapper[4903]: E1202 23:13:24.895122 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4de4a7c-49fd-48bc-8d5b-75727e7388de-cert podName:e4de4a7c-49fd-48bc-8d5b-75727e7388de nodeName:}" failed. No retries permitted until 2025-12-02 23:13:28.895102795 +0000 UTC m=+947.603657078 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e4de4a7c-49fd-48bc-8d5b-75727e7388de-cert") pod "infra-operator-controller-manager-57548d458d-dw6n2" (UID: "e4de4a7c-49fd-48bc-8d5b-75727e7388de") : secret "infra-operator-webhook-server-cert" not found
Dec 02 23:13:24 crc kubenswrapper[4903]: I1202 23:13:24.996241 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a01f2d2-8c90-4ccc-bf47-a4f973276988-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr\" (UID: \"5a01f2d2-8c90-4ccc-bf47-a4f973276988\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr"
Dec 02 23:13:24 crc kubenswrapper[4903]: E1202 23:13:24.996515 4903 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 02 23:13:24 crc kubenswrapper[4903]: E1202 23:13:24.996638 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a01f2d2-8c90-4ccc-bf47-a4f973276988-cert podName:5a01f2d2-8c90-4ccc-bf47-a4f973276988 nodeName:}" failed. No retries permitted until 2025-12-02 23:13:28.996610163 +0000 UTC m=+947.705164486 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5a01f2d2-8c90-4ccc-bf47-a4f973276988-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr" (UID: "5a01f2d2-8c90-4ccc-bf47-a4f973276988") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 02 23:13:25 crc kubenswrapper[4903]: I1202 23:13:25.401354 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-webhook-certs\") pod \"openstack-operator-controller-manager-5b9bc7567f-6prdr\" (UID: \"61b2d273-f604-4fa0-baba-27dfbab9a350\") " pod="openstack-operators/openstack-operator-controller-manager-5b9bc7567f-6prdr"
Dec 02 23:13:25 crc kubenswrapper[4903]: I1202 23:13:25.401529 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-metrics-certs\") pod \"openstack-operator-controller-manager-5b9bc7567f-6prdr\" (UID: \"61b2d273-f604-4fa0-baba-27dfbab9a350\") " pod="openstack-operators/openstack-operator-controller-manager-5b9bc7567f-6prdr"
Dec 02 23:13:25 crc kubenswrapper[4903]: E1202 23:13:25.401554 4903 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Dec 02 23:13:25 crc kubenswrapper[4903]: E1202 23:13:25.401669 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-webhook-certs podName:61b2d273-f604-4fa0-baba-27dfbab9a350 nodeName:}" failed. No retries permitted until 2025-12-02 23:13:29.401627818 +0000 UTC m=+948.110182141 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-webhook-certs") pod "openstack-operator-controller-manager-5b9bc7567f-6prdr" (UID: "61b2d273-f604-4fa0-baba-27dfbab9a350") : secret "webhook-server-cert" not found
Dec 02 23:13:25 crc kubenswrapper[4903]: E1202 23:13:25.401752 4903 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Dec 02 23:13:25 crc kubenswrapper[4903]: E1202 23:13:25.401835 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-metrics-certs podName:61b2d273-f604-4fa0-baba-27dfbab9a350 nodeName:}" failed. No retries permitted until 2025-12-02 23:13:29.401811823 +0000 UTC m=+948.110366146 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-metrics-certs") pod "openstack-operator-controller-manager-5b9bc7567f-6prdr" (UID: "61b2d273-f604-4fa0-baba-27dfbab9a350") : secret "metrics-server-cert" not found
Dec 02 23:13:28 crc kubenswrapper[4903]: I1202 23:13:28.955377 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4de4a7c-49fd-48bc-8d5b-75727e7388de-cert\") pod \"infra-operator-controller-manager-57548d458d-dw6n2\" (UID: \"e4de4a7c-49fd-48bc-8d5b-75727e7388de\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-dw6n2"
Dec 02 23:13:28 crc kubenswrapper[4903]: E1202 23:13:28.955521 4903 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Dec 02 23:13:28 crc kubenswrapper[4903]: E1202 23:13:28.955581 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4de4a7c-49fd-48bc-8d5b-75727e7388de-cert podName:e4de4a7c-49fd-48bc-8d5b-75727e7388de nodeName:}" failed. No retries permitted until 2025-12-02 23:13:36.955565288 +0000 UTC m=+955.664119571 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e4de4a7c-49fd-48bc-8d5b-75727e7388de-cert") pod "infra-operator-controller-manager-57548d458d-dw6n2" (UID: "e4de4a7c-49fd-48bc-8d5b-75727e7388de") : secret "infra-operator-webhook-server-cert" not found
Dec 02 23:13:29 crc kubenswrapper[4903]: I1202 23:13:29.057002 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a01f2d2-8c90-4ccc-bf47-a4f973276988-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr\" (UID: \"5a01f2d2-8c90-4ccc-bf47-a4f973276988\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr"
Dec 02 23:13:29 crc kubenswrapper[4903]: E1202 23:13:29.057339 4903 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 02 23:13:29 crc kubenswrapper[4903]: E1202 23:13:29.057419 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a01f2d2-8c90-4ccc-bf47-a4f973276988-cert podName:5a01f2d2-8c90-4ccc-bf47-a4f973276988 nodeName:}" failed. No retries permitted until 2025-12-02 23:13:37.057396154 +0000 UTC m=+955.765950457 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5a01f2d2-8c90-4ccc-bf47-a4f973276988-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr" (UID: "5a01f2d2-8c90-4ccc-bf47-a4f973276988") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 23:13:29 crc kubenswrapper[4903]: I1202 23:13:29.462756 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-metrics-certs\") pod \"openstack-operator-controller-manager-5b9bc7567f-6prdr\" (UID: \"61b2d273-f604-4fa0-baba-27dfbab9a350\") " pod="openstack-operators/openstack-operator-controller-manager-5b9bc7567f-6prdr" Dec 02 23:13:29 crc kubenswrapper[4903]: I1202 23:13:29.462865 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-webhook-certs\") pod \"openstack-operator-controller-manager-5b9bc7567f-6prdr\" (UID: \"61b2d273-f604-4fa0-baba-27dfbab9a350\") " pod="openstack-operators/openstack-operator-controller-manager-5b9bc7567f-6prdr" Dec 02 23:13:29 crc kubenswrapper[4903]: E1202 23:13:29.462984 4903 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 23:13:29 crc kubenswrapper[4903]: E1202 23:13:29.463057 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-webhook-certs podName:61b2d273-f604-4fa0-baba-27dfbab9a350 nodeName:}" failed. No retries permitted until 2025-12-02 23:13:37.463039485 +0000 UTC m=+956.171593768 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-webhook-certs") pod "openstack-operator-controller-manager-5b9bc7567f-6prdr" (UID: "61b2d273-f604-4fa0-baba-27dfbab9a350") : secret "webhook-server-cert" not found Dec 02 23:13:29 crc kubenswrapper[4903]: E1202 23:13:29.462986 4903 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 23:13:29 crc kubenswrapper[4903]: E1202 23:13:29.463182 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-metrics-certs podName:61b2d273-f604-4fa0-baba-27dfbab9a350 nodeName:}" failed. No retries permitted until 2025-12-02 23:13:37.463163538 +0000 UTC m=+956.171717891 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-metrics-certs") pod "openstack-operator-controller-manager-5b9bc7567f-6prdr" (UID: "61b2d273-f604-4fa0-baba-27dfbab9a350") : secret "metrics-server-cert" not found Dec 02 23:13:36 crc kubenswrapper[4903]: I1202 23:13:36.973053 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4de4a7c-49fd-48bc-8d5b-75727e7388de-cert\") pod \"infra-operator-controller-manager-57548d458d-dw6n2\" (UID: \"e4de4a7c-49fd-48bc-8d5b-75727e7388de\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-dw6n2" Dec 02 23:13:36 crc kubenswrapper[4903]: I1202 23:13:36.982308 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4de4a7c-49fd-48bc-8d5b-75727e7388de-cert\") pod \"infra-operator-controller-manager-57548d458d-dw6n2\" (UID: \"e4de4a7c-49fd-48bc-8d5b-75727e7388de\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-dw6n2" Dec 02 23:13:37 crc kubenswrapper[4903]: I1202 23:13:37.075145 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a01f2d2-8c90-4ccc-bf47-a4f973276988-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr\" (UID: \"5a01f2d2-8c90-4ccc-bf47-a4f973276988\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr" Dec 02 23:13:37 crc kubenswrapper[4903]: E1202 23:13:37.075277 4903 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 23:13:37 crc kubenswrapper[4903]: E1202 23:13:37.075343 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a01f2d2-8c90-4ccc-bf47-a4f973276988-cert podName:5a01f2d2-8c90-4ccc-bf47-a4f973276988 nodeName:}" failed. No retries permitted until 2025-12-02 23:13:53.075327677 +0000 UTC m=+971.783881960 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5a01f2d2-8c90-4ccc-bf47-a4f973276988-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr" (UID: "5a01f2d2-8c90-4ccc-bf47-a4f973276988") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 23:13:37 crc kubenswrapper[4903]: I1202 23:13:37.225041 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-dw6n2" Dec 02 23:13:37 crc kubenswrapper[4903]: I1202 23:13:37.481106 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-metrics-certs\") pod \"openstack-operator-controller-manager-5b9bc7567f-6prdr\" (UID: \"61b2d273-f604-4fa0-baba-27dfbab9a350\") " pod="openstack-operators/openstack-operator-controller-manager-5b9bc7567f-6prdr" Dec 02 23:13:37 crc kubenswrapper[4903]: I1202 23:13:37.481262 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-webhook-certs\") pod \"openstack-operator-controller-manager-5b9bc7567f-6prdr\" (UID: \"61b2d273-f604-4fa0-baba-27dfbab9a350\") " pod="openstack-operators/openstack-operator-controller-manager-5b9bc7567f-6prdr" Dec 02 23:13:37 crc kubenswrapper[4903]: E1202 23:13:37.481370 4903 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 23:13:37 crc kubenswrapper[4903]: E1202 23:13:37.481451 4903 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 23:13:37 crc kubenswrapper[4903]: E1202 23:13:37.481524 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-metrics-certs podName:61b2d273-f604-4fa0-baba-27dfbab9a350 nodeName:}" failed. No retries permitted until 2025-12-02 23:13:53.481468879 +0000 UTC m=+972.190023162 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-metrics-certs") pod "openstack-operator-controller-manager-5b9bc7567f-6prdr" (UID: "61b2d273-f604-4fa0-baba-27dfbab9a350") : secret "metrics-server-cert" not found Dec 02 23:13:37 crc kubenswrapper[4903]: E1202 23:13:37.481549 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-webhook-certs podName:61b2d273-f604-4fa0-baba-27dfbab9a350 nodeName:}" failed. No retries permitted until 2025-12-02 23:13:53.481540131 +0000 UTC m=+972.190094414 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-webhook-certs") pod "openstack-operator-controller-manager-5b9bc7567f-6prdr" (UID: "61b2d273-f604-4fa0-baba-27dfbab9a350") : secret "webhook-server-cert" not found Dec 02 23:13:38 crc kubenswrapper[4903]: E1202 23:13:38.816231 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f" Dec 02 23:13:38 crc kubenswrapper[4903]: E1202 23:13:38.816849 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ch7cj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-j5jc6_openstack-operators(246fe719-e899-408b-a962-702c5db22bfc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 23:13:39 crc kubenswrapper[4903]: E1202 23:13:39.527219 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:2e59cfbeefc3aff0bb0a6ae9ce2235129f5173c98dd5ee8dac229ad4895faea9" Dec 02 23:13:39 crc 
kubenswrapper[4903]: E1202 23:13:39.527457 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:2e59cfbeefc3aff0bb0a6ae9ce2235129f5173c98dd5ee8dac229ad4895faea9,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bkhhq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-7c79b5df47-xhm6n_openstack-operators(7c596dd6-5f26-4bb7-a771-8c1d57129209): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 23:13:40 crc kubenswrapper[4903]: E1202 23:13:40.394338 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7" Dec 02 23:13:40 crc kubenswrapper[4903]: E1202 23:13:40.394502 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vsd5n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-s97rj_openstack-operators(5c4ccdc6-6205-4108-9146-75a7a963732e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 23:13:40 crc kubenswrapper[4903]: E1202 23:13:40.490012 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.2:5001/openstack-k8s-operators/watcher-operator:0e562967e0a192baf562500f49cef0abd8c6f6ec" Dec 02 23:13:40 crc kubenswrapper[4903]: E1202 23:13:40.490328 4903 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.2:5001/openstack-k8s-operators/watcher-operator:0e562967e0a192baf562500f49cef0abd8c6f6ec" Dec 02 23:13:40 crc kubenswrapper[4903]: E1202 23:13:40.490502 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.2:5001/openstack-k8s-operators/watcher-operator:0e562967e0a192baf562500f49cef0abd8c6f6ec,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-962qp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-c95d55f7c-jb8p7_openstack-operators(dbed5f2e-6049-4adc-a31c-bad1f30c7058): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 23:13:45 crc kubenswrapper[4903]: I1202 23:13:45.814102 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-dw6n2"] Dec 02 23:13:46 crc kubenswrapper[4903]: W1202 23:13:46.111521 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4de4a7c_49fd_48bc_8d5b_75727e7388de.slice/crio-c56b109ceb1c8586628d7f3e8749c709435a261575ba68ed14ef8dfe2a447b51 WatchSource:0}: Error finding container c56b109ceb1c8586628d7f3e8749c709435a261575ba68ed14ef8dfe2a447b51: Status 404 returned error can't find the container with id c56b109ceb1c8586628d7f3e8749c709435a261575ba68ed14ef8dfe2a447b51 Dec 02 23:13:46 crc kubenswrapper[4903]: I1202 23:13:46.256410 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-dw6n2" event={"ID":"e4de4a7c-49fd-48bc-8d5b-75727e7388de","Type":"ContainerStarted","Data":"c56b109ceb1c8586628d7f3e8749c709435a261575ba68ed14ef8dfe2a447b51"} Dec 02 23:13:47 crc kubenswrapper[4903]: I1202 23:13:47.293188 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-gm6wm" 
event={"ID":"723460ec-3116-468b-a628-1b03f5fd4239","Type":"ContainerStarted","Data":"bdabd70ec7ccb69d57e5f35ecf005e8bded3e76c371cfe5e672baf10c845b3e6"} Dec 02 23:13:47 crc kubenswrapper[4903]: I1202 23:13:47.304670 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-hj5mh" event={"ID":"e3082dc8-ebbf-4a01-9120-5f1081af7801","Type":"ContainerStarted","Data":"60923efc3af13bb9794b47824c5cac0be9b659282d3b7e7d6c099e8c7578275c"} Dec 02 23:13:47 crc kubenswrapper[4903]: I1202 23:13:47.319797 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-2rx8r" event={"ID":"fc491fc5-9e88-4e1d-9848-ea8846acd82b","Type":"ContainerStarted","Data":"047e5cee746edeca21cbf8606747d313bd39cc5a505ffd481df79a2d1ad0d082"} Dec 02 23:13:47 crc kubenswrapper[4903]: I1202 23:13:47.341754 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-6fhjd" event={"ID":"7367c4a1-c098-4811-80ba-455509d27216","Type":"ContainerStarted","Data":"e657735cc79d5c44d36c0319192b9adf627cdfb2067cf4b3a6dd952989e1b28f"} Dec 02 23:13:47 crc kubenswrapper[4903]: I1202 23:13:47.352888 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-tsj9r" event={"ID":"58ddb811-8791-4420-ae35-b3521289b565","Type":"ContainerStarted","Data":"af72ec9ae3c4ccb3a17585657ae3834600b5d5b82f48af728bd1dbb02ba84feb"} Dec 02 23:13:47 crc kubenswrapper[4903]: I1202 23:13:47.355006 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lwgx2" event={"ID":"d3c55b89-b070-410d-8436-a101b0f313cf","Type":"ContainerStarted","Data":"4f29c3a4d24796dcdecfb6b61f1cc8f88590f5ffe0f21b1b70d2302b3076bb5c"} Dec 02 23:13:47 crc kubenswrapper[4903]: I1202 23:13:47.360911 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-6dsjr" event={"ID":"5046b326-aad3-4aa9-ad84-96b3943a6147","Type":"ContainerStarted","Data":"bc36da505c83780aea887bb6eb1e875eacd274475d241dd6c4859b6a04e5c5ae"} Dec 02 23:13:47 crc kubenswrapper[4903]: I1202 23:13:47.374430 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-78vhb" event={"ID":"d0be2ea9-978d-4c79-a623-3b752547d546","Type":"ContainerStarted","Data":"45c595536a1a678a9d86018cf30795948a325d84d80d4f843d2b01a2bfadf6d2"} Dec 02 23:13:47 crc kubenswrapper[4903]: I1202 23:13:47.384635 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-j2zhw" event={"ID":"35bd5361-6683-4c7d-b26c-3cac8e7a5bf4","Type":"ContainerStarted","Data":"67cfbe5a130e16621a366bed4bc5f01eddb92bd6355cc945c85293a71ae71ef8"} Dec 02 23:13:47 crc kubenswrapper[4903]: I1202 23:13:47.415706 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-ddkpk" event={"ID":"8f5feda5-281a-4c4f-be95-7b96ecc273f9","Type":"ContainerStarted","Data":"e0650ce2441301b33100bf8bc26d1ca13aa80cc396c594fa3f113de3c71d1f32"} Dec 02 23:13:48 crc kubenswrapper[4903]: I1202 23:13:48.427601 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-phs84" 
event={"ID":"d2216dc0-19da-4872-8e82-579f6bd60513","Type":"ContainerStarted","Data":"d8c09f6c2494f1052af4a6ab59379f88a83e3525e5149c83c99d297174596f9b"} Dec 02 23:13:49 crc kubenswrapper[4903]: I1202 23:13:49.436199 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-k7zsl" event={"ID":"5430813d-ed61-496d-86b6-c9cc1d48aa1f","Type":"ContainerStarted","Data":"bc27923686c472b0a7ac2fe141cface5594567ccd07f166c7e20c7ae4c588fec"} Dec 02 23:13:49 crc kubenswrapper[4903]: I1202 23:13:49.437526 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-htwmh" event={"ID":"926767ef-1626-42a1-bd04-6d3f06d89f08","Type":"ContainerStarted","Data":"9aa97486838ee1fd6ede7d81dee7159f57617293a28affaf0af9333d8ffd4572"} Dec 02 23:13:50 crc kubenswrapper[4903]: E1202 23:13:50.232194 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-s97rj" podUID="5c4ccdc6-6205-4108-9146-75a7a963732e" Dec 02 23:13:50 crc kubenswrapper[4903]: E1202 23:13:50.328010 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-c95d55f7c-jb8p7" podUID="dbed5f2e-6049-4adc-a31c-bad1f30c7058" Dec 02 23:13:50 crc kubenswrapper[4903]: I1202 23:13:50.448035 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-5pdxv" event={"ID":"057a4ce0-614e-436a-aaf5-300d5ce6661c","Type":"ContainerStarted","Data":"60ed53c8f0a1048b56eab382bb9f0cbe2898dfed37ec8e8d2bab579410ded74a"} Dec 02 23:13:50 crc kubenswrapper[4903]: I1202 23:13:50.448088 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-5pdxv" event={"ID":"057a4ce0-614e-436a-aaf5-300d5ce6661c","Type":"ContainerStarted","Data":"dda79dd57ed224040a933f342f1c9be2cc5516dc6327cb8b2279a366ee960531"} Dec 02 23:13:50 crc kubenswrapper[4903]: I1202 23:13:50.448261 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-5pdxv" Dec 02 23:13:50 crc kubenswrapper[4903]: I1202 23:13:50.449945 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-dw6n2" event={"ID":"e4de4a7c-49fd-48bc-8d5b-75727e7388de","Type":"ContainerStarted","Data":"b0f8e8d5ed5323157e76317abccf41f89cd4b8b47b0c6a5bd8e9f703aa3c882c"} Dec 02 23:13:50 crc kubenswrapper[4903]: I1202 23:13:50.455069 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wccsg" event={"ID":"9cc67e14-1cb4-497f-b0f8-010c2e6d5717","Type":"ContainerStarted","Data":"8290621420ce59561fa1225fb3295362dae3d0b2265d6352eb6e1d92daf0b7fc"} Dec 02 23:13:50 crc kubenswrapper[4903]: I1202 23:13:50.456811 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-s97rj" 
event={"ID":"5c4ccdc6-6205-4108-9146-75a7a963732e","Type":"ContainerStarted","Data":"b6d43b91f9160ea9a6b84097b8b9fa4d5949a01ce35462317b8d192895bf991b"} Dec 02 23:13:50 crc kubenswrapper[4903]: I1202 23:13:50.458698 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-c95d55f7c-jb8p7" event={"ID":"dbed5f2e-6049-4adc-a31c-bad1f30c7058","Type":"ContainerStarted","Data":"06a8999790e27e235ccbe61acb7c734468517b8cf68a479d74ba2143d389a17a"} Dec 02 23:13:50 crc kubenswrapper[4903]: I1202 23:13:50.468804 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wp9kf" event={"ID":"e6b63e17-4749-429b-8214-92fa7eecfd3c","Type":"ContainerStarted","Data":"76ec33eca538efd8474cec6fe0b46cd9edafaf6b1a78903720e39decec69baee"} Dec 02 23:13:50 crc kubenswrapper[4903]: I1202 23:13:50.468846 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wp9kf" event={"ID":"e6b63e17-4749-429b-8214-92fa7eecfd3c","Type":"ContainerStarted","Data":"8b64ac6e55b8d4ec2a9e12d548836e9839182051455f5e3c744da61f3c4153e3"} Dec 02 23:13:50 crc kubenswrapper[4903]: I1202 23:13:50.469434 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wp9kf" Dec 02 23:13:50 crc kubenswrapper[4903]: E1202 23:13:50.470244 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.2:5001/openstack-k8s-operators/watcher-operator:0e562967e0a192baf562500f49cef0abd8c6f6ec\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-c95d55f7c-jb8p7" podUID="dbed5f2e-6049-4adc-a31c-bad1f30c7058" Dec 02 23:13:50 crc kubenswrapper[4903]: I1202 23:13:50.513513 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-5pdxv" podStartSLOduration=5.9475016929999995 podStartE2EDuration="29.513496348s" podCreationTimestamp="2025-12-02 23:13:21 +0000 UTC" firstStartedPulling="2025-12-02 23:13:22.553827601 +0000 UTC m=+941.262381884" lastFinishedPulling="2025-12-02 23:13:46.119822236 +0000 UTC m=+964.828376539" observedRunningTime="2025-12-02 23:13:50.4766083 +0000 UTC m=+969.185162583" watchObservedRunningTime="2025-12-02 23:13:50.513496348 +0000 UTC m=+969.222050631" Dec 02 23:13:50 crc kubenswrapper[4903]: E1202 23:13:50.518932 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xhm6n" podUID="7c596dd6-5f26-4bb7-a771-8c1d57129209" Dec 02 23:13:50 crc kubenswrapper[4903]: I1202 23:13:50.535716 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wp9kf" podStartSLOduration=5.918949289 podStartE2EDuration="29.535702074s" podCreationTimestamp="2025-12-02 23:13:21 +0000 UTC" firstStartedPulling="2025-12-02 23:13:22.535620573 +0000 UTC m=+941.244174856" lastFinishedPulling="2025-12-02 23:13:46.152373338 +0000 UTC m=+964.860927641" observedRunningTime="2025-12-02 23:13:50.532601078 +0000 UTC m=+969.241155361" watchObservedRunningTime="2025-12-02 23:13:50.535702074 +0000 UTC m=+969.244256357" Dec 
02 23:13:50 crc kubenswrapper[4903]: I1202 23:13:50.579049 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wccsg" podStartSLOduration=5.6377642009999995 podStartE2EDuration="29.57903037s" podCreationTimestamp="2025-12-02 23:13:21 +0000 UTC" firstStartedPulling="2025-12-02 23:13:22.677237467 +0000 UTC m=+941.385791750" lastFinishedPulling="2025-12-02 23:13:46.618503646 +0000 UTC m=+965.327057919" observedRunningTime="2025-12-02 23:13:50.577179845 +0000 UTC m=+969.285734128" watchObservedRunningTime="2025-12-02 23:13:50.57903037 +0000 UTC m=+969.287584653" Dec 02 23:13:50 crc kubenswrapper[4903]: E1202 23:13:50.742935 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-78f8948974-j5jc6" podUID="246fe719-e899-408b-a962-702c5db22bfc" Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.476662 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-2rx8r" event={"ID":"fc491fc5-9e88-4e1d-9848-ea8846acd82b","Type":"ContainerStarted","Data":"c1d573f6a671f6b624262b6a7c3bb45716e7485c598b5a0f11f639ad684c5246"} Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.477162 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-2rx8r" Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.479165 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-ddkpk" event={"ID":"8f5feda5-281a-4c4f-be95-7b96ecc273f9","Type":"ContainerStarted","Data":"d378991365b1c31726b81aa097f3f738bb674f98037c71e116237405466e0158"} Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.479309 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-ddkpk" Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.480801 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-k7zsl" event={"ID":"5430813d-ed61-496d-86b6-c9cc1d48aa1f","Type":"ContainerStarted","Data":"024048f5a84f2ece6536f3ed14cd7e882561ac4ef026804ff61b0c99399c7036"} Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.480904 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-k7zsl" Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.482498 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lwgx2" event={"ID":"d3c55b89-b070-410d-8436-a101b0f313cf","Type":"ContainerStarted","Data":"f00d4636bf53fe30bb14a388de108f2b8bd8088d023a71374b1f8d2e61293453"} Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.482575 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lwgx2" Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.484736 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-gm6wm" 
event={"ID":"723460ec-3116-468b-a628-1b03f5fd4239","Type":"ContainerStarted","Data":"7fa02fae645e091d097ca2f785211e85a99897a086f79078388cafab89918dd6"} Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.484880 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-gm6wm" Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.486962 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-78vhb" event={"ID":"d0be2ea9-978d-4c79-a623-3b752547d546","Type":"ContainerStarted","Data":"379297e5342dde061a3923131c388c934c5fc8de791d56790f878d16ba357e1b"} Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.487194 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-78vhb" Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.488662 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-j2zhw" event={"ID":"35bd5361-6683-4c7d-b26c-3cac8e7a5bf4","Type":"ContainerStarted","Data":"391536b3f1a49b32f528caa7c5c43692c501e01acdf7380a4cdd00981e46214e"} Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.488818 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-j2zhw" Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.489933 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-78vhb" Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.490105 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-gm6wm" Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.490559 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-htwmh" event={"ID":"926767ef-1626-42a1-bd04-6d3f06d89f08","Type":"ContainerStarted","Data":"9a223b66664a426e0955da6bdee12329416852ea650539d47877c67b204e56f4"} Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.490698 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-htwmh" Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.492736 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-dw6n2" event={"ID":"e4de4a7c-49fd-48bc-8d5b-75727e7388de","Type":"ContainerStarted","Data":"09dc3ddbfd21b41a23b0cccb3d26295212f32349560c96cd80e620ac8876e50f"} Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.493245 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-dw6n2" Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.495659 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xhm6n" event={"ID":"7c596dd6-5f26-4bb7-a771-8c1d57129209","Type":"ContainerStarted","Data":"e18a9e2bfb75203287550e3259de123ac34a6039390c99b338fc2ce5d1e2e41c"} Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.501338 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-6fhjd" event={"ID":"7367c4a1-c098-4811-80ba-455509d27216","Type":"ContainerStarted","Data":"b1ba64468fcf0b27f6fc5886a838b64267697e3bf12a24f00fdec7a928318b81"} Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.501459 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-6fhjd" Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.501470 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-2rx8r" podStartSLOduration=2.537594516 podStartE2EDuration="30.501457506s" podCreationTimestamp="2025-12-02 23:13:21 +0000 UTC" firstStartedPulling="2025-12-02 23:13:22.383356537 +0000 UTC m=+941.091910810" lastFinishedPulling="2025-12-02 23:13:50.347219517 +0000 UTC m=+969.055773800" observedRunningTime="2025-12-02 23:13:51.499156569 +0000 UTC m=+970.207710852" watchObservedRunningTime="2025-12-02 23:13:51.501457506 +0000 UTC m=+970.210011799" Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.502807 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-s97rj" event={"ID":"5c4ccdc6-6205-4108-9146-75a7a963732e","Type":"ContainerStarted","Data":"0058070e99a795ad753a24755783c59dfffeec5260ace2c3fed3a6335532b8ac"} Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.502919 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-s97rj" Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.505857 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-tsj9r" event={"ID":"58ddb811-8791-4420-ae35-b3521289b565","Type":"ContainerStarted","Data":"3888381d952719f9c4205a5df2b330ce453ccb390dbed98bbee191fa53588cee"} Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.506024 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-tsj9r" Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.507886 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-6dsjr" event={"ID":"5046b326-aad3-4aa9-ad84-96b3943a6147","Type":"ContainerStarted","Data":"d658b4427ab911d08dacd3cf29c7ce55103217245c8d42138a1525ed5d560f25"} Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.507983 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-6dsjr" Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.509445 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-tsj9r" Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.510060 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-hj5mh" event={"ID":"e3082dc8-ebbf-4a01-9120-5f1081af7801","Type":"ContainerStarted","Data":"4ca1c2d651a709c6befaf521356bbda403bb6e5283516fefa86ba7edcbbd4cdd"} Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.510364 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-hj5mh" Dec 02 
23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.514931 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-hj5mh" Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.515454 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-phs84" event={"ID":"d2216dc0-19da-4872-8e82-579f6bd60513","Type":"ContainerStarted","Data":"19b79681d31331ef149e98ede9de89b1fcdafacf65f7dbd7f117de208a66b20b"} Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.515585 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-phs84" Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.518327 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-j5jc6" event={"ID":"246fe719-e899-408b-a962-702c5db22bfc","Type":"ContainerStarted","Data":"81f4a3fa985a7dd0673bcb3b783ab17a160015e5ea1424b448d3a23e059bfeca"} Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.532879 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-ddkpk" podStartSLOduration=3.359970675 podStartE2EDuration="31.532864298s" podCreationTimestamp="2025-12-02 23:13:20 +0000 UTC" firstStartedPulling="2025-12-02 23:13:21.782846452 +0000 UTC m=+940.491400735" lastFinishedPulling="2025-12-02 23:13:49.955740065 +0000 UTC m=+968.664294358" observedRunningTime="2025-12-02 23:13:51.526790419 +0000 UTC m=+970.235344702" watchObservedRunningTime="2025-12-02 23:13:51.532864298 +0000 UTC m=+970.241418581" Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.567876 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-78vhb" podStartSLOduration=3.7050085839999998 podStartE2EDuration="31.567857929s" podCreationTimestamp="2025-12-02 23:13:20 +0000 UTC" firstStartedPulling="2025-12-02 23:13:22.10504924 +0000 UTC m=+940.813603523" lastFinishedPulling="2025-12-02 23:13:49.967898575 +0000 UTC m=+968.676452868" observedRunningTime="2025-12-02 23:13:51.560202021 +0000 UTC m=+970.268756304" watchObservedRunningTime="2025-12-02 23:13:51.567857929 +0000 UTC m=+970.276412212" Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.616260 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-gm6wm" podStartSLOduration=3.940060188 podStartE2EDuration="31.61624046s" podCreationTimestamp="2025-12-02 23:13:20 +0000 UTC" firstStartedPulling="2025-12-02 23:13:22.279958613 +0000 UTC m=+940.988512916" lastFinishedPulling="2025-12-02 23:13:49.956138905 +0000 UTC m=+968.664693188" observedRunningTime="2025-12-02 23:13:51.601369134 +0000 UTC m=+970.309923417" watchObservedRunningTime="2025-12-02 23:13:51.61624046 +0000 UTC m=+970.324794743" Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.646226 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-k7zsl" podStartSLOduration=3.248165918 podStartE2EDuration="30.646209437s" podCreationTimestamp="2025-12-02 23:13:21 +0000 UTC" firstStartedPulling="2025-12-02 23:13:22.535619703 +0000 UTC m=+941.244173986" lastFinishedPulling="2025-12-02 
23:13:49.933663212 +0000 UTC m=+968.642217505" observedRunningTime="2025-12-02 23:13:51.640423785 +0000 UTC m=+970.348978088" watchObservedRunningTime="2025-12-02 23:13:51.646209437 +0000 UTC m=+970.354763720" Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.685434 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-j2zhw" podStartSLOduration=3.049113496 podStartE2EDuration="31.685407151s" podCreationTimestamp="2025-12-02 23:13:20 +0000 UTC" firstStartedPulling="2025-12-02 23:13:21.812236335 +0000 UTC m=+940.520790618" lastFinishedPulling="2025-12-02 23:13:50.44852999 +0000 UTC m=+969.157084273" observedRunningTime="2025-12-02 23:13:51.678291416 +0000 UTC m=+970.386845699" watchObservedRunningTime="2025-12-02 23:13:51.685407151 +0000 UTC m=+970.393961434" Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.725211 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-dw6n2" podStartSLOduration=27.911364056 podStartE2EDuration="31.725193271s" podCreationTimestamp="2025-12-02 23:13:20 +0000 UTC" firstStartedPulling="2025-12-02 23:13:46.120858142 +0000 UTC m=+964.829412445" lastFinishedPulling="2025-12-02 23:13:49.934687377 +0000 UTC m=+968.643241660" observedRunningTime="2025-12-02 23:13:51.704913151 +0000 UTC m=+970.413467434" watchObservedRunningTime="2025-12-02 23:13:51.725193271 +0000 UTC m=+970.433747554" Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.745622 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-htwmh" podStartSLOduration=3.274694501 podStartE2EDuration="30.745606802s" podCreationTimestamp="2025-12-02 23:13:21 +0000 UTC" firstStartedPulling="2025-12-02 23:13:22.519978629 +0000 UTC m=+941.228532912" lastFinishedPulling="2025-12-02 23:13:49.99089093 +0000 UTC m=+968.699445213" observedRunningTime="2025-12-02 23:13:51.73049613 +0000 UTC m=+970.439050413" watchObservedRunningTime="2025-12-02 23:13:51.745606802 +0000 UTC m=+970.454161085" Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.770059 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lwgx2" podStartSLOduration=3.73975437 podStartE2EDuration="31.770045384s" podCreationTimestamp="2025-12-02 23:13:20 +0000 UTC" firstStartedPulling="2025-12-02 23:13:22.009639042 +0000 UTC m=+940.718193325" lastFinishedPulling="2025-12-02 23:13:50.039930056 +0000 UTC m=+968.748484339" observedRunningTime="2025-12-02 23:13:51.764371005 +0000 UTC m=+970.472925288" watchObservedRunningTime="2025-12-02 23:13:51.770045384 +0000 UTC m=+970.478599667" Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.802581 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-6fhjd" podStartSLOduration=4.119982805 podStartE2EDuration="31.802559954s" podCreationTimestamp="2025-12-02 23:13:20 +0000 UTC" firstStartedPulling="2025-12-02 23:13:22.270242524 +0000 UTC m=+940.978796807" lastFinishedPulling="2025-12-02 23:13:49.952819673 +0000 UTC m=+968.661373956" observedRunningTime="2025-12-02 23:13:51.800102273 +0000 UTC m=+970.508656556" watchObservedRunningTime="2025-12-02 23:13:51.802559954 +0000 UTC m=+970.511114237" Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.844199 4903 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-tsj9r" podStartSLOduration=3.584039499 podStartE2EDuration="31.844178928s" podCreationTimestamp="2025-12-02 23:13:20 +0000 UTC" firstStartedPulling="2025-12-02 23:13:21.696872587 +0000 UTC m=+940.405426870" lastFinishedPulling="2025-12-02 23:13:49.957012016 +0000 UTC m=+968.665566299" observedRunningTime="2025-12-02 23:13:51.839022721 +0000 UTC m=+970.547577024" watchObservedRunningTime="2025-12-02 23:13:51.844178928 +0000 UTC m=+970.552733211" Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.898473 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-6dsjr" podStartSLOduration=3.664347983 podStartE2EDuration="31.898458173s" podCreationTimestamp="2025-12-02 23:13:20 +0000 UTC" firstStartedPulling="2025-12-02 23:13:21.782893543 +0000 UTC m=+940.491447826" lastFinishedPulling="2025-12-02 23:13:50.017003733 +0000 UTC m=+968.725558016" observedRunningTime="2025-12-02 23:13:51.872835833 +0000 UTC m=+970.581390116" watchObservedRunningTime="2025-12-02 23:13:51.898458173 +0000 UTC m=+970.607012456" Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.907377 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-s97rj" podStartSLOduration=3.069713295 podStartE2EDuration="31.90723614s" podCreationTimestamp="2025-12-02 23:13:20 +0000 UTC" firstStartedPulling="2025-12-02 23:13:22.111116219 +0000 UTC m=+940.819670502" lastFinishedPulling="2025-12-02 23:13:50.948639064 +0000 UTC m=+969.657193347" observedRunningTime="2025-12-02 23:13:51.898913585 +0000 UTC m=+970.607467868" watchObservedRunningTime="2025-12-02 23:13:51.90723614 +0000 UTC m=+970.615790433" Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.960545 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-phs84" podStartSLOduration=4.353045139 podStartE2EDuration="31.960526871s" podCreationTimestamp="2025-12-02 23:13:20 +0000 UTC" firstStartedPulling="2025-12-02 23:13:22.412246168 +0000 UTC m=+941.120800461" lastFinishedPulling="2025-12-02 23:13:50.01972792 +0000 UTC m=+968.728282193" observedRunningTime="2025-12-02 23:13:51.943104212 +0000 UTC m=+970.651658495" watchObservedRunningTime="2025-12-02 23:13:51.960526871 +0000 UTC m=+970.669081154" Dec 02 23:13:51 crc kubenswrapper[4903]: I1202 23:13:51.980621 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-hj5mh" podStartSLOduration=4.326580558 podStartE2EDuration="31.980600064s" podCreationTimestamp="2025-12-02 23:13:20 +0000 UTC" firstStartedPulling="2025-12-02 23:13:22.384823693 +0000 UTC m=+941.093377976" lastFinishedPulling="2025-12-02 23:13:50.038843199 +0000 UTC m=+968.747397482" observedRunningTime="2025-12-02 23:13:51.978037121 +0000 UTC m=+970.686591404" watchObservedRunningTime="2025-12-02 23:13:51.980600064 +0000 UTC m=+970.689154347" Dec 02 23:13:52 crc kubenswrapper[4903]: I1202 23:13:52.527455 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xhm6n" 
event={"ID":"7c596dd6-5f26-4bb7-a771-8c1d57129209","Type":"ContainerStarted","Data":"fbc8dcab95e4fc553b2893dcff490c52c62837441040e32ac9e8d53fe4318cb3"} Dec 02 23:13:52 crc kubenswrapper[4903]: I1202 23:13:52.527752 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xhm6n" Dec 02 23:13:52 crc kubenswrapper[4903]: I1202 23:13:52.529381 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-c95d55f7c-jb8p7" event={"ID":"dbed5f2e-6049-4adc-a31c-bad1f30c7058","Type":"ContainerStarted","Data":"c86dc88ef1006303e207c49a6147ff1161608229fa44aa5f4e17dcdd1fdfed94"} Dec 02 23:13:52 crc kubenswrapper[4903]: I1202 23:13:52.529546 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-c95d55f7c-jb8p7" Dec 02 23:13:52 crc kubenswrapper[4903]: I1202 23:13:52.533176 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-j5jc6" event={"ID":"246fe719-e899-408b-a962-702c5db22bfc","Type":"ContainerStarted","Data":"dca9d6ed845ee55c4972d18ec1d8514b835fbe5dc0f6476a17bdb31003295b6a"} Dec 02 23:13:52 crc kubenswrapper[4903]: I1202 23:13:52.533215 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-j5jc6" Dec 02 23:13:52 crc kubenswrapper[4903]: I1202 23:13:52.536169 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lwgx2" Dec 02 23:13:52 crc kubenswrapper[4903]: I1202 23:13:52.536327 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-phs84" Dec 02 23:13:52 crc kubenswrapper[4903]: I1202 23:13:52.536621 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-6dsjr" Dec 02 23:13:52 crc kubenswrapper[4903]: I1202 23:13:52.536809 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-j2zhw" Dec 02 23:13:52 crc kubenswrapper[4903]: I1202 23:13:52.538261 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-6fhjd" Dec 02 23:13:52 crc kubenswrapper[4903]: I1202 23:13:52.538317 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-ddkpk" Dec 02 23:13:52 crc kubenswrapper[4903]: I1202 23:13:52.539758 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-2rx8r" Dec 02 23:13:52 crc kubenswrapper[4903]: I1202 23:13:52.552504 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xhm6n" podStartSLOduration=2.940951397 podStartE2EDuration="32.552488866s" podCreationTimestamp="2025-12-02 23:13:20 +0000 UTC" firstStartedPulling="2025-12-02 23:13:22.249824152 +0000 UTC m=+940.958378435" lastFinishedPulling="2025-12-02 23:13:51.861361621 +0000 UTC m=+970.569915904" observedRunningTime="2025-12-02 23:13:52.546515278 +0000 UTC 
m=+971.255069561" watchObservedRunningTime="2025-12-02 23:13:52.552488866 +0000 UTC m=+971.261043149" Dec 02 23:13:52 crc kubenswrapper[4903]: I1202 23:13:52.573237 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-j5jc6" podStartSLOduration=2.010783614 podStartE2EDuration="31.573218875s" podCreationTimestamp="2025-12-02 23:13:21 +0000 UTC" firstStartedPulling="2025-12-02 23:13:22.386878694 +0000 UTC m=+941.095432977" lastFinishedPulling="2025-12-02 23:13:51.949313955 +0000 UTC m=+970.657868238" observedRunningTime="2025-12-02 23:13:52.56690868 +0000 UTC m=+971.275462963" watchObservedRunningTime="2025-12-02 23:13:52.573218875 +0000 UTC m=+971.281773168" Dec 02 23:13:52 crc kubenswrapper[4903]: I1202 23:13:52.665550 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-c95d55f7c-jb8p7" podStartSLOduration=2.612261313 podStartE2EDuration="31.665531536s" podCreationTimestamp="2025-12-02 23:13:21 +0000 UTC" firstStartedPulling="2025-12-02 23:13:22.519953738 +0000 UTC m=+941.228508031" lastFinishedPulling="2025-12-02 23:13:51.573223971 +0000 UTC m=+970.281778254" observedRunningTime="2025-12-02 23:13:52.658570955 +0000 UTC m=+971.367125238" watchObservedRunningTime="2025-12-02 23:13:52.665531536 +0000 UTC m=+971.374085819" Dec 02 23:13:53 crc kubenswrapper[4903]: I1202 23:13:53.069769 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:13:53 crc kubenswrapper[4903]: I1202 23:13:53.070153 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:13:53 crc kubenswrapper[4903]: I1202 23:13:53.070209 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" Dec 02 23:13:53 crc kubenswrapper[4903]: I1202 23:13:53.071076 4903 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b388d9306dbbf4e8d5e39ed68ca73568f5d6116247546f21f87b8605b912cb3c"} pod="openshift-machine-config-operator/machine-config-daemon-snl4q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 23:13:53 crc kubenswrapper[4903]: I1202 23:13:53.071144 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" containerID="cri-o://b388d9306dbbf4e8d5e39ed68ca73568f5d6116247546f21f87b8605b912cb3c" gracePeriod=600 Dec 02 23:13:53 crc kubenswrapper[4903]: I1202 23:13:53.149490 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a01f2d2-8c90-4ccc-bf47-a4f973276988-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr\" (UID: \"5a01f2d2-8c90-4ccc-bf47-a4f973276988\") " 
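The machine-config-daemon sequence above is the standard liveness-failure path: patch_prober records the raw HTTP error, prober.go marks the Liveness failure, the sync loop sees the unhealthy status, and kuberuntime kills the container with the pod's grace period (600s here) so it can be restarted. In essence the probe is an HTTP GET against the configured endpoint, roughly like the sketch below (illustrative, not the kubelet prober; kubelet counts any status from 200 up to but not including 400 as success):

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

// probeOnce issues the kind of GET the prober entries above describe; a
// refused connection or an out-of-range status counts as a failed probe.
func probeOnce(url string, timeout time.Duration) error {
	client := &http.Client{Timeout: timeout}
	resp, err := client.Get(url)
	if err != nil {
		return err // e.g. "connect: connection refused", as in the log
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("unexpected status %d", resp.StatusCode)
	}
	return nil
}

func main() {
	// Endpoint and failure mode taken from the log entries above.
	if err := probeOnce("http://127.0.0.1:8798/health", time.Second); err != nil {
		fmt.Println("liveness failure:", err)
	}
}
```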
pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr" Dec 02 23:13:53 crc kubenswrapper[4903]: I1202 23:13:53.165503 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a01f2d2-8c90-4ccc-bf47-a4f973276988-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr\" (UID: \"5a01f2d2-8c90-4ccc-bf47-a4f973276988\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr" Dec 02 23:13:53 crc kubenswrapper[4903]: I1202 23:13:53.392320 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-wp57v" Dec 02 23:13:53 crc kubenswrapper[4903]: I1202 23:13:53.400601 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr" Dec 02 23:13:53 crc kubenswrapper[4903]: I1202 23:13:53.546949 4903 generic.go:334] "Generic (PLEG): container finished" podID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerID="b388d9306dbbf4e8d5e39ed68ca73568f5d6116247546f21f87b8605b912cb3c" exitCode=0 Dec 02 23:13:53 crc kubenswrapper[4903]: I1202 23:13:53.547114 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerDied","Data":"b388d9306dbbf4e8d5e39ed68ca73568f5d6116247546f21f87b8605b912cb3c"} Dec 02 23:13:53 crc kubenswrapper[4903]: I1202 23:13:53.547174 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerStarted","Data":"3aacdc4cb2887eb67cb12a1770fb101e2569b94da3ca1d528e9eafde11a8e5a7"} Dec 02 23:13:53 crc kubenswrapper[4903]: I1202 23:13:53.547191 4903 scope.go:117] "RemoveContainer" containerID="0b28e2dda158a594dc69e178bcdaca348a0340d9ffea7ad9bb02c01c43d2b522" Dec 02 23:13:53 crc kubenswrapper[4903]: I1202 23:13:53.555187 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-webhook-certs\") pod \"openstack-operator-controller-manager-5b9bc7567f-6prdr\" (UID: \"61b2d273-f604-4fa0-baba-27dfbab9a350\") " pod="openstack-operators/openstack-operator-controller-manager-5b9bc7567f-6prdr" Dec 02 23:13:53 crc kubenswrapper[4903]: I1202 23:13:53.555312 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-metrics-certs\") pod \"openstack-operator-controller-manager-5b9bc7567f-6prdr\" (UID: \"61b2d273-f604-4fa0-baba-27dfbab9a350\") " pod="openstack-operators/openstack-operator-controller-manager-5b9bc7567f-6prdr" Dec 02 23:13:53 crc kubenswrapper[4903]: I1202 23:13:53.562624 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-metrics-certs\") pod \"openstack-operator-controller-manager-5b9bc7567f-6prdr\" (UID: \"61b2d273-f604-4fa0-baba-27dfbab9a350\") " pod="openstack-operators/openstack-operator-controller-manager-5b9bc7567f-6prdr" Dec 02 23:13:53 crc kubenswrapper[4903]: I1202 23:13:53.563125 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"webhook-certs\" (UniqueName: \"kubernetes.io/secret/61b2d273-f604-4fa0-baba-27dfbab9a350-webhook-certs\") pod \"openstack-operator-controller-manager-5b9bc7567f-6prdr\" (UID: \"61b2d273-f604-4fa0-baba-27dfbab9a350\") " pod="openstack-operators/openstack-operator-controller-manager-5b9bc7567f-6prdr" Dec 02 23:13:53 crc kubenswrapper[4903]: I1202 23:13:53.645069 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr"] Dec 02 23:13:53 crc kubenswrapper[4903]: W1202 23:13:53.650097 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a01f2d2_8c90_4ccc_bf47_a4f973276988.slice/crio-0a41fdd13aae91dcf94b6f14862925375c68dbbcea5be53743f90fd087c31244 WatchSource:0}: Error finding container 0a41fdd13aae91dcf94b6f14862925375c68dbbcea5be53743f90fd087c31244: Status 404 returned error can't find the container with id 0a41fdd13aae91dcf94b6f14862925375c68dbbcea5be53743f90fd087c31244 Dec 02 23:13:53 crc kubenswrapper[4903]: I1202 23:13:53.782760 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-cs56n" Dec 02 23:13:53 crc kubenswrapper[4903]: I1202 23:13:53.791989 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5b9bc7567f-6prdr" Dec 02 23:13:54 crc kubenswrapper[4903]: I1202 23:13:54.244576 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5b9bc7567f-6prdr"] Dec 02 23:13:54 crc kubenswrapper[4903]: W1202 23:13:54.253238 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61b2d273_f604_4fa0_baba_27dfbab9a350.slice/crio-89549a008291075a052d9c3a75f9add77b4caf9fea9caae015fe3b531e050b59 WatchSource:0}: Error finding container 89549a008291075a052d9c3a75f9add77b4caf9fea9caae015fe3b531e050b59: Status 404 returned error can't find the container with id 89549a008291075a052d9c3a75f9add77b4caf9fea9caae015fe3b531e050b59 Dec 02 23:13:54 crc kubenswrapper[4903]: I1202 23:13:54.558220 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr" event={"ID":"5a01f2d2-8c90-4ccc-bf47-a4f973276988","Type":"ContainerStarted","Data":"0a41fdd13aae91dcf94b6f14862925375c68dbbcea5be53743f90fd087c31244"} Dec 02 23:13:54 crc kubenswrapper[4903]: I1202 23:13:54.562751 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5b9bc7567f-6prdr" event={"ID":"61b2d273-f604-4fa0-baba-27dfbab9a350","Type":"ContainerStarted","Data":"806539857b1e673abfa7486a561599b92c8491462e2829b955ebb123f756f918"} Dec 02 23:13:54 crc kubenswrapper[4903]: I1202 23:13:54.562776 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5b9bc7567f-6prdr" event={"ID":"61b2d273-f604-4fa0-baba-27dfbab9a350","Type":"ContainerStarted","Data":"89549a008291075a052d9c3a75f9add77b4caf9fea9caae015fe3b531e050b59"} Dec 02 23:13:54 crc kubenswrapper[4903]: I1202 23:13:54.563603 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5b9bc7567f-6prdr" Dec 02 23:13:57 crc kubenswrapper[4903]: I1202 23:13:57.235749 
4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-dw6n2" Dec 02 23:13:57 crc kubenswrapper[4903]: I1202 23:13:57.267063 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5b9bc7567f-6prdr" podStartSLOduration=36.267039222 podStartE2EDuration="36.267039222s" podCreationTimestamp="2025-12-02 23:13:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:13:54.606078591 +0000 UTC m=+973.314632874" watchObservedRunningTime="2025-12-02 23:13:57.267039222 +0000 UTC m=+975.975593515" Dec 02 23:13:57 crc kubenswrapper[4903]: I1202 23:13:57.586180 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr" event={"ID":"5a01f2d2-8c90-4ccc-bf47-a4f973276988","Type":"ContainerStarted","Data":"d333597a4be05ff9ec15d15da59d8a6adb4b7ebcb64110482d486d3fadc2ec45"} Dec 02 23:13:57 crc kubenswrapper[4903]: I1202 23:13:57.586239 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr" event={"ID":"5a01f2d2-8c90-4ccc-bf47-a4f973276988","Type":"ContainerStarted","Data":"67e7a52b84e91213d70e0636a25d807d92d5a2511988355aaba36de7b39ea8a5"} Dec 02 23:13:57 crc kubenswrapper[4903]: I1202 23:13:57.586389 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr" Dec 02 23:13:57 crc kubenswrapper[4903]: I1202 23:13:57.614376 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr" podStartSLOduration=33.599050448 podStartE2EDuration="36.614351687s" podCreationTimestamp="2025-12-02 23:13:21 +0000 UTC" firstStartedPulling="2025-12-02 23:13:53.652240523 +0000 UTC m=+972.360794796" lastFinishedPulling="2025-12-02 23:13:56.667541752 +0000 UTC m=+975.376096035" observedRunningTime="2025-12-02 23:13:57.610459521 +0000 UTC m=+976.319013804" watchObservedRunningTime="2025-12-02 23:13:57.614351687 +0000 UTC m=+976.322905990" Dec 02 23:14:01 crc kubenswrapper[4903]: I1202 23:14:01.442648 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-s97rj" Dec 02 23:14:01 crc kubenswrapper[4903]: I1202 23:14:01.475448 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-xhm6n" Dec 02 23:14:01 crc kubenswrapper[4903]: I1202 23:14:01.638275 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-htwmh" Dec 02 23:14:01 crc kubenswrapper[4903]: I1202 23:14:01.685826 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-j5jc6" Dec 02 23:14:01 crc kubenswrapper[4903]: I1202 23:14:01.721694 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-5pdxv" Dec 02 23:14:01 crc kubenswrapper[4903]: I1202 23:14:01.768463 4903 kubelet.go:2542] "SyncLoop (probe)" 
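Note the openstack-operator-controller-manager entry above: firstStartedPulling and lastFinishedPulling are 0001-01-01 00:00:00 +0000 UTC, which is Go's time.Time zero value. No image pull was recorded for that pod, so its SLO duration equals its E2E duration. A short sketch of the guard such a field layout implies (assumed semantics, not kubelet code):

```go
package main

import (
	"fmt"
	"time"
)

// pullWindow returns the image-pull span, or zero when no pull was ever
// recorded (both timestamps left at Go's time.Time zero value, which prints
// as "0001-01-01 00:00:00 +0000 UTC", exactly as seen in the log).
func pullWindow(firstStartedPulling, lastFinishedPulling time.Time) time.Duration {
	if firstStartedPulling.IsZero() || lastFinishedPulling.IsZero() {
		return 0
	}
	return lastFinishedPulling.Sub(firstStartedPulling)
}

func main() {
	var first, last time.Time // zero values, as in the openstack-operator entry
	fmt.Println(pullWindow(first, last)) // 0s, so SLO == E2E (36.267039222s)
}
```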
probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-k7zsl" Dec 02 23:14:01 crc kubenswrapper[4903]: I1202 23:14:01.833897 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wp9kf" Dec 02 23:14:01 crc kubenswrapper[4903]: I1202 23:14:01.887507 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-c95d55f7c-jb8p7" Dec 02 23:14:03 crc kubenswrapper[4903]: I1202 23:14:03.408130 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr" Dec 02 23:14:03 crc kubenswrapper[4903]: I1202 23:14:03.800964 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5b9bc7567f-6prdr" Dec 02 23:14:21 crc kubenswrapper[4903]: I1202 23:14:21.959382 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-588d85ddfc-j2885"] Dec 02 23:14:21 crc kubenswrapper[4903]: I1202 23:14:21.960735 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-588d85ddfc-j2885" Dec 02 23:14:21 crc kubenswrapper[4903]: I1202 23:14:21.962789 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 02 23:14:21 crc kubenswrapper[4903]: I1202 23:14:21.964507 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 02 23:14:21 crc kubenswrapper[4903]: I1202 23:14:21.964741 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-5hrs4" Dec 02 23:14:21 crc kubenswrapper[4903]: I1202 23:14:21.965079 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 02 23:14:21 crc kubenswrapper[4903]: I1202 23:14:21.976019 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-588d85ddfc-j2885"] Dec 02 23:14:21 crc kubenswrapper[4903]: I1202 23:14:21.999506 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/760b3fce-f12e-425b-8039-2bd47f1e8651-config\") pod \"dnsmasq-dns-588d85ddfc-j2885\" (UID: \"760b3fce-f12e-425b-8039-2bd47f1e8651\") " pod="openstack/dnsmasq-dns-588d85ddfc-j2885" Dec 02 23:14:21 crc kubenswrapper[4903]: I1202 23:14:21.999597 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc4xv\" (UniqueName: \"kubernetes.io/projected/760b3fce-f12e-425b-8039-2bd47f1e8651-kube-api-access-kc4xv\") pod \"dnsmasq-dns-588d85ddfc-j2885\" (UID: \"760b3fce-f12e-425b-8039-2bd47f1e8651\") " pod="openstack/dnsmasq-dns-588d85ddfc-j2885" Dec 02 23:14:22 crc kubenswrapper[4903]: I1202 23:14:22.031803 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5df5fc89fc-vv75w"] Dec 02 23:14:22 crc kubenswrapper[4903]: I1202 23:14:22.032935 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5df5fc89fc-vv75w" Dec 02 23:14:22 crc kubenswrapper[4903]: I1202 23:14:22.038014 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 02 23:14:22 crc kubenswrapper[4903]: I1202 23:14:22.045539 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5df5fc89fc-vv75w"] Dec 02 23:14:22 crc kubenswrapper[4903]: I1202 23:14:22.100757 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc4xv\" (UniqueName: \"kubernetes.io/projected/760b3fce-f12e-425b-8039-2bd47f1e8651-kube-api-access-kc4xv\") pod \"dnsmasq-dns-588d85ddfc-j2885\" (UID: \"760b3fce-f12e-425b-8039-2bd47f1e8651\") " pod="openstack/dnsmasq-dns-588d85ddfc-j2885" Dec 02 23:14:22 crc kubenswrapper[4903]: I1202 23:14:22.100959 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/760b3fce-f12e-425b-8039-2bd47f1e8651-config\") pod \"dnsmasq-dns-588d85ddfc-j2885\" (UID: \"760b3fce-f12e-425b-8039-2bd47f1e8651\") " pod="openstack/dnsmasq-dns-588d85ddfc-j2885" Dec 02 23:14:22 crc kubenswrapper[4903]: I1202 23:14:22.101807 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/760b3fce-f12e-425b-8039-2bd47f1e8651-config\") pod \"dnsmasq-dns-588d85ddfc-j2885\" (UID: \"760b3fce-f12e-425b-8039-2bd47f1e8651\") " pod="openstack/dnsmasq-dns-588d85ddfc-j2885" Dec 02 23:14:22 crc kubenswrapper[4903]: I1202 23:14:22.117520 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc4xv\" (UniqueName: \"kubernetes.io/projected/760b3fce-f12e-425b-8039-2bd47f1e8651-kube-api-access-kc4xv\") pod \"dnsmasq-dns-588d85ddfc-j2885\" (UID: \"760b3fce-f12e-425b-8039-2bd47f1e8651\") " pod="openstack/dnsmasq-dns-588d85ddfc-j2885" Dec 02 23:14:22 crc kubenswrapper[4903]: I1202 23:14:22.202292 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f625744-98dc-464f-896f-d9cdf237751e-config\") pod \"dnsmasq-dns-5df5fc89fc-vv75w\" (UID: \"8f625744-98dc-464f-896f-d9cdf237751e\") " pod="openstack/dnsmasq-dns-5df5fc89fc-vv75w" Dec 02 23:14:22 crc kubenswrapper[4903]: I1202 23:14:22.202334 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f625744-98dc-464f-896f-d9cdf237751e-dns-svc\") pod \"dnsmasq-dns-5df5fc89fc-vv75w\" (UID: \"8f625744-98dc-464f-896f-d9cdf237751e\") " pod="openstack/dnsmasq-dns-5df5fc89fc-vv75w" Dec 02 23:14:22 crc kubenswrapper[4903]: I1202 23:14:22.202376 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmqrx\" (UniqueName: \"kubernetes.io/projected/8f625744-98dc-464f-896f-d9cdf237751e-kube-api-access-hmqrx\") pod \"dnsmasq-dns-5df5fc89fc-vv75w\" (UID: \"8f625744-98dc-464f-896f-d9cdf237751e\") " pod="openstack/dnsmasq-dns-5df5fc89fc-vv75w" Dec 02 23:14:22 crc kubenswrapper[4903]: I1202 23:14:22.281358 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-588d85ddfc-j2885" Dec 02 23:14:22 crc kubenswrapper[4903]: I1202 23:14:22.303646 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmqrx\" (UniqueName: \"kubernetes.io/projected/8f625744-98dc-464f-896f-d9cdf237751e-kube-api-access-hmqrx\") pod \"dnsmasq-dns-5df5fc89fc-vv75w\" (UID: \"8f625744-98dc-464f-896f-d9cdf237751e\") " pod="openstack/dnsmasq-dns-5df5fc89fc-vv75w" Dec 02 23:14:22 crc kubenswrapper[4903]: I1202 23:14:22.303761 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f625744-98dc-464f-896f-d9cdf237751e-config\") pod \"dnsmasq-dns-5df5fc89fc-vv75w\" (UID: \"8f625744-98dc-464f-896f-d9cdf237751e\") " pod="openstack/dnsmasq-dns-5df5fc89fc-vv75w" Dec 02 23:14:22 crc kubenswrapper[4903]: I1202 23:14:22.303791 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f625744-98dc-464f-896f-d9cdf237751e-dns-svc\") pod \"dnsmasq-dns-5df5fc89fc-vv75w\" (UID: \"8f625744-98dc-464f-896f-d9cdf237751e\") " pod="openstack/dnsmasq-dns-5df5fc89fc-vv75w" Dec 02 23:14:22 crc kubenswrapper[4903]: I1202 23:14:22.304591 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f625744-98dc-464f-896f-d9cdf237751e-config\") pod \"dnsmasq-dns-5df5fc89fc-vv75w\" (UID: \"8f625744-98dc-464f-896f-d9cdf237751e\") " pod="openstack/dnsmasq-dns-5df5fc89fc-vv75w" Dec 02 23:14:22 crc kubenswrapper[4903]: I1202 23:14:22.305708 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f625744-98dc-464f-896f-d9cdf237751e-dns-svc\") pod \"dnsmasq-dns-5df5fc89fc-vv75w\" (UID: \"8f625744-98dc-464f-896f-d9cdf237751e\") " pod="openstack/dnsmasq-dns-5df5fc89fc-vv75w" Dec 02 23:14:22 crc kubenswrapper[4903]: I1202 23:14:22.327513 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmqrx\" (UniqueName: \"kubernetes.io/projected/8f625744-98dc-464f-896f-d9cdf237751e-kube-api-access-hmqrx\") pod \"dnsmasq-dns-5df5fc89fc-vv75w\" (UID: \"8f625744-98dc-464f-896f-d9cdf237751e\") " pod="openstack/dnsmasq-dns-5df5fc89fc-vv75w" Dec 02 23:14:22 crc kubenswrapper[4903]: I1202 23:14:22.346391 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5df5fc89fc-vv75w" Dec 02 23:14:22 crc kubenswrapper[4903]: I1202 23:14:22.731347 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-588d85ddfc-j2885"] Dec 02 23:14:22 crc kubenswrapper[4903]: W1202 23:14:22.739182 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod760b3fce_f12e_425b_8039_2bd47f1e8651.slice/crio-8ed5a7d00174843012de55617a7f8b953260487d95af548ca6e3ed2898b3bc0b WatchSource:0}: Error finding container 8ed5a7d00174843012de55617a7f8b953260487d95af548ca6e3ed2898b3bc0b: Status 404 returned error can't find the container with id 8ed5a7d00174843012de55617a7f8b953260487d95af548ca6e3ed2898b3bc0b Dec 02 23:14:22 crc kubenswrapper[4903]: I1202 23:14:22.788967 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-588d85ddfc-j2885" event={"ID":"760b3fce-f12e-425b-8039-2bd47f1e8651","Type":"ContainerStarted","Data":"8ed5a7d00174843012de55617a7f8b953260487d95af548ca6e3ed2898b3bc0b"} Dec 02 23:14:22 crc kubenswrapper[4903]: I1202 23:14:22.823471 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5df5fc89fc-vv75w"] Dec 02 23:14:22 crc kubenswrapper[4903]: W1202 23:14:22.834987 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f625744_98dc_464f_896f_d9cdf237751e.slice/crio-2f801adee6186b627cfc918740b1062fa7ba342bdfcbf726457784b1e5997819 WatchSource:0}: Error finding container 2f801adee6186b627cfc918740b1062fa7ba342bdfcbf726457784b1e5997819: Status 404 returned error can't find the container with id 2f801adee6186b627cfc918740b1062fa7ba342bdfcbf726457784b1e5997819 Dec 02 23:14:23 crc kubenswrapper[4903]: I1202 23:14:23.804028 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5df5fc89fc-vv75w" event={"ID":"8f625744-98dc-464f-896f-d9cdf237751e","Type":"ContainerStarted","Data":"2f801adee6186b627cfc918740b1062fa7ba342bdfcbf726457784b1e5997819"} Dec 02 23:14:25 crc kubenswrapper[4903]: I1202 23:14:25.981722 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-588d85ddfc-j2885"] Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.009481 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76c98497-bsvrh"] Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.013162 4903 util.go:30] "No sandbox for pod can be found. 
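Every pod in this stretch walks the same volume lifecycle: reconciler_common.go:245 verifies the volume is recorded as attached for the pod, reconciler_common.go:218 starts the mount, and operation_generator.go:637 reports MountVolume.SetUp success. A toy desired-versus-actual reconcile pass in that spirit (all types and names here are illustrative, not kubelet's volume manager):

```go
package main

import "fmt"

// A toy reconcile pass over desired vs. actual volume state, echoing the
// reconciler_common progression seen in the log above.
type volume struct{ name, pod string }

func reconcile(desired []volume, mounted map[string]bool) {
	for _, v := range desired {
		key := v.pod + "/" + v.name
		if mounted[key] {
			continue // actual state already matches desired state
		}
		fmt.Printf("MountVolume started for volume %q pod %q\n", v.name, v.pod)
		// ...the real SetUp would mount and populate the volume here...
		mounted[key] = true
		fmt.Printf("MountVolume.SetUp succeeded for volume %q pod %q\n", v.name, v.pod)
	}
}

func main() {
	desired := []volume{
		{"config", "dnsmasq-dns-588d85ddfc-j2885"},
		{"kube-api-access-kc4xv", "dnsmasq-dns-588d85ddfc-j2885"},
	}
	reconcile(desired, map[string]bool{})
}
```

The cadvisor warnings interleaved above ("Failed to process watch event ... Status 404") are a benign race of the same shape: the watcher fires on a new cgroup before the container is registered, so the lookup against actual state briefly returns not-found.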
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.013162 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76c98497-bsvrh"
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.030353 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76c98497-bsvrh"]
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.079043 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e3f2c63-d702-499c-843a-4dac4bc26ee4-config\") pod \"dnsmasq-dns-76c98497-bsvrh\" (UID: \"5e3f2c63-d702-499c-843a-4dac4bc26ee4\") " pod="openstack/dnsmasq-dns-76c98497-bsvrh"
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.079086 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e3f2c63-d702-499c-843a-4dac4bc26ee4-dns-svc\") pod \"dnsmasq-dns-76c98497-bsvrh\" (UID: \"5e3f2c63-d702-499c-843a-4dac4bc26ee4\") " pod="openstack/dnsmasq-dns-76c98497-bsvrh"
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.079127 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd7p8\" (UniqueName: \"kubernetes.io/projected/5e3f2c63-d702-499c-843a-4dac4bc26ee4-kube-api-access-hd7p8\") pod \"dnsmasq-dns-76c98497-bsvrh\" (UID: \"5e3f2c63-d702-499c-843a-4dac4bc26ee4\") " pod="openstack/dnsmasq-dns-76c98497-bsvrh"
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.182014 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e3f2c63-d702-499c-843a-4dac4bc26ee4-config\") pod \"dnsmasq-dns-76c98497-bsvrh\" (UID: \"5e3f2c63-d702-499c-843a-4dac4bc26ee4\") " pod="openstack/dnsmasq-dns-76c98497-bsvrh"
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.182056 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e3f2c63-d702-499c-843a-4dac4bc26ee4-dns-svc\") pod \"dnsmasq-dns-76c98497-bsvrh\" (UID: \"5e3f2c63-d702-499c-843a-4dac4bc26ee4\") " pod="openstack/dnsmasq-dns-76c98497-bsvrh"
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.182097 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd7p8\" (UniqueName: \"kubernetes.io/projected/5e3f2c63-d702-499c-843a-4dac4bc26ee4-kube-api-access-hd7p8\") pod \"dnsmasq-dns-76c98497-bsvrh\" (UID: \"5e3f2c63-d702-499c-843a-4dac4bc26ee4\") " pod="openstack/dnsmasq-dns-76c98497-bsvrh"
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.183420 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e3f2c63-d702-499c-843a-4dac4bc26ee4-config\") pod \"dnsmasq-dns-76c98497-bsvrh\" (UID: \"5e3f2c63-d702-499c-843a-4dac4bc26ee4\") " pod="openstack/dnsmasq-dns-76c98497-bsvrh"
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.183563 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e3f2c63-d702-499c-843a-4dac4bc26ee4-dns-svc\") pod \"dnsmasq-dns-76c98497-bsvrh\" (UID: \"5e3f2c63-d702-499c-843a-4dac4bc26ee4\") " pod="openstack/dnsmasq-dns-76c98497-bsvrh"
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.223889 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd7p8\" (UniqueName: \"kubernetes.io/projected/5e3f2c63-d702-499c-843a-4dac4bc26ee4-kube-api-access-hd7p8\") pod \"dnsmasq-dns-76c98497-bsvrh\" (UID: \"5e3f2c63-d702-499c-843a-4dac4bc26ee4\") " pod="openstack/dnsmasq-dns-76c98497-bsvrh"
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.270560 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5df5fc89fc-vv75w"]
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.289821 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-65885745f9-jm2v7"]
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.297188 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65885745f9-jm2v7"
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.303870 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65885745f9-jm2v7"]
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.340187 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76c98497-bsvrh"
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.386863 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19d3be82-34ff-443d-804a-61dec2277259-config\") pod \"dnsmasq-dns-65885745f9-jm2v7\" (UID: \"19d3be82-34ff-443d-804a-61dec2277259\") " pod="openstack/dnsmasq-dns-65885745f9-jm2v7"
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.386914 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19d3be82-34ff-443d-804a-61dec2277259-dns-svc\") pod \"dnsmasq-dns-65885745f9-jm2v7\" (UID: \"19d3be82-34ff-443d-804a-61dec2277259\") " pod="openstack/dnsmasq-dns-65885745f9-jm2v7"
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.386960 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7qmc\" (UniqueName: \"kubernetes.io/projected/19d3be82-34ff-443d-804a-61dec2277259-kube-api-access-j7qmc\") pod \"dnsmasq-dns-65885745f9-jm2v7\" (UID: \"19d3be82-34ff-443d-804a-61dec2277259\") " pod="openstack/dnsmasq-dns-65885745f9-jm2v7"
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.488586 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19d3be82-34ff-443d-804a-61dec2277259-dns-svc\") pod \"dnsmasq-dns-65885745f9-jm2v7\" (UID: \"19d3be82-34ff-443d-804a-61dec2277259\") " pod="openstack/dnsmasq-dns-65885745f9-jm2v7"
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.488838 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7qmc\" (UniqueName: \"kubernetes.io/projected/19d3be82-34ff-443d-804a-61dec2277259-kube-api-access-j7qmc\") pod \"dnsmasq-dns-65885745f9-jm2v7\" (UID: \"19d3be82-34ff-443d-804a-61dec2277259\") " pod="openstack/dnsmasq-dns-65885745f9-jm2v7"
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.488916 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19d3be82-34ff-443d-804a-61dec2277259-config\") pod \"dnsmasq-dns-65885745f9-jm2v7\" (UID: \"19d3be82-34ff-443d-804a-61dec2277259\") " pod="openstack/dnsmasq-dns-65885745f9-jm2v7"
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.489808 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19d3be82-34ff-443d-804a-61dec2277259-dns-svc\") pod \"dnsmasq-dns-65885745f9-jm2v7\" (UID: \"19d3be82-34ff-443d-804a-61dec2277259\") " pod="openstack/dnsmasq-dns-65885745f9-jm2v7"
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.489842 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19d3be82-34ff-443d-804a-61dec2277259-config\") pod \"dnsmasq-dns-65885745f9-jm2v7\" (UID: \"19d3be82-34ff-443d-804a-61dec2277259\") " pod="openstack/dnsmasq-dns-65885745f9-jm2v7"
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.508565 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7qmc\" (UniqueName: \"kubernetes.io/projected/19d3be82-34ff-443d-804a-61dec2277259-kube-api-access-j7qmc\") pod \"dnsmasq-dns-65885745f9-jm2v7\" (UID: \"19d3be82-34ff-443d-804a-61dec2277259\") " pod="openstack/dnsmasq-dns-65885745f9-jm2v7"
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.578929 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76c98497-bsvrh"]
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.625946 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65885745f9-jm2v7"
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.642578 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f97ccc87-drwm2"]
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.643759 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f97ccc87-drwm2"
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.651152 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f97ccc87-drwm2"]
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.699042 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5n64\" (UniqueName: \"kubernetes.io/projected/34aa4a0b-387e-41eb-a104-c08208065d85-kube-api-access-f5n64\") pod \"dnsmasq-dns-f97ccc87-drwm2\" (UID: \"34aa4a0b-387e-41eb-a104-c08208065d85\") " pod="openstack/dnsmasq-dns-f97ccc87-drwm2"
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.699117 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34aa4a0b-387e-41eb-a104-c08208065d85-dns-svc\") pod \"dnsmasq-dns-f97ccc87-drwm2\" (UID: \"34aa4a0b-387e-41eb-a104-c08208065d85\") " pod="openstack/dnsmasq-dns-f97ccc87-drwm2"
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.699177 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34aa4a0b-387e-41eb-a104-c08208065d85-config\") pod \"dnsmasq-dns-f97ccc87-drwm2\" (UID: \"34aa4a0b-387e-41eb-a104-c08208065d85\") " pod="openstack/dnsmasq-dns-f97ccc87-drwm2"
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.800763 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34aa4a0b-387e-41eb-a104-c08208065d85-dns-svc\") pod \"dnsmasq-dns-f97ccc87-drwm2\" (UID: \"34aa4a0b-387e-41eb-a104-c08208065d85\") " pod="openstack/dnsmasq-dns-f97ccc87-drwm2"
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.801103 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34aa4a0b-387e-41eb-a104-c08208065d85-config\") pod \"dnsmasq-dns-f97ccc87-drwm2\" (UID: \"34aa4a0b-387e-41eb-a104-c08208065d85\") " pod="openstack/dnsmasq-dns-f97ccc87-drwm2"
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.801180 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5n64\" (UniqueName: \"kubernetes.io/projected/34aa4a0b-387e-41eb-a104-c08208065d85-kube-api-access-f5n64\") pod \"dnsmasq-dns-f97ccc87-drwm2\" (UID: \"34aa4a0b-387e-41eb-a104-c08208065d85\") " pod="openstack/dnsmasq-dns-f97ccc87-drwm2"
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.801672 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34aa4a0b-387e-41eb-a104-c08208065d85-dns-svc\") pod \"dnsmasq-dns-f97ccc87-drwm2\" (UID: \"34aa4a0b-387e-41eb-a104-c08208065d85\") " pod="openstack/dnsmasq-dns-f97ccc87-drwm2"
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.802270 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34aa4a0b-387e-41eb-a104-c08208065d85-config\") pod \"dnsmasq-dns-f97ccc87-drwm2\" (UID: \"34aa4a0b-387e-41eb-a104-c08208065d85\") " pod="openstack/dnsmasq-dns-f97ccc87-drwm2"
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.819268 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5n64\" (UniqueName: \"kubernetes.io/projected/34aa4a0b-387e-41eb-a104-c08208065d85-kube-api-access-f5n64\") pod \"dnsmasq-dns-f97ccc87-drwm2\" (UID: \"34aa4a0b-387e-41eb-a104-c08208065d85\") " pod="openstack/dnsmasq-dns-f97ccc87-drwm2"
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.981795 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65885745f9-jm2v7"]
Dec 02 23:14:26 crc kubenswrapper[4903]: I1202 23:14:26.982096 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f97ccc87-drwm2"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.064759 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76c98497-bsvrh"]
Dec 02 23:14:27 crc kubenswrapper[4903]: W1202 23:14:27.100351 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e3f2c63_d702_499c_843a_4dac4bc26ee4.slice/crio-e8300c8d0a4574d99a74426641a078048882702d005774b72d5c486254ce54d4 WatchSource:0}: Error finding container e8300c8d0a4574d99a74426641a078048882702d005774b72d5c486254ce54d4: Status 404 returned error can't find the container with id e8300c8d0a4574d99a74426641a078048882702d005774b72d5c486254ce54d4
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.162563 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.163984 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.165663 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.165823 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.168360 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-6h5p2"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.169472 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.169489 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.169604 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.172662 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.192087 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.208815 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1743f362-cc56-4c25-a31d-7a78f269f570-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " pod="openstack/rabbitmq-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.208856 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1743f362-cc56-4c25-a31d-7a78f269f570-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " pod="openstack/rabbitmq-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.208878 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1743f362-cc56-4c25-a31d-7a78f269f570-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " pod="openstack/rabbitmq-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.208907 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1743f362-cc56-4c25-a31d-7a78f269f570-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " pod="openstack/rabbitmq-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.208931 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1743f362-cc56-4c25-a31d-7a78f269f570-config-data\") pod \"rabbitmq-server-0\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " pod="openstack/rabbitmq-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.209045 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1743f362-cc56-4c25-a31d-7a78f269f570-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " pod="openstack/rabbitmq-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.209079 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1743f362-cc56-4c25-a31d-7a78f269f570-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " pod="openstack/rabbitmq-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.209128 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1743f362-cc56-4c25-a31d-7a78f269f570-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " pod="openstack/rabbitmq-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.209238 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1743f362-cc56-4c25-a31d-7a78f269f570-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " pod="openstack/rabbitmq-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.209280 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " pod="openstack/rabbitmq-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.209341 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcvgr\" (UniqueName: \"kubernetes.io/projected/1743f362-cc56-4c25-a31d-7a78f269f570-kube-api-access-dcvgr\") pod \"rabbitmq-server-0\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " pod="openstack/rabbitmq-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.310418 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1743f362-cc56-4c25-a31d-7a78f269f570-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " pod="openstack/rabbitmq-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.310465 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1743f362-cc56-4c25-a31d-7a78f269f570-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " pod="openstack/rabbitmq-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.310487 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1743f362-cc56-4c25-a31d-7a78f269f570-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " pod="openstack/rabbitmq-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.310506 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1743f362-cc56-4c25-a31d-7a78f269f570-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " pod="openstack/rabbitmq-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.310526 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1743f362-cc56-4c25-a31d-7a78f269f570-config-data\") pod \"rabbitmq-server-0\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " pod="openstack/rabbitmq-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.310546 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1743f362-cc56-4c25-a31d-7a78f269f570-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " pod="openstack/rabbitmq-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.310565 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1743f362-cc56-4c25-a31d-7a78f269f570-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " pod="openstack/rabbitmq-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.310593 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1743f362-cc56-4c25-a31d-7a78f269f570-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " pod="openstack/rabbitmq-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.310621 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1743f362-cc56-4c25-a31d-7a78f269f570-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " pod="openstack/rabbitmq-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.310647 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " pod="openstack/rabbitmq-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.310686 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcvgr\" (UniqueName: \"kubernetes.io/projected/1743f362-cc56-4c25-a31d-7a78f269f570-kube-api-access-dcvgr\") pod \"rabbitmq-server-0\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " pod="openstack/rabbitmq-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.311087 4903 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.316039 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1743f362-cc56-4c25-a31d-7a78f269f570-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " pod="openstack/rabbitmq-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.316264 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1743f362-cc56-4c25-a31d-7a78f269f570-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " pod="openstack/rabbitmq-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.316644 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1743f362-cc56-4c25-a31d-7a78f269f570-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " pod="openstack/rabbitmq-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.317103 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1743f362-cc56-4c25-a31d-7a78f269f570-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " pod="openstack/rabbitmq-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.317164 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1743f362-cc56-4c25-a31d-7a78f269f570-config-data\") pod \"rabbitmq-server-0\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " pod="openstack/rabbitmq-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.320568 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1743f362-cc56-4c25-a31d-7a78f269f570-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " pod="openstack/rabbitmq-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.326136 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1743f362-cc56-4c25-a31d-7a78f269f570-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " pod="openstack/rabbitmq-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.331557 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1743f362-cc56-4c25-a31d-7a78f269f570-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " pod="openstack/rabbitmq-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.332105 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1743f362-cc56-4c25-a31d-7a78f269f570-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " pod="openstack/rabbitmq-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.332238 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcvgr\" (UniqueName: \"kubernetes.io/projected/1743f362-cc56-4c25-a31d-7a78f269f570-kube-api-access-dcvgr\") pod \"rabbitmq-server-0\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " pod="openstack/rabbitmq-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.342978 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " pod="openstack/rabbitmq-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.393676 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.394897 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.397866 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.398080 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.398185 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.398291 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.400190 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.401167 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.401457 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-x7h96"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.413078 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.413122 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.413171 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.413192 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.413213 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.413230 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpzrq\" (UniqueName: \"kubernetes.io/projected/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-kube-api-access-wpzrq\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.413252 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.413293 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.413317 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.413336 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.413353 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.419165 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.473019 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f97ccc87-drwm2"]
Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.500270 4903 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.514348 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.514403 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.514429 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.514453 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.514499 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.514523 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.514536 4903 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.514565 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.514585 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.514606 4903 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.514627 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpzrq\" (UniqueName: \"kubernetes.io/projected/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-kube-api-access-wpzrq\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.514836 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.515498 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.515828 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.516033 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.516394 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.516580 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.520239 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.520441 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.522207 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.529409 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.532874 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpzrq\" (UniqueName: \"kubernetes.io/projected/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-kube-api-access-wpzrq\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.539442 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.712532 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.770927 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.772174 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-notifications-server-0" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.773782 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-erlang-cookie" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.774012 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-default-user" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.775359 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-notifications-svc" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.775616 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-plugins-conf" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.775896 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-server-conf" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.776492 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-server-dockercfg-rz69q" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.777625 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-config-data" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.781740 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.834789 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76c98497-bsvrh" event={"ID":"5e3f2c63-d702-499c-843a-4dac4bc26ee4","Type":"ContainerStarted","Data":"e8300c8d0a4574d99a74426641a078048882702d005774b72d5c486254ce54d4"} Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.843722 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65885745f9-jm2v7" event={"ID":"19d3be82-34ff-443d-804a-61dec2277259","Type":"ContainerStarted","Data":"a255543f8041ff9dde7ea414dd3d634ae4fd04c6e30a5eef8524208c21fb1281"} Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.919496 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"adbb82a2-c30f-4e59-be9c-9274739caf25\") " pod="openstack/rabbitmq-notifications-server-0" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.919533 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/adbb82a2-c30f-4e59-be9c-9274739caf25-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"adbb82a2-c30f-4e59-be9c-9274739caf25\") " pod="openstack/rabbitmq-notifications-server-0" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.919553 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/adbb82a2-c30f-4e59-be9c-9274739caf25-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"adbb82a2-c30f-4e59-be9c-9274739caf25\") " pod="openstack/rabbitmq-notifications-server-0" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.919586 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/adbb82a2-c30f-4e59-be9c-9274739caf25-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"adbb82a2-c30f-4e59-be9c-9274739caf25\") " pod="openstack/rabbitmq-notifications-server-0" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.919616 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc8mx\" (UniqueName: \"kubernetes.io/projected/adbb82a2-c30f-4e59-be9c-9274739caf25-kube-api-access-mc8mx\") pod \"rabbitmq-notifications-server-0\" (UID: \"adbb82a2-c30f-4e59-be9c-9274739caf25\") " pod="openstack/rabbitmq-notifications-server-0" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.919678 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/adbb82a2-c30f-4e59-be9c-9274739caf25-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"adbb82a2-c30f-4e59-be9c-9274739caf25\") " pod="openstack/rabbitmq-notifications-server-0" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.919708 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/adbb82a2-c30f-4e59-be9c-9274739caf25-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"adbb82a2-c30f-4e59-be9c-9274739caf25\") " pod="openstack/rabbitmq-notifications-server-0" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.919725 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/adbb82a2-c30f-4e59-be9c-9274739caf25-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"adbb82a2-c30f-4e59-be9c-9274739caf25\") " pod="openstack/rabbitmq-notifications-server-0" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.919756 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/adbb82a2-c30f-4e59-be9c-9274739caf25-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"adbb82a2-c30f-4e59-be9c-9274739caf25\") " pod="openstack/rabbitmq-notifications-server-0" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.919772 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/adbb82a2-c30f-4e59-be9c-9274739caf25-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"adbb82a2-c30f-4e59-be9c-9274739caf25\") " pod="openstack/rabbitmq-notifications-server-0" Dec 02 23:14:27 crc kubenswrapper[4903]: I1202 23:14:27.919786 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/adbb82a2-c30f-4e59-be9c-9274739caf25-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"adbb82a2-c30f-4e59-be9c-9274739caf25\") " pod="openstack/rabbitmq-notifications-server-0" Dec 02 23:14:28 crc kubenswrapper[4903]: I1202 23:14:28.020907 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/adbb82a2-c30f-4e59-be9c-9274739caf25-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"adbb82a2-c30f-4e59-be9c-9274739caf25\") " pod="openstack/rabbitmq-notifications-server-0" Dec 02 23:14:28 crc 
kubenswrapper[4903]: I1202 23:14:28.020981 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc8mx\" (UniqueName: \"kubernetes.io/projected/adbb82a2-c30f-4e59-be9c-9274739caf25-kube-api-access-mc8mx\") pod \"rabbitmq-notifications-server-0\" (UID: \"adbb82a2-c30f-4e59-be9c-9274739caf25\") " pod="openstack/rabbitmq-notifications-server-0" Dec 02 23:14:28 crc kubenswrapper[4903]: I1202 23:14:28.021029 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/adbb82a2-c30f-4e59-be9c-9274739caf25-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"adbb82a2-c30f-4e59-be9c-9274739caf25\") " pod="openstack/rabbitmq-notifications-server-0" Dec 02 23:14:28 crc kubenswrapper[4903]: I1202 23:14:28.021060 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/adbb82a2-c30f-4e59-be9c-9274739caf25-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"adbb82a2-c30f-4e59-be9c-9274739caf25\") " pod="openstack/rabbitmq-notifications-server-0" Dec 02 23:14:28 crc kubenswrapper[4903]: I1202 23:14:28.021098 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/adbb82a2-c30f-4e59-be9c-9274739caf25-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"adbb82a2-c30f-4e59-be9c-9274739caf25\") " pod="openstack/rabbitmq-notifications-server-0" Dec 02 23:14:28 crc kubenswrapper[4903]: I1202 23:14:28.021133 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/adbb82a2-c30f-4e59-be9c-9274739caf25-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"adbb82a2-c30f-4e59-be9c-9274739caf25\") " pod="openstack/rabbitmq-notifications-server-0" Dec 02 23:14:28 crc kubenswrapper[4903]: I1202 23:14:28.021150 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/adbb82a2-c30f-4e59-be9c-9274739caf25-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"adbb82a2-c30f-4e59-be9c-9274739caf25\") " pod="openstack/rabbitmq-notifications-server-0" Dec 02 23:14:28 crc kubenswrapper[4903]: I1202 23:14:28.021164 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/adbb82a2-c30f-4e59-be9c-9274739caf25-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"adbb82a2-c30f-4e59-be9c-9274739caf25\") " pod="openstack/rabbitmq-notifications-server-0" Dec 02 23:14:28 crc kubenswrapper[4903]: I1202 23:14:28.021192 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"adbb82a2-c30f-4e59-be9c-9274739caf25\") " pod="openstack/rabbitmq-notifications-server-0" Dec 02 23:14:28 crc kubenswrapper[4903]: I1202 23:14:28.021504 4903 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"adbb82a2-c30f-4e59-be9c-9274739caf25\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-notifications-server-0" 
Dec 02 23:14:28 crc kubenswrapper[4903]: I1202 23:14:28.021211 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/adbb82a2-c30f-4e59-be9c-9274739caf25-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"adbb82a2-c30f-4e59-be9c-9274739caf25\") " pod="openstack/rabbitmq-notifications-server-0" Dec 02 23:14:28 crc kubenswrapper[4903]: I1202 23:14:28.021576 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/adbb82a2-c30f-4e59-be9c-9274739caf25-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"adbb82a2-c30f-4e59-be9c-9274739caf25\") " pod="openstack/rabbitmq-notifications-server-0" Dec 02 23:14:28 crc kubenswrapper[4903]: I1202 23:14:28.021993 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/adbb82a2-c30f-4e59-be9c-9274739caf25-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"adbb82a2-c30f-4e59-be9c-9274739caf25\") " pod="openstack/rabbitmq-notifications-server-0" Dec 02 23:14:28 crc kubenswrapper[4903]: I1202 23:14:28.022858 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/adbb82a2-c30f-4e59-be9c-9274739caf25-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"adbb82a2-c30f-4e59-be9c-9274739caf25\") " pod="openstack/rabbitmq-notifications-server-0" Dec 02 23:14:28 crc kubenswrapper[4903]: I1202 23:14:28.022955 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/adbb82a2-c30f-4e59-be9c-9274739caf25-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"adbb82a2-c30f-4e59-be9c-9274739caf25\") " pod="openstack/rabbitmq-notifications-server-0" Dec 02 23:14:28 crc kubenswrapper[4903]: I1202 23:14:28.023360 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/adbb82a2-c30f-4e59-be9c-9274739caf25-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"adbb82a2-c30f-4e59-be9c-9274739caf25\") " pod="openstack/rabbitmq-notifications-server-0" Dec 02 23:14:28 crc kubenswrapper[4903]: I1202 23:14:28.025118 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/adbb82a2-c30f-4e59-be9c-9274739caf25-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"adbb82a2-c30f-4e59-be9c-9274739caf25\") " pod="openstack/rabbitmq-notifications-server-0" Dec 02 23:14:28 crc kubenswrapper[4903]: I1202 23:14:28.028271 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/adbb82a2-c30f-4e59-be9c-9274739caf25-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"adbb82a2-c30f-4e59-be9c-9274739caf25\") " pod="openstack/rabbitmq-notifications-server-0" Dec 02 23:14:28 crc kubenswrapper[4903]: I1202 23:14:28.037491 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/adbb82a2-c30f-4e59-be9c-9274739caf25-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"adbb82a2-c30f-4e59-be9c-9274739caf25\") " pod="openstack/rabbitmq-notifications-server-0" Dec 02 23:14:28 crc kubenswrapper[4903]: I1202 
23:14:28.040359 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/adbb82a2-c30f-4e59-be9c-9274739caf25-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"adbb82a2-c30f-4e59-be9c-9274739caf25\") " pod="openstack/rabbitmq-notifications-server-0" Dec 02 23:14:28 crc kubenswrapper[4903]: I1202 23:14:28.040917 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/adbb82a2-c30f-4e59-be9c-9274739caf25-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"adbb82a2-c30f-4e59-be9c-9274739caf25\") " pod="openstack/rabbitmq-notifications-server-0" Dec 02 23:14:28 crc kubenswrapper[4903]: I1202 23:14:28.042713 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc8mx\" (UniqueName: \"kubernetes.io/projected/adbb82a2-c30f-4e59-be9c-9274739caf25-kube-api-access-mc8mx\") pod \"rabbitmq-notifications-server-0\" (UID: \"adbb82a2-c30f-4e59-be9c-9274739caf25\") " pod="openstack/rabbitmq-notifications-server-0" Dec 02 23:14:28 crc kubenswrapper[4903]: I1202 23:14:28.043002 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"adbb82a2-c30f-4e59-be9c-9274739caf25\") " pod="openstack/rabbitmq-notifications-server-0" Dec 02 23:14:28 crc kubenswrapper[4903]: I1202 23:14:28.099386 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-notifications-server-0" Dec 02 23:14:29 crc kubenswrapper[4903]: I1202 23:14:29.494936 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 02 23:14:29 crc kubenswrapper[4903]: I1202 23:14:29.497462 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 02 23:14:29 crc kubenswrapper[4903]: I1202 23:14:29.502572 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-qldpb" Dec 02 23:14:29 crc kubenswrapper[4903]: I1202 23:14:29.502588 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 02 23:14:29 crc kubenswrapper[4903]: I1202 23:14:29.502607 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 02 23:14:29 crc kubenswrapper[4903]: I1202 23:14:29.503021 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 02 23:14:29 crc kubenswrapper[4903]: I1202 23:14:29.508128 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 02 23:14:29 crc kubenswrapper[4903]: I1202 23:14:29.521326 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 02 23:14:29 crc kubenswrapper[4903]: I1202 23:14:29.648191 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"6eaac3fd-8033-42cd-90c3-5dfac716ae66\") " pod="openstack/openstack-galera-0" Dec 02 23:14:29 crc kubenswrapper[4903]: I1202 23:14:29.648240 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6eaac3fd-8033-42cd-90c3-5dfac716ae66-config-data-generated\") pod \"openstack-galera-0\" (UID: \"6eaac3fd-8033-42cd-90c3-5dfac716ae66\") " pod="openstack/openstack-galera-0" Dec 02 23:14:29 crc kubenswrapper[4903]: I1202 23:14:29.648269 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6eaac3fd-8033-42cd-90c3-5dfac716ae66-kolla-config\") pod \"openstack-galera-0\" (UID: \"6eaac3fd-8033-42cd-90c3-5dfac716ae66\") " pod="openstack/openstack-galera-0" Dec 02 23:14:29 crc kubenswrapper[4903]: I1202 23:14:29.648303 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6eaac3fd-8033-42cd-90c3-5dfac716ae66-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6eaac3fd-8033-42cd-90c3-5dfac716ae66\") " pod="openstack/openstack-galera-0" Dec 02 23:14:29 crc kubenswrapper[4903]: I1202 23:14:29.648332 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eaac3fd-8033-42cd-90c3-5dfac716ae66-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"6eaac3fd-8033-42cd-90c3-5dfac716ae66\") " pod="openstack/openstack-galera-0" Dec 02 23:14:29 crc kubenswrapper[4903]: I1202 23:14:29.648353 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5qfz\" (UniqueName: \"kubernetes.io/projected/6eaac3fd-8033-42cd-90c3-5dfac716ae66-kube-api-access-q5qfz\") pod \"openstack-galera-0\" (UID: \"6eaac3fd-8033-42cd-90c3-5dfac716ae66\") " pod="openstack/openstack-galera-0" Dec 02 23:14:29 crc kubenswrapper[4903]: I1202 23:14:29.648385 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6eaac3fd-8033-42cd-90c3-5dfac716ae66-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"6eaac3fd-8033-42cd-90c3-5dfac716ae66\") " pod="openstack/openstack-galera-0" Dec 02 23:14:29 crc kubenswrapper[4903]: I1202 23:14:29.648416 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6eaac3fd-8033-42cd-90c3-5dfac716ae66-config-data-default\") pod \"openstack-galera-0\" (UID: \"6eaac3fd-8033-42cd-90c3-5dfac716ae66\") " pod="openstack/openstack-galera-0" Dec 02 23:14:29 crc kubenswrapper[4903]: I1202 23:14:29.750041 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6eaac3fd-8033-42cd-90c3-5dfac716ae66-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6eaac3fd-8033-42cd-90c3-5dfac716ae66\") " pod="openstack/openstack-galera-0" Dec 02 23:14:29 crc kubenswrapper[4903]: I1202 23:14:29.750106 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eaac3fd-8033-42cd-90c3-5dfac716ae66-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"6eaac3fd-8033-42cd-90c3-5dfac716ae66\") " pod="openstack/openstack-galera-0" Dec 02 23:14:29 crc kubenswrapper[4903]: I1202 23:14:29.750680 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5qfz\" (UniqueName: \"kubernetes.io/projected/6eaac3fd-8033-42cd-90c3-5dfac716ae66-kube-api-access-q5qfz\") pod \"openstack-galera-0\" (UID: \"6eaac3fd-8033-42cd-90c3-5dfac716ae66\") " pod="openstack/openstack-galera-0" Dec 02 23:14:29 crc kubenswrapper[4903]: I1202 23:14:29.750734 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6eaac3fd-8033-42cd-90c3-5dfac716ae66-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"6eaac3fd-8033-42cd-90c3-5dfac716ae66\") " pod="openstack/openstack-galera-0" Dec 02 23:14:29 crc kubenswrapper[4903]: I1202 23:14:29.750783 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6eaac3fd-8033-42cd-90c3-5dfac716ae66-config-data-default\") pod \"openstack-galera-0\" (UID: \"6eaac3fd-8033-42cd-90c3-5dfac716ae66\") " pod="openstack/openstack-galera-0" Dec 02 23:14:29 crc kubenswrapper[4903]: I1202 23:14:29.750840 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"6eaac3fd-8033-42cd-90c3-5dfac716ae66\") " pod="openstack/openstack-galera-0" Dec 02 23:14:29 crc kubenswrapper[4903]: I1202 23:14:29.750862 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6eaac3fd-8033-42cd-90c3-5dfac716ae66-config-data-generated\") pod \"openstack-galera-0\" (UID: \"6eaac3fd-8033-42cd-90c3-5dfac716ae66\") " pod="openstack/openstack-galera-0" Dec 02 23:14:29 crc kubenswrapper[4903]: I1202 23:14:29.750902 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6eaac3fd-8033-42cd-90c3-5dfac716ae66-kolla-config\") pod \"openstack-galera-0\" (UID: 
\"6eaac3fd-8033-42cd-90c3-5dfac716ae66\") " pod="openstack/openstack-galera-0" Dec 02 23:14:29 crc kubenswrapper[4903]: I1202 23:14:29.751983 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6eaac3fd-8033-42cd-90c3-5dfac716ae66-kolla-config\") pod \"openstack-galera-0\" (UID: \"6eaac3fd-8033-42cd-90c3-5dfac716ae66\") " pod="openstack/openstack-galera-0" Dec 02 23:14:29 crc kubenswrapper[4903]: I1202 23:14:29.752020 4903 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"6eaac3fd-8033-42cd-90c3-5dfac716ae66\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-galera-0" Dec 02 23:14:29 crc kubenswrapper[4903]: I1202 23:14:29.752339 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6eaac3fd-8033-42cd-90c3-5dfac716ae66-config-data-generated\") pod \"openstack-galera-0\" (UID: \"6eaac3fd-8033-42cd-90c3-5dfac716ae66\") " pod="openstack/openstack-galera-0" Dec 02 23:14:29 crc kubenswrapper[4903]: I1202 23:14:29.752401 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6eaac3fd-8033-42cd-90c3-5dfac716ae66-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6eaac3fd-8033-42cd-90c3-5dfac716ae66\") " pod="openstack/openstack-galera-0" Dec 02 23:14:29 crc kubenswrapper[4903]: I1202 23:14:29.753692 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6eaac3fd-8033-42cd-90c3-5dfac716ae66-config-data-default\") pod \"openstack-galera-0\" (UID: \"6eaac3fd-8033-42cd-90c3-5dfac716ae66\") " pod="openstack/openstack-galera-0" Dec 02 23:14:29 crc kubenswrapper[4903]: I1202 23:14:29.773423 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5qfz\" (UniqueName: \"kubernetes.io/projected/6eaac3fd-8033-42cd-90c3-5dfac716ae66-kube-api-access-q5qfz\") pod \"openstack-galera-0\" (UID: \"6eaac3fd-8033-42cd-90c3-5dfac716ae66\") " pod="openstack/openstack-galera-0" Dec 02 23:14:29 crc kubenswrapper[4903]: I1202 23:14:29.776902 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eaac3fd-8033-42cd-90c3-5dfac716ae66-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"6eaac3fd-8033-42cd-90c3-5dfac716ae66\") " pod="openstack/openstack-galera-0" Dec 02 23:14:29 crc kubenswrapper[4903]: I1202 23:14:29.778158 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6eaac3fd-8033-42cd-90c3-5dfac716ae66-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"6eaac3fd-8033-42cd-90c3-5dfac716ae66\") " pod="openstack/openstack-galera-0" Dec 02 23:14:29 crc kubenswrapper[4903]: I1202 23:14:29.793535 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"6eaac3fd-8033-42cd-90c3-5dfac716ae66\") " pod="openstack/openstack-galera-0" Dec 02 23:14:29 crc kubenswrapper[4903]: I1202 23:14:29.821520 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 02 23:14:30 crc kubenswrapper[4903]: I1202 23:14:30.937682 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 02 23:14:30 crc kubenswrapper[4903]: I1202 23:14:30.939467 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 02 23:14:30 crc kubenswrapper[4903]: I1202 23:14:30.943525 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 02 23:14:30 crc kubenswrapper[4903]: I1202 23:14:30.943723 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-g8vd9" Dec 02 23:14:30 crc kubenswrapper[4903]: I1202 23:14:30.943874 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 02 23:14:30 crc kubenswrapper[4903]: I1202 23:14:30.944180 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 02 23:14:30 crc kubenswrapper[4903]: I1202 23:14:30.951173 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.082462 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a3fa7901-a49c-433f-942c-a875c9ecd2ab-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a3fa7901-a49c-433f-942c-a875c9ecd2ab\") " pod="openstack/openstack-cell1-galera-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.082587 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3fa7901-a49c-433f-942c-a875c9ecd2ab-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a3fa7901-a49c-433f-942c-a875c9ecd2ab\") " pod="openstack/openstack-cell1-galera-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.083451 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a3fa7901-a49c-433f-942c-a875c9ecd2ab-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a3fa7901-a49c-433f-942c-a875c9ecd2ab\") " pod="openstack/openstack-cell1-galera-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.083484 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5p6n\" (UniqueName: \"kubernetes.io/projected/a3fa7901-a49c-433f-942c-a875c9ecd2ab-kube-api-access-b5p6n\") pod \"openstack-cell1-galera-0\" (UID: \"a3fa7901-a49c-433f-942c-a875c9ecd2ab\") " pod="openstack/openstack-cell1-galera-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.083565 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a3fa7901-a49c-433f-942c-a875c9ecd2ab\") " pod="openstack/openstack-cell1-galera-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.083724 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a3fa7901-a49c-433f-942c-a875c9ecd2ab-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a3fa7901-a49c-433f-942c-a875c9ecd2ab\") " pod="openstack/openstack-cell1-galera-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.083753 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3fa7901-a49c-433f-942c-a875c9ecd2ab-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a3fa7901-a49c-433f-942c-a875c9ecd2ab\") " pod="openstack/openstack-cell1-galera-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.083783 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a3fa7901-a49c-433f-942c-a875c9ecd2ab-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a3fa7901-a49c-433f-942c-a875c9ecd2ab\") " pod="openstack/openstack-cell1-galera-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.621233 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3fa7901-a49c-433f-942c-a875c9ecd2ab-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a3fa7901-a49c-433f-942c-a875c9ecd2ab\") " pod="openstack/openstack-cell1-galera-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.621307 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a3fa7901-a49c-433f-942c-a875c9ecd2ab-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a3fa7901-a49c-433f-942c-a875c9ecd2ab\") " pod="openstack/openstack-cell1-galera-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.621342 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5p6n\" (UniqueName: \"kubernetes.io/projected/a3fa7901-a49c-433f-942c-a875c9ecd2ab-kube-api-access-b5p6n\") pod \"openstack-cell1-galera-0\" (UID: \"a3fa7901-a49c-433f-942c-a875c9ecd2ab\") " pod="openstack/openstack-cell1-galera-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.621372 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a3fa7901-a49c-433f-942c-a875c9ecd2ab\") " pod="openstack/openstack-cell1-galera-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.621427 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3fa7901-a49c-433f-942c-a875c9ecd2ab-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a3fa7901-a49c-433f-942c-a875c9ecd2ab\") " pod="openstack/openstack-cell1-galera-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.621449 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3fa7901-a49c-433f-942c-a875c9ecd2ab-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a3fa7901-a49c-433f-942c-a875c9ecd2ab\") " pod="openstack/openstack-cell1-galera-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.621474 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a3fa7901-a49c-433f-942c-a875c9ecd2ab-kolla-config\") 
pod \"openstack-cell1-galera-0\" (UID: \"a3fa7901-a49c-433f-942c-a875c9ecd2ab\") " pod="openstack/openstack-cell1-galera-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.621531 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a3fa7901-a49c-433f-942c-a875c9ecd2ab-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a3fa7901-a49c-433f-942c-a875c9ecd2ab\") " pod="openstack/openstack-cell1-galera-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.621957 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a3fa7901-a49c-433f-942c-a875c9ecd2ab-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a3fa7901-a49c-433f-942c-a875c9ecd2ab\") " pod="openstack/openstack-cell1-galera-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.622369 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a3fa7901-a49c-433f-942c-a875c9ecd2ab-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a3fa7901-a49c-433f-942c-a875c9ecd2ab\") " pod="openstack/openstack-cell1-galera-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.622479 4903 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a3fa7901-a49c-433f-942c-a875c9ecd2ab\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-cell1-galera-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.623420 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3fa7901-a49c-433f-942c-a875c9ecd2ab-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a3fa7901-a49c-433f-942c-a875c9ecd2ab\") " pod="openstack/openstack-cell1-galera-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.633574 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3fa7901-a49c-433f-942c-a875c9ecd2ab-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a3fa7901-a49c-433f-942c-a875c9ecd2ab\") " pod="openstack/openstack-cell1-galera-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.634125 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a3fa7901-a49c-433f-942c-a875c9ecd2ab-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a3fa7901-a49c-433f-942c-a875c9ecd2ab\") " pod="openstack/openstack-cell1-galera-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.635737 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3fa7901-a49c-433f-942c-a875c9ecd2ab-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a3fa7901-a49c-433f-942c-a875c9ecd2ab\") " pod="openstack/openstack-cell1-galera-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.673671 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5p6n\" (UniqueName: \"kubernetes.io/projected/a3fa7901-a49c-433f-942c-a875c9ecd2ab-kube-api-access-b5p6n\") pod \"openstack-cell1-galera-0\" (UID: \"a3fa7901-a49c-433f-942c-a875c9ecd2ab\") " 
pod="openstack/openstack-cell1-galera-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.674554 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a3fa7901-a49c-433f-942c-a875c9ecd2ab\") " pod="openstack/openstack-cell1-galera-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.691825 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.692925 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.696214 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.696418 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.696532 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-zwd8g" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.726821 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.824248 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fdb728a-100d-425d-b83c-245c770afa4b-combined-ca-bundle\") pod \"memcached-0\" (UID: \"4fdb728a-100d-425d-b83c-245c770afa4b\") " pod="openstack/memcached-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.824319 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/4fdb728a-100d-425d-b83c-245c770afa4b-memcached-tls-certs\") pod \"memcached-0\" (UID: \"4fdb728a-100d-425d-b83c-245c770afa4b\") " pod="openstack/memcached-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.824340 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrzsg\" (UniqueName: \"kubernetes.io/projected/4fdb728a-100d-425d-b83c-245c770afa4b-kube-api-access-lrzsg\") pod \"memcached-0\" (UID: \"4fdb728a-100d-425d-b83c-245c770afa4b\") " pod="openstack/memcached-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.824602 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4fdb728a-100d-425d-b83c-245c770afa4b-kolla-config\") pod \"memcached-0\" (UID: \"4fdb728a-100d-425d-b83c-245c770afa4b\") " pod="openstack/memcached-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.824773 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4fdb728a-100d-425d-b83c-245c770afa4b-config-data\") pod \"memcached-0\" (UID: \"4fdb728a-100d-425d-b83c-245c770afa4b\") " pod="openstack/memcached-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.866693 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.925793 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4fdb728a-100d-425d-b83c-245c770afa4b-kolla-config\") pod \"memcached-0\" (UID: \"4fdb728a-100d-425d-b83c-245c770afa4b\") " pod="openstack/memcached-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.925833 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4fdb728a-100d-425d-b83c-245c770afa4b-config-data\") pod \"memcached-0\" (UID: \"4fdb728a-100d-425d-b83c-245c770afa4b\") " pod="openstack/memcached-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.925893 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fdb728a-100d-425d-b83c-245c770afa4b-combined-ca-bundle\") pod \"memcached-0\" (UID: \"4fdb728a-100d-425d-b83c-245c770afa4b\") " pod="openstack/memcached-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.925934 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/4fdb728a-100d-425d-b83c-245c770afa4b-memcached-tls-certs\") pod \"memcached-0\" (UID: \"4fdb728a-100d-425d-b83c-245c770afa4b\") " pod="openstack/memcached-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.925954 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrzsg\" (UniqueName: \"kubernetes.io/projected/4fdb728a-100d-425d-b83c-245c770afa4b-kube-api-access-lrzsg\") pod \"memcached-0\" (UID: \"4fdb728a-100d-425d-b83c-245c770afa4b\") " pod="openstack/memcached-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.926548 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4fdb728a-100d-425d-b83c-245c770afa4b-kolla-config\") pod \"memcached-0\" (UID: \"4fdb728a-100d-425d-b83c-245c770afa4b\") " pod="openstack/memcached-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.926812 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4fdb728a-100d-425d-b83c-245c770afa4b-config-data\") pod \"memcached-0\" (UID: \"4fdb728a-100d-425d-b83c-245c770afa4b\") " pod="openstack/memcached-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.932564 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fdb728a-100d-425d-b83c-245c770afa4b-combined-ca-bundle\") pod \"memcached-0\" (UID: \"4fdb728a-100d-425d-b83c-245c770afa4b\") " pod="openstack/memcached-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.942698 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/4fdb728a-100d-425d-b83c-245c770afa4b-memcached-tls-certs\") pod \"memcached-0\" (UID: \"4fdb728a-100d-425d-b83c-245c770afa4b\") " pod="openstack/memcached-0" Dec 02 23:14:31 crc kubenswrapper[4903]: I1202 23:14:31.944815 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrzsg\" (UniqueName: \"kubernetes.io/projected/4fdb728a-100d-425d-b83c-245c770afa4b-kube-api-access-lrzsg\") pod \"memcached-0\" (UID: 
\"4fdb728a-100d-425d-b83c-245c770afa4b\") " pod="openstack/memcached-0" Dec 02 23:14:32 crc kubenswrapper[4903]: I1202 23:14:32.061690 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 02 23:14:33 crc kubenswrapper[4903]: I1202 23:14:33.148830 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 23:14:33 crc kubenswrapper[4903]: I1202 23:14:33.149846 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 23:14:33 crc kubenswrapper[4903]: I1202 23:14:33.153200 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-jcpnc" Dec 02 23:14:33 crc kubenswrapper[4903]: I1202 23:14:33.171583 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 23:14:33 crc kubenswrapper[4903]: I1202 23:14:33.251211 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whl67\" (UniqueName: \"kubernetes.io/projected/2372c82e-7656-4307-946c-155ec0d8cb3d-kube-api-access-whl67\") pod \"kube-state-metrics-0\" (UID: \"2372c82e-7656-4307-946c-155ec0d8cb3d\") " pod="openstack/kube-state-metrics-0" Dec 02 23:14:33 crc kubenswrapper[4903]: I1202 23:14:33.352264 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whl67\" (UniqueName: \"kubernetes.io/projected/2372c82e-7656-4307-946c-155ec0d8cb3d-kube-api-access-whl67\") pod \"kube-state-metrics-0\" (UID: \"2372c82e-7656-4307-946c-155ec0d8cb3d\") " pod="openstack/kube-state-metrics-0" Dec 02 23:14:33 crc kubenswrapper[4903]: I1202 23:14:33.380442 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whl67\" (UniqueName: \"kubernetes.io/projected/2372c82e-7656-4307-946c-155ec0d8cb3d-kube-api-access-whl67\") pod \"kube-state-metrics-0\" (UID: \"2372c82e-7656-4307-946c-155ec0d8cb3d\") " pod="openstack/kube-state-metrics-0" Dec 02 23:14:33 crc kubenswrapper[4903]: I1202 23:14:33.467804 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 23:14:33 crc kubenswrapper[4903]: I1202 23:14:33.895748 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f97ccc87-drwm2" event={"ID":"34aa4a0b-387e-41eb-a104-c08208065d85","Type":"ContainerStarted","Data":"9f056f15278d0daecb1091aea6a6f63937acdb683440f5a1cea4f31e71b4f017"} Dec 02 23:14:34 crc kubenswrapper[4903]: I1202 23:14:34.511924 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 23:14:34 crc kubenswrapper[4903]: I1202 23:14:34.514956 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 23:14:34 crc kubenswrapper[4903]: I1202 23:14:34.518281 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 02 23:14:34 crc kubenswrapper[4903]: I1202 23:14:34.518318 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 02 23:14:34 crc kubenswrapper[4903]: I1202 23:14:34.518333 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-5p4fs" Dec 02 23:14:34 crc kubenswrapper[4903]: I1202 23:14:34.518515 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 02 23:14:34 crc kubenswrapper[4903]: I1202 23:14:34.518730 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 02 23:14:34 crc kubenswrapper[4903]: I1202 23:14:34.530003 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 02 23:14:34 crc kubenswrapper[4903]: I1202 23:14:34.530709 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 23:14:34 crc kubenswrapper[4903]: I1202 23:14:34.670142 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-791bde7a-5990-4917-baab-d6fca61a913e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-791bde7a-5990-4917-baab-d6fca61a913e\") pod \"prometheus-metric-storage-0\" (UID: \"cf8363b9-18a6-48d7-993e-703ceccc7291\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:14:34 crc kubenswrapper[4903]: I1202 23:14:34.670197 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cf8363b9-18a6-48d7-993e-703ceccc7291-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"cf8363b9-18a6-48d7-993e-703ceccc7291\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:14:34 crc kubenswrapper[4903]: I1202 23:14:34.670246 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cf8363b9-18a6-48d7-993e-703ceccc7291-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"cf8363b9-18a6-48d7-993e-703ceccc7291\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:14:34 crc kubenswrapper[4903]: I1202 23:14:34.670265 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cf8363b9-18a6-48d7-993e-703ceccc7291-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"cf8363b9-18a6-48d7-993e-703ceccc7291\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:14:34 crc kubenswrapper[4903]: I1202 23:14:34.670294 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf8363b9-18a6-48d7-993e-703ceccc7291-config\") pod \"prometheus-metric-storage-0\" (UID: \"cf8363b9-18a6-48d7-993e-703ceccc7291\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:14:34 crc kubenswrapper[4903]: I1202 23:14:34.670315 4903 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cf8363b9-18a6-48d7-993e-703ceccc7291-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"cf8363b9-18a6-48d7-993e-703ceccc7291\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:14:34 crc kubenswrapper[4903]: I1202 23:14:34.670504 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtgzp\" (UniqueName: \"kubernetes.io/projected/cf8363b9-18a6-48d7-993e-703ceccc7291-kube-api-access-vtgzp\") pod \"prometheus-metric-storage-0\" (UID: \"cf8363b9-18a6-48d7-993e-703ceccc7291\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:14:34 crc kubenswrapper[4903]: I1202 23:14:34.670586 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cf8363b9-18a6-48d7-993e-703ceccc7291-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"cf8363b9-18a6-48d7-993e-703ceccc7291\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:14:34 crc kubenswrapper[4903]: I1202 23:14:34.772359 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-791bde7a-5990-4917-baab-d6fca61a913e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-791bde7a-5990-4917-baab-d6fca61a913e\") pod \"prometheus-metric-storage-0\" (UID: \"cf8363b9-18a6-48d7-993e-703ceccc7291\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:14:34 crc kubenswrapper[4903]: I1202 23:14:34.773410 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cf8363b9-18a6-48d7-993e-703ceccc7291-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"cf8363b9-18a6-48d7-993e-703ceccc7291\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:14:34 crc kubenswrapper[4903]: I1202 23:14:34.773439 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cf8363b9-18a6-48d7-993e-703ceccc7291-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"cf8363b9-18a6-48d7-993e-703ceccc7291\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:14:34 crc kubenswrapper[4903]: I1202 23:14:34.773457 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cf8363b9-18a6-48d7-993e-703ceccc7291-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"cf8363b9-18a6-48d7-993e-703ceccc7291\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:14:34 crc kubenswrapper[4903]: I1202 23:14:34.773481 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf8363b9-18a6-48d7-993e-703ceccc7291-config\") pod \"prometheus-metric-storage-0\" (UID: \"cf8363b9-18a6-48d7-993e-703ceccc7291\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:14:34 crc kubenswrapper[4903]: I1202 23:14:34.773496 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cf8363b9-18a6-48d7-993e-703ceccc7291-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"cf8363b9-18a6-48d7-993e-703ceccc7291\") " 
pod="openstack/prometheus-metric-storage-0" Dec 02 23:14:34 crc kubenswrapper[4903]: I1202 23:14:34.773532 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtgzp\" (UniqueName: \"kubernetes.io/projected/cf8363b9-18a6-48d7-993e-703ceccc7291-kube-api-access-vtgzp\") pod \"prometheus-metric-storage-0\" (UID: \"cf8363b9-18a6-48d7-993e-703ceccc7291\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:14:34 crc kubenswrapper[4903]: I1202 23:14:34.773556 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cf8363b9-18a6-48d7-993e-703ceccc7291-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"cf8363b9-18a6-48d7-993e-703ceccc7291\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:14:34 crc kubenswrapper[4903]: I1202 23:14:34.774985 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cf8363b9-18a6-48d7-993e-703ceccc7291-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"cf8363b9-18a6-48d7-993e-703ceccc7291\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:14:34 crc kubenswrapper[4903]: I1202 23:14:34.777878 4903 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 02 23:14:34 crc kubenswrapper[4903]: I1202 23:14:34.778012 4903 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-791bde7a-5990-4917-baab-d6fca61a913e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-791bde7a-5990-4917-baab-d6fca61a913e\") pod \"prometheus-metric-storage-0\" (UID: \"cf8363b9-18a6-48d7-993e-703ceccc7291\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/389fc60ed9b89584c09faa75d07c0667b0d3839786e48ead64fa3957a7dc98cb/globalmount\"" pod="openstack/prometheus-metric-storage-0" Dec 02 23:14:34 crc kubenswrapper[4903]: I1202 23:14:34.779806 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cf8363b9-18a6-48d7-993e-703ceccc7291-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"cf8363b9-18a6-48d7-993e-703ceccc7291\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:14:34 crc kubenswrapper[4903]: I1202 23:14:34.780259 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cf8363b9-18a6-48d7-993e-703ceccc7291-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"cf8363b9-18a6-48d7-993e-703ceccc7291\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:14:34 crc kubenswrapper[4903]: I1202 23:14:34.780568 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf8363b9-18a6-48d7-993e-703ceccc7291-config\") pod \"prometheus-metric-storage-0\" (UID: \"cf8363b9-18a6-48d7-993e-703ceccc7291\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:14:34 crc kubenswrapper[4903]: I1202 23:14:34.782280 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cf8363b9-18a6-48d7-993e-703ceccc7291-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"cf8363b9-18a6-48d7-993e-703ceccc7291\") " 
pod="openstack/prometheus-metric-storage-0" Dec 02 23:14:34 crc kubenswrapper[4903]: I1202 23:14:34.784041 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cf8363b9-18a6-48d7-993e-703ceccc7291-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"cf8363b9-18a6-48d7-993e-703ceccc7291\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:14:34 crc kubenswrapper[4903]: I1202 23:14:34.793526 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtgzp\" (UniqueName: \"kubernetes.io/projected/cf8363b9-18a6-48d7-993e-703ceccc7291-kube-api-access-vtgzp\") pod \"prometheus-metric-storage-0\" (UID: \"cf8363b9-18a6-48d7-993e-703ceccc7291\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:14:34 crc kubenswrapper[4903]: I1202 23:14:34.822782 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-791bde7a-5990-4917-baab-d6fca61a913e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-791bde7a-5990-4917-baab-d6fca61a913e\") pod \"prometheus-metric-storage-0\" (UID: \"cf8363b9-18a6-48d7-993e-703ceccc7291\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:14:34 crc kubenswrapper[4903]: I1202 23:14:34.846592 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 23:14:36 crc kubenswrapper[4903]: I1202 23:14:36.852669 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-lkt78"] Dec 02 23:14:36 crc kubenswrapper[4903]: I1202 23:14:36.854020 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lkt78" Dec 02 23:14:36 crc kubenswrapper[4903]: I1202 23:14:36.855962 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 02 23:14:36 crc kubenswrapper[4903]: I1202 23:14:36.856276 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-5mfd5" Dec 02 23:14:36 crc kubenswrapper[4903]: I1202 23:14:36.858142 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 02 23:14:36 crc kubenswrapper[4903]: I1202 23:14:36.864527 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lkt78"] Dec 02 23:14:36 crc kubenswrapper[4903]: I1202 23:14:36.896820 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-cs6mk"] Dec 02 23:14:36 crc kubenswrapper[4903]: I1202 23:14:36.898608 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-cs6mk" Dec 02 23:14:36 crc kubenswrapper[4903]: I1202 23:14:36.910812 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-cs6mk"] Dec 02 23:14:36 crc kubenswrapper[4903]: I1202 23:14:36.911875 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f72d79d2-cc88-4d82-abb4-c24c823532cb-var-run\") pod \"ovn-controller-ovs-cs6mk\" (UID: \"f72d79d2-cc88-4d82-abb4-c24c823532cb\") " pod="openstack/ovn-controller-ovs-cs6mk" Dec 02 23:14:36 crc kubenswrapper[4903]: I1202 23:14:36.911920 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bn98\" (UniqueName: \"kubernetes.io/projected/d72fba58-af32-4b1a-a883-4e76ec6dc3f4-kube-api-access-4bn98\") pod \"ovn-controller-lkt78\" (UID: \"d72fba58-af32-4b1a-a883-4e76ec6dc3f4\") " pod="openstack/ovn-controller-lkt78" Dec 02 23:14:36 crc kubenswrapper[4903]: I1202 23:14:36.912039 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f72d79d2-cc88-4d82-abb4-c24c823532cb-var-lib\") pod \"ovn-controller-ovs-cs6mk\" (UID: \"f72d79d2-cc88-4d82-abb4-c24c823532cb\") " pod="openstack/ovn-controller-ovs-cs6mk" Dec 02 23:14:36 crc kubenswrapper[4903]: I1202 23:14:36.912123 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d72fba58-af32-4b1a-a883-4e76ec6dc3f4-var-log-ovn\") pod \"ovn-controller-lkt78\" (UID: \"d72fba58-af32-4b1a-a883-4e76ec6dc3f4\") " pod="openstack/ovn-controller-lkt78" Dec 02 23:14:36 crc kubenswrapper[4903]: I1202 23:14:36.912183 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f72d79d2-cc88-4d82-abb4-c24c823532cb-scripts\") pod \"ovn-controller-ovs-cs6mk\" (UID: \"f72d79d2-cc88-4d82-abb4-c24c823532cb\") " pod="openstack/ovn-controller-ovs-cs6mk" Dec 02 23:14:36 crc kubenswrapper[4903]: I1202 23:14:36.912207 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d72fba58-af32-4b1a-a883-4e76ec6dc3f4-var-run\") pod \"ovn-controller-lkt78\" (UID: \"d72fba58-af32-4b1a-a883-4e76ec6dc3f4\") " pod="openstack/ovn-controller-lkt78" Dec 02 23:14:36 crc kubenswrapper[4903]: I1202 23:14:36.912252 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d72fba58-af32-4b1a-a883-4e76ec6dc3f4-var-run-ovn\") pod \"ovn-controller-lkt78\" (UID: \"d72fba58-af32-4b1a-a883-4e76ec6dc3f4\") " pod="openstack/ovn-controller-lkt78" Dec 02 23:14:36 crc kubenswrapper[4903]: I1202 23:14:36.912268 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f72d79d2-cc88-4d82-abb4-c24c823532cb-var-log\") pod \"ovn-controller-ovs-cs6mk\" (UID: \"f72d79d2-cc88-4d82-abb4-c24c823532cb\") " pod="openstack/ovn-controller-ovs-cs6mk" Dec 02 23:14:36 crc kubenswrapper[4903]: I1202 23:14:36.912298 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d72fba58-af32-4b1a-a883-4e76ec6dc3f4-ovn-controller-tls-certs\") pod \"ovn-controller-lkt78\" (UID: \"d72fba58-af32-4b1a-a883-4e76ec6dc3f4\") " pod="openstack/ovn-controller-lkt78" Dec 02 23:14:36 crc kubenswrapper[4903]: I1202 23:14:36.912336 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d72fba58-af32-4b1a-a883-4e76ec6dc3f4-combined-ca-bundle\") pod \"ovn-controller-lkt78\" (UID: \"d72fba58-af32-4b1a-a883-4e76ec6dc3f4\") " pod="openstack/ovn-controller-lkt78" Dec 02 23:14:36 crc kubenswrapper[4903]: I1202 23:14:36.912428 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f72d79d2-cc88-4d82-abb4-c24c823532cb-etc-ovs\") pod \"ovn-controller-ovs-cs6mk\" (UID: \"f72d79d2-cc88-4d82-abb4-c24c823532cb\") " pod="openstack/ovn-controller-ovs-cs6mk" Dec 02 23:14:36 crc kubenswrapper[4903]: I1202 23:14:36.912461 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d72fba58-af32-4b1a-a883-4e76ec6dc3f4-scripts\") pod \"ovn-controller-lkt78\" (UID: \"d72fba58-af32-4b1a-a883-4e76ec6dc3f4\") " pod="openstack/ovn-controller-lkt78" Dec 02 23:14:36 crc kubenswrapper[4903]: I1202 23:14:36.912484 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qp9h\" (UniqueName: \"kubernetes.io/projected/f72d79d2-cc88-4d82-abb4-c24c823532cb-kube-api-access-2qp9h\") pod \"ovn-controller-ovs-cs6mk\" (UID: \"f72d79d2-cc88-4d82-abb4-c24c823532cb\") " pod="openstack/ovn-controller-ovs-cs6mk" Dec 02 23:14:37 crc kubenswrapper[4903]: I1202 23:14:37.012913 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d72fba58-af32-4b1a-a883-4e76ec6dc3f4-var-run-ovn\") pod \"ovn-controller-lkt78\" (UID: \"d72fba58-af32-4b1a-a883-4e76ec6dc3f4\") " pod="openstack/ovn-controller-lkt78" Dec 02 23:14:37 crc kubenswrapper[4903]: I1202 23:14:37.012946 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f72d79d2-cc88-4d82-abb4-c24c823532cb-var-log\") pod \"ovn-controller-ovs-cs6mk\" (UID: \"f72d79d2-cc88-4d82-abb4-c24c823532cb\") " pod="openstack/ovn-controller-ovs-cs6mk" Dec 02 23:14:37 crc kubenswrapper[4903]: I1202 23:14:37.012967 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d72fba58-af32-4b1a-a883-4e76ec6dc3f4-ovn-controller-tls-certs\") pod \"ovn-controller-lkt78\" (UID: \"d72fba58-af32-4b1a-a883-4e76ec6dc3f4\") " pod="openstack/ovn-controller-lkt78" Dec 02 23:14:37 crc kubenswrapper[4903]: I1202 23:14:37.012986 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d72fba58-af32-4b1a-a883-4e76ec6dc3f4-combined-ca-bundle\") pod \"ovn-controller-lkt78\" (UID: \"d72fba58-af32-4b1a-a883-4e76ec6dc3f4\") " pod="openstack/ovn-controller-lkt78" Dec 02 23:14:37 crc kubenswrapper[4903]: I1202 23:14:37.013019 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f72d79d2-cc88-4d82-abb4-c24c823532cb-etc-ovs\") pod 
\"ovn-controller-ovs-cs6mk\" (UID: \"f72d79d2-cc88-4d82-abb4-c24c823532cb\") " pod="openstack/ovn-controller-ovs-cs6mk" Dec 02 23:14:37 crc kubenswrapper[4903]: I1202 23:14:37.013041 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d72fba58-af32-4b1a-a883-4e76ec6dc3f4-scripts\") pod \"ovn-controller-lkt78\" (UID: \"d72fba58-af32-4b1a-a883-4e76ec6dc3f4\") " pod="openstack/ovn-controller-lkt78" Dec 02 23:14:37 crc kubenswrapper[4903]: I1202 23:14:37.013063 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qp9h\" (UniqueName: \"kubernetes.io/projected/f72d79d2-cc88-4d82-abb4-c24c823532cb-kube-api-access-2qp9h\") pod \"ovn-controller-ovs-cs6mk\" (UID: \"f72d79d2-cc88-4d82-abb4-c24c823532cb\") " pod="openstack/ovn-controller-ovs-cs6mk" Dec 02 23:14:37 crc kubenswrapper[4903]: I1202 23:14:37.013116 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f72d79d2-cc88-4d82-abb4-c24c823532cb-var-run\") pod \"ovn-controller-ovs-cs6mk\" (UID: \"f72d79d2-cc88-4d82-abb4-c24c823532cb\") " pod="openstack/ovn-controller-ovs-cs6mk" Dec 02 23:14:37 crc kubenswrapper[4903]: I1202 23:14:37.013131 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bn98\" (UniqueName: \"kubernetes.io/projected/d72fba58-af32-4b1a-a883-4e76ec6dc3f4-kube-api-access-4bn98\") pod \"ovn-controller-lkt78\" (UID: \"d72fba58-af32-4b1a-a883-4e76ec6dc3f4\") " pod="openstack/ovn-controller-lkt78" Dec 02 23:14:37 crc kubenswrapper[4903]: I1202 23:14:37.013429 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f72d79d2-cc88-4d82-abb4-c24c823532cb-var-log\") pod \"ovn-controller-ovs-cs6mk\" (UID: \"f72d79d2-cc88-4d82-abb4-c24c823532cb\") " pod="openstack/ovn-controller-ovs-cs6mk" Dec 02 23:14:37 crc kubenswrapper[4903]: I1202 23:14:37.013448 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f72d79d2-cc88-4d82-abb4-c24c823532cb-etc-ovs\") pod \"ovn-controller-ovs-cs6mk\" (UID: \"f72d79d2-cc88-4d82-abb4-c24c823532cb\") " pod="openstack/ovn-controller-ovs-cs6mk" Dec 02 23:14:37 crc kubenswrapper[4903]: I1202 23:14:37.013507 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f72d79d2-cc88-4d82-abb4-c24c823532cb-var-run\") pod \"ovn-controller-ovs-cs6mk\" (UID: \"f72d79d2-cc88-4d82-abb4-c24c823532cb\") " pod="openstack/ovn-controller-ovs-cs6mk" Dec 02 23:14:37 crc kubenswrapper[4903]: I1202 23:14:37.013460 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f72d79d2-cc88-4d82-abb4-c24c823532cb-var-lib\") pod \"ovn-controller-ovs-cs6mk\" (UID: \"f72d79d2-cc88-4d82-abb4-c24c823532cb\") " pod="openstack/ovn-controller-ovs-cs6mk" Dec 02 23:14:37 crc kubenswrapper[4903]: I1202 23:14:37.013576 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d72fba58-af32-4b1a-a883-4e76ec6dc3f4-var-log-ovn\") pod \"ovn-controller-lkt78\" (UID: \"d72fba58-af32-4b1a-a883-4e76ec6dc3f4\") " pod="openstack/ovn-controller-lkt78" Dec 02 23:14:37 crc kubenswrapper[4903]: I1202 23:14:37.013612 4903 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f72d79d2-cc88-4d82-abb4-c24c823532cb-scripts\") pod \"ovn-controller-ovs-cs6mk\" (UID: \"f72d79d2-cc88-4d82-abb4-c24c823532cb\") " pod="openstack/ovn-controller-ovs-cs6mk" Dec 02 23:14:37 crc kubenswrapper[4903]: I1202 23:14:37.013630 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d72fba58-af32-4b1a-a883-4e76ec6dc3f4-var-run\") pod \"ovn-controller-lkt78\" (UID: \"d72fba58-af32-4b1a-a883-4e76ec6dc3f4\") " pod="openstack/ovn-controller-lkt78" Dec 02 23:14:37 crc kubenswrapper[4903]: I1202 23:14:37.013664 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f72d79d2-cc88-4d82-abb4-c24c823532cb-var-lib\") pod \"ovn-controller-ovs-cs6mk\" (UID: \"f72d79d2-cc88-4d82-abb4-c24c823532cb\") " pod="openstack/ovn-controller-ovs-cs6mk" Dec 02 23:14:37 crc kubenswrapper[4903]: I1202 23:14:37.013718 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d72fba58-af32-4b1a-a883-4e76ec6dc3f4-var-run-ovn\") pod \"ovn-controller-lkt78\" (UID: \"d72fba58-af32-4b1a-a883-4e76ec6dc3f4\") " pod="openstack/ovn-controller-lkt78" Dec 02 23:14:37 crc kubenswrapper[4903]: I1202 23:14:37.013781 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d72fba58-af32-4b1a-a883-4e76ec6dc3f4-var-run\") pod \"ovn-controller-lkt78\" (UID: \"d72fba58-af32-4b1a-a883-4e76ec6dc3f4\") " pod="openstack/ovn-controller-lkt78" Dec 02 23:14:37 crc kubenswrapper[4903]: I1202 23:14:37.013869 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d72fba58-af32-4b1a-a883-4e76ec6dc3f4-var-log-ovn\") pod \"ovn-controller-lkt78\" (UID: \"d72fba58-af32-4b1a-a883-4e76ec6dc3f4\") " pod="openstack/ovn-controller-lkt78" Dec 02 23:14:37 crc kubenswrapper[4903]: I1202 23:14:37.015704 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f72d79d2-cc88-4d82-abb4-c24c823532cb-scripts\") pod \"ovn-controller-ovs-cs6mk\" (UID: \"f72d79d2-cc88-4d82-abb4-c24c823532cb\") " pod="openstack/ovn-controller-ovs-cs6mk" Dec 02 23:14:37 crc kubenswrapper[4903]: I1202 23:14:37.015709 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d72fba58-af32-4b1a-a883-4e76ec6dc3f4-scripts\") pod \"ovn-controller-lkt78\" (UID: \"d72fba58-af32-4b1a-a883-4e76ec6dc3f4\") " pod="openstack/ovn-controller-lkt78" Dec 02 23:14:37 crc kubenswrapper[4903]: I1202 23:14:37.019139 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d72fba58-af32-4b1a-a883-4e76ec6dc3f4-combined-ca-bundle\") pod \"ovn-controller-lkt78\" (UID: \"d72fba58-af32-4b1a-a883-4e76ec6dc3f4\") " pod="openstack/ovn-controller-lkt78" Dec 02 23:14:37 crc kubenswrapper[4903]: I1202 23:14:37.028414 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d72fba58-af32-4b1a-a883-4e76ec6dc3f4-ovn-controller-tls-certs\") pod \"ovn-controller-lkt78\" (UID: \"d72fba58-af32-4b1a-a883-4e76ec6dc3f4\") " pod="openstack/ovn-controller-lkt78" Dec 02 23:14:37 crc 
kubenswrapper[4903]: I1202 23:14:37.032465 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bn98\" (UniqueName: \"kubernetes.io/projected/d72fba58-af32-4b1a-a883-4e76ec6dc3f4-kube-api-access-4bn98\") pod \"ovn-controller-lkt78\" (UID: \"d72fba58-af32-4b1a-a883-4e76ec6dc3f4\") " pod="openstack/ovn-controller-lkt78" Dec 02 23:14:37 crc kubenswrapper[4903]: I1202 23:14:37.033126 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qp9h\" (UniqueName: \"kubernetes.io/projected/f72d79d2-cc88-4d82-abb4-c24c823532cb-kube-api-access-2qp9h\") pod \"ovn-controller-ovs-cs6mk\" (UID: \"f72d79d2-cc88-4d82-abb4-c24c823532cb\") " pod="openstack/ovn-controller-ovs-cs6mk" Dec 02 23:14:37 crc kubenswrapper[4903]: I1202 23:14:37.174606 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lkt78" Dec 02 23:14:37 crc kubenswrapper[4903]: I1202 23:14:37.219115 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-cs6mk" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.500805 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.503829 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.506972 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-kcs2c" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.507019 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.507074 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.507072 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.507269 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.510500 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.583062 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/635dddd5-1a09-4f9e-b82f-e45eee76b412-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"635dddd5-1a09-4f9e-b82f-e45eee76b412\") " pod="openstack/ovsdbserver-nb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.583228 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"635dddd5-1a09-4f9e-b82f-e45eee76b412\") " pod="openstack/ovsdbserver-nb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.583336 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/635dddd5-1a09-4f9e-b82f-e45eee76b412-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"635dddd5-1a09-4f9e-b82f-e45eee76b412\") " 
pod="openstack/ovsdbserver-nb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.583566 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/635dddd5-1a09-4f9e-b82f-e45eee76b412-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"635dddd5-1a09-4f9e-b82f-e45eee76b412\") " pod="openstack/ovsdbserver-nb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.583641 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k67db\" (UniqueName: \"kubernetes.io/projected/635dddd5-1a09-4f9e-b82f-e45eee76b412-kube-api-access-k67db\") pod \"ovsdbserver-nb-0\" (UID: \"635dddd5-1a09-4f9e-b82f-e45eee76b412\") " pod="openstack/ovsdbserver-nb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.583776 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/635dddd5-1a09-4f9e-b82f-e45eee76b412-config\") pod \"ovsdbserver-nb-0\" (UID: \"635dddd5-1a09-4f9e-b82f-e45eee76b412\") " pod="openstack/ovsdbserver-nb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.583848 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/635dddd5-1a09-4f9e-b82f-e45eee76b412-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"635dddd5-1a09-4f9e-b82f-e45eee76b412\") " pod="openstack/ovsdbserver-nb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.583974 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/635dddd5-1a09-4f9e-b82f-e45eee76b412-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"635dddd5-1a09-4f9e-b82f-e45eee76b412\") " pod="openstack/ovsdbserver-nb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.685289 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/635dddd5-1a09-4f9e-b82f-e45eee76b412-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"635dddd5-1a09-4f9e-b82f-e45eee76b412\") " pod="openstack/ovsdbserver-nb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.685364 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"635dddd5-1a09-4f9e-b82f-e45eee76b412\") " pod="openstack/ovsdbserver-nb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.685394 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/635dddd5-1a09-4f9e-b82f-e45eee76b412-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"635dddd5-1a09-4f9e-b82f-e45eee76b412\") " pod="openstack/ovsdbserver-nb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.685456 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/635dddd5-1a09-4f9e-b82f-e45eee76b412-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"635dddd5-1a09-4f9e-b82f-e45eee76b412\") " pod="openstack/ovsdbserver-nb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.685491 4903 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-k67db\" (UniqueName: \"kubernetes.io/projected/635dddd5-1a09-4f9e-b82f-e45eee76b412-kube-api-access-k67db\") pod \"ovsdbserver-nb-0\" (UID: \"635dddd5-1a09-4f9e-b82f-e45eee76b412\") " pod="openstack/ovsdbserver-nb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.685525 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/635dddd5-1a09-4f9e-b82f-e45eee76b412-config\") pod \"ovsdbserver-nb-0\" (UID: \"635dddd5-1a09-4f9e-b82f-e45eee76b412\") " pod="openstack/ovsdbserver-nb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.685554 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/635dddd5-1a09-4f9e-b82f-e45eee76b412-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"635dddd5-1a09-4f9e-b82f-e45eee76b412\") " pod="openstack/ovsdbserver-nb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.685865 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/635dddd5-1a09-4f9e-b82f-e45eee76b412-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"635dddd5-1a09-4f9e-b82f-e45eee76b412\") " pod="openstack/ovsdbserver-nb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.686612 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/635dddd5-1a09-4f9e-b82f-e45eee76b412-config\") pod \"ovsdbserver-nb-0\" (UID: \"635dddd5-1a09-4f9e-b82f-e45eee76b412\") " pod="openstack/ovsdbserver-nb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.686756 4903 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"635dddd5-1a09-4f9e-b82f-e45eee76b412\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-nb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.687417 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/635dddd5-1a09-4f9e-b82f-e45eee76b412-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"635dddd5-1a09-4f9e-b82f-e45eee76b412\") " pod="openstack/ovsdbserver-nb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.690002 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/635dddd5-1a09-4f9e-b82f-e45eee76b412-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"635dddd5-1a09-4f9e-b82f-e45eee76b412\") " pod="openstack/ovsdbserver-nb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.692807 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/635dddd5-1a09-4f9e-b82f-e45eee76b412-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"635dddd5-1a09-4f9e-b82f-e45eee76b412\") " pod="openstack/ovsdbserver-nb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.692898 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/635dddd5-1a09-4f9e-b82f-e45eee76b412-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"635dddd5-1a09-4f9e-b82f-e45eee76b412\") " pod="openstack/ovsdbserver-nb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.693901 4903 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/635dddd5-1a09-4f9e-b82f-e45eee76b412-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"635dddd5-1a09-4f9e-b82f-e45eee76b412\") " pod="openstack/ovsdbserver-nb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.717174 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k67db\" (UniqueName: \"kubernetes.io/projected/635dddd5-1a09-4f9e-b82f-e45eee76b412-kube-api-access-k67db\") pod \"ovsdbserver-nb-0\" (UID: \"635dddd5-1a09-4f9e-b82f-e45eee76b412\") " pod="openstack/ovsdbserver-nb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.717529 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"635dddd5-1a09-4f9e-b82f-e45eee76b412\") " pod="openstack/ovsdbserver-nb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.730321 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.732887 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.737231 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.737519 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-pgftd" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.737788 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.743774 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.749844 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.788716 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv6cn\" (UniqueName: \"kubernetes.io/projected/4ab22df5-5c0a-42c6-a881-4529dd331e5f-kube-api-access-pv6cn\") pod \"ovsdbserver-sb-0\" (UID: \"4ab22df5-5c0a-42c6-a881-4529dd331e5f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.788789 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4ab22df5-5c0a-42c6-a881-4529dd331e5f-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4ab22df5-5c0a-42c6-a881-4529dd331e5f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.788896 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ab22df5-5c0a-42c6-a881-4529dd331e5f-config\") pod \"ovsdbserver-sb-0\" (UID: \"4ab22df5-5c0a-42c6-a881-4529dd331e5f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.788996 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4ab22df5-5c0a-42c6-a881-4529dd331e5f-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4ab22df5-5c0a-42c6-a881-4529dd331e5f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.789110 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ab22df5-5c0a-42c6-a881-4529dd331e5f-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4ab22df5-5c0a-42c6-a881-4529dd331e5f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.789177 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ab22df5-5c0a-42c6-a881-4529dd331e5f-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4ab22df5-5c0a-42c6-a881-4529dd331e5f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.789450 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ab22df5-5c0a-42c6-a881-4529dd331e5f-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4ab22df5-5c0a-42c6-a881-4529dd331e5f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.789526 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4ab22df5-5c0a-42c6-a881-4529dd331e5f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.832340 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.890629 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv6cn\" (UniqueName: \"kubernetes.io/projected/4ab22df5-5c0a-42c6-a881-4529dd331e5f-kube-api-access-pv6cn\") pod \"ovsdbserver-sb-0\" (UID: \"4ab22df5-5c0a-42c6-a881-4529dd331e5f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.890710 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4ab22df5-5c0a-42c6-a881-4529dd331e5f-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4ab22df5-5c0a-42c6-a881-4529dd331e5f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.890735 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ab22df5-5c0a-42c6-a881-4529dd331e5f-config\") pod \"ovsdbserver-sb-0\" (UID: \"4ab22df5-5c0a-42c6-a881-4529dd331e5f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.890772 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ab22df5-5c0a-42c6-a881-4529dd331e5f-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4ab22df5-5c0a-42c6-a881-4529dd331e5f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.890808 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ab22df5-5c0a-42c6-a881-4529dd331e5f-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4ab22df5-5c0a-42c6-a881-4529dd331e5f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.890829 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ab22df5-5c0a-42c6-a881-4529dd331e5f-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4ab22df5-5c0a-42c6-a881-4529dd331e5f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.890876 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ab22df5-5c0a-42c6-a881-4529dd331e5f-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4ab22df5-5c0a-42c6-a881-4529dd331e5f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.890895 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4ab22df5-5c0a-42c6-a881-4529dd331e5f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.891047 4903 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4ab22df5-5c0a-42c6-a881-4529dd331e5f\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-sb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.891352 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/4ab22df5-5c0a-42c6-a881-4529dd331e5f-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4ab22df5-5c0a-42c6-a881-4529dd331e5f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.892532 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ab22df5-5c0a-42c6-a881-4529dd331e5f-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4ab22df5-5c0a-42c6-a881-4529dd331e5f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.893352 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ab22df5-5c0a-42c6-a881-4529dd331e5f-config\") pod \"ovsdbserver-sb-0\" (UID: \"4ab22df5-5c0a-42c6-a881-4529dd331e5f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.895312 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ab22df5-5c0a-42c6-a881-4529dd331e5f-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4ab22df5-5c0a-42c6-a881-4529dd331e5f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.898258 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ab22df5-5c0a-42c6-a881-4529dd331e5f-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4ab22df5-5c0a-42c6-a881-4529dd331e5f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.898270 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ab22df5-5c0a-42c6-a881-4529dd331e5f-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4ab22df5-5c0a-42c6-a881-4529dd331e5f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.909990 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv6cn\" (UniqueName: \"kubernetes.io/projected/4ab22df5-5c0a-42c6-a881-4529dd331e5f-kube-api-access-pv6cn\") pod \"ovsdbserver-sb-0\" (UID: \"4ab22df5-5c0a-42c6-a881-4529dd331e5f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 23:14:40 crc kubenswrapper[4903]: I1202 23:14:40.913432 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4ab22df5-5c0a-42c6-a881-4529dd331e5f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 23:14:41 crc kubenswrapper[4903]: I1202 23:14:41.078555 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 02 23:14:42 crc kubenswrapper[4903]: I1202 23:14:42.030205 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 02 23:14:42 crc kubenswrapper[4903]: I1202 23:14:42.119218 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 23:14:42 crc kubenswrapper[4903]: W1202 23:14:42.451055 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3afcfb6b_f7ce_424a_be67_3ef69a367fdb.slice/crio-a02a93d29467d0aef12905fadb2d077c02f88559bbd115e1471a4ef683fcfbfa WatchSource:0}: Error finding container a02a93d29467d0aef12905fadb2d077c02f88559bbd115e1471a4ef683fcfbfa: Status 404 returned error can't find the container with id a02a93d29467d0aef12905fadb2d077c02f88559bbd115e1471a4ef683fcfbfa Dec 02 23:14:42 crc kubenswrapper[4903]: W1202 23:14:42.452071 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3fa7901_a49c_433f_942c_a875c9ecd2ab.slice/crio-95f5935d96a8c5b45bce523cac7ba133bb5e32965428ded95892e86215cbe4b0 WatchSource:0}: Error finding container 95f5935d96a8c5b45bce523cac7ba133bb5e32965428ded95892e86215cbe4b0: Status 404 returned error can't find the container with id 95f5935d96a8c5b45bce523cac7ba133bb5e32965428ded95892e86215cbe4b0 Dec 02 23:14:42 crc kubenswrapper[4903]: E1202 23:14:42.476712 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.2:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Dec 02 23:14:42 crc kubenswrapper[4903]: E1202 23:14:42.476768 4903 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.2:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Dec 02 23:14:42 crc kubenswrapper[4903]: E1202 23:14:42.476911 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.2:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hd7p8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-76c98497-bsvrh_openstack(5e3f2c63-d702-499c-843a-4dac4bc26ee4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 23:14:42 crc kubenswrapper[4903]: E1202 23:14:42.478154 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-76c98497-bsvrh" podUID="5e3f2c63-d702-499c-843a-4dac4bc26ee4" Dec 02 23:14:42 crc kubenswrapper[4903]: E1202 23:14:42.497224 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.2:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Dec 02 23:14:42 crc kubenswrapper[4903]: E1202 23:14:42.497277 4903 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.2:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Dec 02 23:14:42 crc kubenswrapper[4903]: E1202 23:14:42.497407 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.2:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kc4xv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-588d85ddfc-j2885_openstack(760b3fce-f12e-425b-8039-2bd47f1e8651): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 23:14:42 crc kubenswrapper[4903]: E1202 23:14:42.498712 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-588d85ddfc-j2885" podUID="760b3fce-f12e-425b-8039-2bd47f1e8651" Dec 02 23:14:42 crc kubenswrapper[4903]: E1202 23:14:42.510883 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.2:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Dec 02 23:14:42 crc kubenswrapper[4903]: E1202 23:14:42.510943 4903 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.2:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Dec 02 23:14:42 crc kubenswrapper[4903]: E1202 23:14:42.511069 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.2:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j7qmc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-65885745f9-jm2v7_openstack(19d3be82-34ff-443d-804a-61dec2277259): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 23:14:42 crc kubenswrapper[4903]: E1202 23:14:42.512479 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-65885745f9-jm2v7" podUID="19d3be82-34ff-443d-804a-61dec2277259" Dec 02 23:14:42 crc kubenswrapper[4903]: E1202 23:14:42.538877 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.2:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Dec 02 23:14:42 crc kubenswrapper[4903]: E1202 23:14:42.539182 4903 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.2:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Dec 02 23:14:42 crc kubenswrapper[4903]: E1202 23:14:42.539316 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.2:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hmqrx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5df5fc89fc-vv75w_openstack(8f625744-98dc-464f-896f-d9cdf237751e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 23:14:42 crc kubenswrapper[4903]: E1202 23:14:42.541047 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5df5fc89fc-vv75w" podUID="8f625744-98dc-464f-896f-d9cdf237751e" Dec 02 23:14:42 crc kubenswrapper[4903]: I1202 23:14:42.932237 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 23:14:42 crc kubenswrapper[4903]: I1202 23:14:42.951772 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Dec 02 23:14:42 crc kubenswrapper[4903]: W1202 23:14:42.966865 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1743f362_cc56_4c25_a31d_7a78f269f570.slice/crio-81ca914cc8757707bf88b5061d4f0283fee1578c60eaf59983c41918a7d4a0dd WatchSource:0}: Error finding container 81ca914cc8757707bf88b5061d4f0283fee1578c60eaf59983c41918a7d4a0dd: Status 404 returned error can't find the container with id 81ca914cc8757707bf88b5061d4f0283fee1578c60eaf59983c41918a7d4a0dd Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.020399 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1743f362-cc56-4c25-a31d-7a78f269f570","Type":"ContainerStarted","Data":"81ca914cc8757707bf88b5061d4f0283fee1578c60eaf59983c41918a7d4a0dd"} Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.021186 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-notifications-server-0" event={"ID":"adbb82a2-c30f-4e59-be9c-9274739caf25","Type":"ContainerStarted","Data":"325839442bc8df3e31a435954a24759335354b83abaa86e765d8eb186684d343"} Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.023389 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3afcfb6b-f7ce-424a-be67-3ef69a367fdb","Type":"ContainerStarted","Data":"a02a93d29467d0aef12905fadb2d077c02f88559bbd115e1471a4ef683fcfbfa"} Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.024405 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a3fa7901-a49c-433f-942c-a875c9ecd2ab","Type":"ContainerStarted","Data":"95f5935d96a8c5b45bce523cac7ba133bb5e32965428ded95892e86215cbe4b0"} Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.190218 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.201391 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.207743 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.323980 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 02 23:14:43 crc kubenswrapper[4903]: W1202 23:14:43.358761 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ab22df5_5c0a_42c6_a881_4529dd331e5f.slice/crio-788941fb066155fb358edd688fd2b9ecd40d14cb737d7883e0ad72344a0405c8 WatchSource:0}: Error finding container 788941fb066155fb358edd688fd2b9ecd40d14cb737d7883e0ad72344a0405c8: Status 404 returned error can't find the container with id 788941fb066155fb358edd688fd2b9ecd40d14cb737d7883e0ad72344a0405c8 Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.571443 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-588d85ddfc-j2885" Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.599002 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5df5fc89fc-vv75w" Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.609897 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lkt78"] Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.617074 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76c98497-bsvrh" Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.644325 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hd7p8\" (UniqueName: \"kubernetes.io/projected/5e3f2c63-d702-499c-843a-4dac4bc26ee4-kube-api-access-hd7p8\") pod \"5e3f2c63-d702-499c-843a-4dac4bc26ee4\" (UID: \"5e3f2c63-d702-499c-843a-4dac4bc26ee4\") " Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.644369 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e3f2c63-d702-499c-843a-4dac4bc26ee4-dns-svc\") pod \"5e3f2c63-d702-499c-843a-4dac4bc26ee4\" (UID: \"5e3f2c63-d702-499c-843a-4dac4bc26ee4\") " Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.644437 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kc4xv\" (UniqueName: \"kubernetes.io/projected/760b3fce-f12e-425b-8039-2bd47f1e8651-kube-api-access-kc4xv\") pod \"760b3fce-f12e-425b-8039-2bd47f1e8651\" (UID: \"760b3fce-f12e-425b-8039-2bd47f1e8651\") " Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.644492 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/760b3fce-f12e-425b-8039-2bd47f1e8651-config\") pod \"760b3fce-f12e-425b-8039-2bd47f1e8651\" (UID: \"760b3fce-f12e-425b-8039-2bd47f1e8651\") " Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.644512 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f625744-98dc-464f-896f-d9cdf237751e-config\") pod \"8f625744-98dc-464f-896f-d9cdf237751e\" (UID: \"8f625744-98dc-464f-896f-d9cdf237751e\") " Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.644549 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f625744-98dc-464f-896f-d9cdf237751e-dns-svc\") pod \"8f625744-98dc-464f-896f-d9cdf237751e\" (UID: \"8f625744-98dc-464f-896f-d9cdf237751e\") " Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.644581 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e3f2c63-d702-499c-843a-4dac4bc26ee4-config\") pod \"5e3f2c63-d702-499c-843a-4dac4bc26ee4\" (UID: \"5e3f2c63-d702-499c-843a-4dac4bc26ee4\") " Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.644627 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmqrx\" (UniqueName: \"kubernetes.io/projected/8f625744-98dc-464f-896f-d9cdf237751e-kube-api-access-hmqrx\") pod \"8f625744-98dc-464f-896f-d9cdf237751e\" (UID: \"8f625744-98dc-464f-896f-d9cdf237751e\") " Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.644940 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e3f2c63-d702-499c-843a-4dac4bc26ee4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5e3f2c63-d702-499c-843a-4dac4bc26ee4" (UID: "5e3f2c63-d702-499c-843a-4dac4bc26ee4"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.645154 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f625744-98dc-464f-896f-d9cdf237751e-config" (OuterVolumeSpecName: "config") pod "8f625744-98dc-464f-896f-d9cdf237751e" (UID: "8f625744-98dc-464f-896f-d9cdf237751e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.645628 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/760b3fce-f12e-425b-8039-2bd47f1e8651-config" (OuterVolumeSpecName: "config") pod "760b3fce-f12e-425b-8039-2bd47f1e8651" (UID: "760b3fce-f12e-425b-8039-2bd47f1e8651"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.645786 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f625744-98dc-464f-896f-d9cdf237751e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8f625744-98dc-464f-896f-d9cdf237751e" (UID: "8f625744-98dc-464f-896f-d9cdf237751e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.646594 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e3f2c63-d702-499c-843a-4dac4bc26ee4-config" (OuterVolumeSpecName: "config") pod "5e3f2c63-d702-499c-843a-4dac4bc26ee4" (UID: "5e3f2c63-d702-499c-843a-4dac4bc26ee4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.653127 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e3f2c63-d702-499c-843a-4dac4bc26ee4-kube-api-access-hd7p8" (OuterVolumeSpecName: "kube-api-access-hd7p8") pod "5e3f2c63-d702-499c-843a-4dac4bc26ee4" (UID: "5e3f2c63-d702-499c-843a-4dac4bc26ee4"). InnerVolumeSpecName "kube-api-access-hd7p8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.653166 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f625744-98dc-464f-896f-d9cdf237751e-kube-api-access-hmqrx" (OuterVolumeSpecName: "kube-api-access-hmqrx") pod "8f625744-98dc-464f-896f-d9cdf237751e" (UID: "8f625744-98dc-464f-896f-d9cdf237751e"). InnerVolumeSpecName "kube-api-access-hmqrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.653185 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/760b3fce-f12e-425b-8039-2bd47f1e8651-kube-api-access-kc4xv" (OuterVolumeSpecName: "kube-api-access-kc4xv") pod "760b3fce-f12e-425b-8039-2bd47f1e8651" (UID: "760b3fce-f12e-425b-8039-2bd47f1e8651"). InnerVolumeSpecName "kube-api-access-kc4xv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.674092 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.675055 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 02 23:14:43 crc kubenswrapper[4903]: W1202 23:14:43.689414 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd72fba58_af32_4b1a_a883_4e76ec6dc3f4.slice/crio-288dae4f76b8551944f8246fa27b0235faa66e61e441508eedca2af00c2c4770 WatchSource:0}: Error finding container 288dae4f76b8551944f8246fa27b0235faa66e61e441508eedca2af00c2c4770: Status 404 returned error can't find the container with id 288dae4f76b8551944f8246fa27b0235faa66e61e441508eedca2af00c2c4770 Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.746963 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e3f2c63-d702-499c-843a-4dac4bc26ee4-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.746993 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmqrx\" (UniqueName: \"kubernetes.io/projected/8f625744-98dc-464f-896f-d9cdf237751e-kube-api-access-hmqrx\") on node \"crc\" DevicePath \"\"" Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.747005 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hd7p8\" (UniqueName: \"kubernetes.io/projected/5e3f2c63-d702-499c-843a-4dac4bc26ee4-kube-api-access-hd7p8\") on node \"crc\" DevicePath \"\"" Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.747013 4903 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e3f2c63-d702-499c-843a-4dac4bc26ee4-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.747022 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kc4xv\" (UniqueName: \"kubernetes.io/projected/760b3fce-f12e-425b-8039-2bd47f1e8651-kube-api-access-kc4xv\") on node \"crc\" DevicePath \"\"" Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.747031 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/760b3fce-f12e-425b-8039-2bd47f1e8651-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.747039 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f625744-98dc-464f-896f-d9cdf237751e-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.747048 4903 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f625744-98dc-464f-896f-d9cdf237751e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 23:14:43 crc kubenswrapper[4903]: I1202 23:14:43.788763 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-cs6mk"] Dec 02 23:14:43 crc kubenswrapper[4903]: W1202 23:14:43.797012 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf72d79d2_cc88_4d82_abb4_c24c823532cb.slice/crio-830304d843c070d6f5121ce6a744b54e307b12baf42517d5675fb6401dc420d3 WatchSource:0}: Error finding container 
830304d843c070d6f5121ce6a744b54e307b12baf42517d5675fb6401dc420d3: Status 404 returned error can't find the container with id 830304d843c070d6f5121ce6a744b54e307b12baf42517d5675fb6401dc420d3 Dec 02 23:14:44 crc kubenswrapper[4903]: I1202 23:14:44.032842 4903 generic.go:334] "Generic (PLEG): container finished" podID="19d3be82-34ff-443d-804a-61dec2277259" containerID="4ca39b7f2ef69cac9e2ed29f956b06f6bac7985a7657297815619ddfc2988ca3" exitCode=0 Dec 02 23:14:44 crc kubenswrapper[4903]: I1202 23:14:44.032932 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65885745f9-jm2v7" event={"ID":"19d3be82-34ff-443d-804a-61dec2277259","Type":"ContainerDied","Data":"4ca39b7f2ef69cac9e2ed29f956b06f6bac7985a7657297815619ddfc2988ca3"} Dec 02 23:14:44 crc kubenswrapper[4903]: I1202 23:14:44.035278 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"4fdb728a-100d-425d-b83c-245c770afa4b","Type":"ContainerStarted","Data":"4af814d13883420e0529604d3eb586c21173d4eff590126776cf1d386b358fb8"} Dec 02 23:14:44 crc kubenswrapper[4903]: I1202 23:14:44.037220 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6eaac3fd-8033-42cd-90c3-5dfac716ae66","Type":"ContainerStarted","Data":"664edcc6ac31b11b5d359490bd7187f00b7e3753359fdcdf70a1bc7d14c49bc3"} Dec 02 23:14:44 crc kubenswrapper[4903]: I1202 23:14:44.038168 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2372c82e-7656-4307-946c-155ec0d8cb3d","Type":"ContainerStarted","Data":"ccb7866320f01c6bda0137bc67a76d6abee63a337e9d30ca81581db00ba51d31"} Dec 02 23:14:44 crc kubenswrapper[4903]: I1202 23:14:44.039522 4903 generic.go:334] "Generic (PLEG): container finished" podID="34aa4a0b-387e-41eb-a104-c08208065d85" containerID="141df112c010b010d6ea9489f023bec2559d66fe76270c431e79f5e4678dd142" exitCode=0 Dec 02 23:14:44 crc kubenswrapper[4903]: I1202 23:14:44.039560 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f97ccc87-drwm2" event={"ID":"34aa4a0b-387e-41eb-a104-c08208065d85","Type":"ContainerDied","Data":"141df112c010b010d6ea9489f023bec2559d66fe76270c431e79f5e4678dd142"} Dec 02 23:14:44 crc kubenswrapper[4903]: I1202 23:14:44.040366 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4ab22df5-5c0a-42c6-a881-4529dd331e5f","Type":"ContainerStarted","Data":"788941fb066155fb358edd688fd2b9ecd40d14cb737d7883e0ad72344a0405c8"} Dec 02 23:14:44 crc kubenswrapper[4903]: I1202 23:14:44.041423 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lkt78" event={"ID":"d72fba58-af32-4b1a-a883-4e76ec6dc3f4","Type":"ContainerStarted","Data":"288dae4f76b8551944f8246fa27b0235faa66e61e441508eedca2af00c2c4770"} Dec 02 23:14:44 crc kubenswrapper[4903]: I1202 23:14:44.042243 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5df5fc89fc-vv75w" event={"ID":"8f625744-98dc-464f-896f-d9cdf237751e","Type":"ContainerDied","Data":"2f801adee6186b627cfc918740b1062fa7ba342bdfcbf726457784b1e5997819"} Dec 02 23:14:44 crc kubenswrapper[4903]: I1202 23:14:44.042302 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5df5fc89fc-vv75w" Dec 02 23:14:44 crc kubenswrapper[4903]: I1202 23:14:44.043821 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76c98497-bsvrh" event={"ID":"5e3f2c63-d702-499c-843a-4dac4bc26ee4","Type":"ContainerDied","Data":"e8300c8d0a4574d99a74426641a078048882702d005774b72d5c486254ce54d4"} Dec 02 23:14:44 crc kubenswrapper[4903]: I1202 23:14:44.043837 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76c98497-bsvrh" Dec 02 23:14:44 crc kubenswrapper[4903]: I1202 23:14:44.044964 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cs6mk" event={"ID":"f72d79d2-cc88-4d82-abb4-c24c823532cb","Type":"ContainerStarted","Data":"830304d843c070d6f5121ce6a744b54e307b12baf42517d5675fb6401dc420d3"} Dec 02 23:14:44 crc kubenswrapper[4903]: I1202 23:14:44.046867 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cf8363b9-18a6-48d7-993e-703ceccc7291","Type":"ContainerStarted","Data":"45de5075c0cb1935854cf147cab7f1ec730b37d308b1da78c7c5cc58ee5a1d65"} Dec 02 23:14:44 crc kubenswrapper[4903]: I1202 23:14:44.047817 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"635dddd5-1a09-4f9e-b82f-e45eee76b412","Type":"ContainerStarted","Data":"52ec99c7675277ae8934d9adb1852a848c520ab893d9441a478cbe154d421bfb"} Dec 02 23:14:44 crc kubenswrapper[4903]: I1202 23:14:44.048914 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-588d85ddfc-j2885" event={"ID":"760b3fce-f12e-425b-8039-2bd47f1e8651","Type":"ContainerDied","Data":"8ed5a7d00174843012de55617a7f8b953260487d95af548ca6e3ed2898b3bc0b"} Dec 02 23:14:44 crc kubenswrapper[4903]: I1202 23:14:44.048942 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-588d85ddfc-j2885" Dec 02 23:14:44 crc kubenswrapper[4903]: I1202 23:14:44.108443 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-588d85ddfc-j2885"] Dec 02 23:14:44 crc kubenswrapper[4903]: I1202 23:14:44.114296 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-588d85ddfc-j2885"] Dec 02 23:14:44 crc kubenswrapper[4903]: I1202 23:14:44.139565 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5df5fc89fc-vv75w"] Dec 02 23:14:44 crc kubenswrapper[4903]: I1202 23:14:44.148489 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5df5fc89fc-vv75w"] Dec 02 23:14:44 crc kubenswrapper[4903]: I1202 23:14:44.160560 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76c98497-bsvrh"] Dec 02 23:14:44 crc kubenswrapper[4903]: I1202 23:14:44.166768 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76c98497-bsvrh"] Dec 02 23:14:45 crc kubenswrapper[4903]: I1202 23:14:45.621711 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e3f2c63-d702-499c-843a-4dac4bc26ee4" path="/var/lib/kubelet/pods/5e3f2c63-d702-499c-843a-4dac4bc26ee4/volumes" Dec 02 23:14:45 crc kubenswrapper[4903]: I1202 23:14:45.622472 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="760b3fce-f12e-425b-8039-2bd47f1e8651" path="/var/lib/kubelet/pods/760b3fce-f12e-425b-8039-2bd47f1e8651/volumes" Dec 02 23:14:45 crc kubenswrapper[4903]: I1202 23:14:45.622817 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f625744-98dc-464f-896f-d9cdf237751e" path="/var/lib/kubelet/pods/8f625744-98dc-464f-896f-d9cdf237751e/volumes" Dec 02 23:14:54 crc kubenswrapper[4903]: E1202 23:14:54.185829 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.2:5001/podified-master-centos10/openstack-ovn-nb-db-server:watcher_latest" Dec 02 23:14:54 crc kubenswrapper[4903]: E1202 23:14:54.186485 4903 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.2:5001/podified-master-centos10/openstack-ovn-nb-db-server:watcher_latest" Dec 02 23:14:54 crc kubenswrapper[4903]: E1202 23:14:54.186722 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:ovsdbserver-nb,Image:38.102.83.2:5001/podified-master-centos10/openstack-ovn-nb-db-server:watcher_latest,Command:[/usr/bin/dumb-init],Args:[/usr/local/bin/container-scripts/setup.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n696h66ch559h8h595hcfh6bh5c5h5c8h64dh589h684h5b8h5h577h6h687h675h598hb5h55dh595h5b8h5f6h695h659h5f9h68chf8h574h57chd5q,ValueFrom:nil,},EnvVar{Name:OVN_LOGDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovndbcluster-nb-etc-ovn,ReadOnly:false,MountPath:/etc/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k67db,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/cleanup.sh],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:20,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
ovsdbserver-nb-0_openstack(635dddd5-1a09-4f9e-b82f-e45eee76b412): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 23:14:56 crc kubenswrapper[4903]: I1202 23:14:56.157817 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65885745f9-jm2v7" event={"ID":"19d3be82-34ff-443d-804a-61dec2277259","Type":"ContainerStarted","Data":"e0833a2a59f736f4ea4ab6952d0c5d4b490ef135befa074e0cec72c85d01fa21"} Dec 02 23:14:56 crc kubenswrapper[4903]: I1202 23:14:56.158973 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-65885745f9-jm2v7" Dec 02 23:14:56 crc kubenswrapper[4903]: I1202 23:14:56.185200 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-65885745f9-jm2v7" podStartSLOduration=-9223372006.669601 podStartE2EDuration="30.185173776s" podCreationTimestamp="2025-12-02 23:14:26 +0000 UTC" firstStartedPulling="2025-12-02 23:14:26.995610259 +0000 UTC m=+1005.704164542" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:14:56.175305203 +0000 UTC m=+1034.883859496" watchObservedRunningTime="2025-12-02 23:14:56.185173776 +0000 UTC m=+1034.893728079" Dec 02 23:14:57 crc kubenswrapper[4903]: E1202 23:14:57.291298 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Dec 02 23:14:57 crc kubenswrapper[4903]: E1202 23:14:57.291814 4903 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Dec 02 23:14:57 crc kubenswrapper[4903]: E1202 23:14:57.292079 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-whl67,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(2372c82e-7656-4307-946c-155ec0d8cb3d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 23:14:57 crc kubenswrapper[4903]: E1202 23:14:57.293332 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="2372c82e-7656-4307-946c-155ec0d8cb3d" Dec 02 23:14:58 crc kubenswrapper[4903]: I1202 23:14:58.177903 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f97ccc87-drwm2" event={"ID":"34aa4a0b-387e-41eb-a104-c08208065d85","Type":"ContainerStarted","Data":"a71e37035a205a07413082706837782a85a6095f4f5015f790df03b2c3b60d5d"} Dec 02 23:14:58 crc kubenswrapper[4903]: E1202 23:14:58.181678 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="2372c82e-7656-4307-946c-155ec0d8cb3d" Dec 02 23:14:58 crc kubenswrapper[4903]: I1202 23:14:58.235589 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f97ccc87-drwm2" podStartSLOduration=23.238498287 podStartE2EDuration="32.23556129s" podCreationTimestamp="2025-12-02 23:14:26 +0000 UTC" firstStartedPulling="2025-12-02 23:14:33.826189008 +0000 UTC m=+1012.534743301" lastFinishedPulling="2025-12-02 23:14:42.823252021 +0000 UTC m=+1021.531806304" observedRunningTime="2025-12-02 23:14:58.230224609 +0000 UTC m=+1036.938778932" watchObservedRunningTime="2025-12-02 23:14:58.23556129 +0000 UTC m=+1036.944115613" Dec 02 23:14:59 crc kubenswrapper[4903]: I1202 23:14:59.189745 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6eaac3fd-8033-42cd-90c3-5dfac716ae66","Type":"ContainerStarted","Data":"7d701473ea42bdbaaea0b04760cc07684c9628aae4cb64cc4e960545b324ebc4"} Dec 02 23:14:59 crc kubenswrapper[4903]: I1202 23:14:59.194366 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"4fdb728a-100d-425d-b83c-245c770afa4b","Type":"ContainerStarted","Data":"b9da9ac86b2db99bdd6c17f26336a676b936e6a5bc639faa31579fdfc05a0351"} Dec 02 23:14:59 crc kubenswrapper[4903]: I1202 23:14:59.194413 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-f97ccc87-drwm2" Dec 02 23:14:59 crc kubenswrapper[4903]: I1202 23:14:59.194431 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 02 23:14:59 crc kubenswrapper[4903]: I1202 23:14:59.260472 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=16.806584906 podStartE2EDuration="28.260443296s" podCreationTimestamp="2025-12-02 23:14:31 +0000 UTC" firstStartedPulling="2025-12-02 23:14:43.216150587 +0000 UTC m=+1021.924704860" lastFinishedPulling="2025-12-02 23:14:54.670008967 +0000 UTC m=+1033.378563250" observedRunningTime="2025-12-02 23:14:59.250223627 +0000 UTC m=+1037.958777950" watchObservedRunningTime="2025-12-02 23:14:59.260443296 +0000 UTC m=+1037.968997619" Dec 02 23:14:59 crc kubenswrapper[4903]: E1202 23:14:59.404042 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovsdbserver-nb-0" podUID="635dddd5-1a09-4f9e-b82f-e45eee76b412" Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.165843 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411955-phxmm"] Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.167282 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411955-phxmm" Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.173193 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.173409 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.176147 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411955-phxmm"] Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.203255 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"635dddd5-1a09-4f9e-b82f-e45eee76b412","Type":"ContainerStarted","Data":"52f4caa598a4c7514df3e0c49a72d70a977f440234e19c98dec6691863348974"} Dec 02 23:15:00 crc kubenswrapper[4903]: E1202 23:15:00.204726 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.2:5001/podified-master-centos10/openstack-ovn-nb-db-server:watcher_latest\\\"\"" pod="openstack/ovsdbserver-nb-0" podUID="635dddd5-1a09-4f9e-b82f-e45eee76b412" Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.205095 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4ab22df5-5c0a-42c6-a881-4529dd331e5f","Type":"ContainerStarted","Data":"beb072a35ff5b619f797112e49862e1f48d3d35791231114258a1a0ae2271a3d"} Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.206765 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lkt78" event={"ID":"d72fba58-af32-4b1a-a883-4e76ec6dc3f4","Type":"ContainerStarted","Data":"06d61d62e3709baf9df45bdc557adfea2085b5a0570ef007ee7b3ebef6f89caf"} Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.206884 4903 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-lkt78" Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.208048 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a3fa7901-a49c-433f-942c-a875c9ecd2ab","Type":"ContainerStarted","Data":"27c41fdd3460acda5e3509c2d4a963a4954f9fbea9888033d4c9fc9cf2b1a3a7"} Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.209292 4903 generic.go:334] "Generic (PLEG): container finished" podID="f72d79d2-cc88-4d82-abb4-c24c823532cb" containerID="420e9a574e1ae30c191780b6996a99d1d36ca7ff492edd160501d5b8b31422a4" exitCode=0 Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.209419 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cs6mk" event={"ID":"f72d79d2-cc88-4d82-abb4-c24c823532cb","Type":"ContainerDied","Data":"420e9a574e1ae30c191780b6996a99d1d36ca7ff492edd160501d5b8b31422a4"} Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.258517 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a754468c-293c-4429-bbcf-3ecd9d1a87ee-secret-volume\") pod \"collect-profiles-29411955-phxmm\" (UID: \"a754468c-293c-4429-bbcf-3ecd9d1a87ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411955-phxmm" Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.258626 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a754468c-293c-4429-bbcf-3ecd9d1a87ee-config-volume\") pod \"collect-profiles-29411955-phxmm\" (UID: \"a754468c-293c-4429-bbcf-3ecd9d1a87ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411955-phxmm" Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.258664 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwpp7\" (UniqueName: \"kubernetes.io/projected/a754468c-293c-4429-bbcf-3ecd9d1a87ee-kube-api-access-xwpp7\") pod \"collect-profiles-29411955-phxmm\" (UID: \"a754468c-293c-4429-bbcf-3ecd9d1a87ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411955-phxmm" Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.267492 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-lkt78" podStartSLOduration=12.841879433999999 podStartE2EDuration="24.267465497s" podCreationTimestamp="2025-12-02 23:14:36 +0000 UTC" firstStartedPulling="2025-12-02 23:14:43.691614926 +0000 UTC m=+1022.400169209" lastFinishedPulling="2025-12-02 23:14:55.117200989 +0000 UTC m=+1033.825755272" observedRunningTime="2025-12-02 23:15:00.261244435 +0000 UTC m=+1038.969798718" watchObservedRunningTime="2025-12-02 23:15:00.267465497 +0000 UTC m=+1038.976019780" Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.362612 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a754468c-293c-4429-bbcf-3ecd9d1a87ee-config-volume\") pod \"collect-profiles-29411955-phxmm\" (UID: \"a754468c-293c-4429-bbcf-3ecd9d1a87ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411955-phxmm" Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.362690 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwpp7\" (UniqueName: 
\"kubernetes.io/projected/a754468c-293c-4429-bbcf-3ecd9d1a87ee-kube-api-access-xwpp7\") pod \"collect-profiles-29411955-phxmm\" (UID: \"a754468c-293c-4429-bbcf-3ecd9d1a87ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411955-phxmm" Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.362960 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a754468c-293c-4429-bbcf-3ecd9d1a87ee-secret-volume\") pod \"collect-profiles-29411955-phxmm\" (UID: \"a754468c-293c-4429-bbcf-3ecd9d1a87ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411955-phxmm" Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.366675 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a754468c-293c-4429-bbcf-3ecd9d1a87ee-config-volume\") pod \"collect-profiles-29411955-phxmm\" (UID: \"a754468c-293c-4429-bbcf-3ecd9d1a87ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411955-phxmm" Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.391827 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a754468c-293c-4429-bbcf-3ecd9d1a87ee-secret-volume\") pod \"collect-profiles-29411955-phxmm\" (UID: \"a754468c-293c-4429-bbcf-3ecd9d1a87ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411955-phxmm" Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.404715 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwpp7\" (UniqueName: \"kubernetes.io/projected/a754468c-293c-4429-bbcf-3ecd9d1a87ee-kube-api-access-xwpp7\") pod \"collect-profiles-29411955-phxmm\" (UID: \"a754468c-293c-4429-bbcf-3ecd9d1a87ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411955-phxmm" Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.404791 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-9fmcf"] Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.405830 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-9fmcf" Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.411990 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.419456 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-9fmcf"] Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.518098 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411955-phxmm"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.560906 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65885745f9-jm2v7"]
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.561113 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-65885745f9-jm2v7" podUID="19d3be82-34ff-443d-804a-61dec2277259" containerName="dnsmasq-dns" containerID="cri-o://e0833a2a59f736f4ea4ab6952d0c5d4b490ef135befa074e0cec72c85d01fa21" gracePeriod=10
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.565447 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d7a25811-66de-4b62-ad27-f01f63f539a1-ovn-rundir\") pod \"ovn-controller-metrics-9fmcf\" (UID: \"d7a25811-66de-4b62-ad27-f01f63f539a1\") " pod="openstack/ovn-controller-metrics-9fmcf"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.565500 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7a25811-66de-4b62-ad27-f01f63f539a1-combined-ca-bundle\") pod \"ovn-controller-metrics-9fmcf\" (UID: \"d7a25811-66de-4b62-ad27-f01f63f539a1\") " pod="openstack/ovn-controller-metrics-9fmcf"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.565526 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7a25811-66de-4b62-ad27-f01f63f539a1-config\") pod \"ovn-controller-metrics-9fmcf\" (UID: \"d7a25811-66de-4b62-ad27-f01f63f539a1\") " pod="openstack/ovn-controller-metrics-9fmcf"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.565613 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jt64\" (UniqueName: \"kubernetes.io/projected/d7a25811-66de-4b62-ad27-f01f63f539a1-kube-api-access-7jt64\") pod \"ovn-controller-metrics-9fmcf\" (UID: \"d7a25811-66de-4b62-ad27-f01f63f539a1\") " pod="openstack/ovn-controller-metrics-9fmcf"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.565707 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d7a25811-66de-4b62-ad27-f01f63f539a1-ovs-rundir\") pod \"ovn-controller-metrics-9fmcf\" (UID: \"d7a25811-66de-4b62-ad27-f01f63f539a1\") " pod="openstack/ovn-controller-metrics-9fmcf"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.565741 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7a25811-66de-4b62-ad27-f01f63f539a1-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-9fmcf\" (UID: \"d7a25811-66de-4b62-ad27-f01f63f539a1\") " pod="openstack/ovn-controller-metrics-9fmcf"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.568670 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-65885745f9-jm2v7"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.610911 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84ccf854ff-lwk9b"]
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.612258 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84ccf854ff-lwk9b"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.614561 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.651808 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84ccf854ff-lwk9b"]
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.674266 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7a25811-66de-4b62-ad27-f01f63f539a1-combined-ca-bundle\") pod \"ovn-controller-metrics-9fmcf\" (UID: \"d7a25811-66de-4b62-ad27-f01f63f539a1\") " pod="openstack/ovn-controller-metrics-9fmcf"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.674334 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7a25811-66de-4b62-ad27-f01f63f539a1-config\") pod \"ovn-controller-metrics-9fmcf\" (UID: \"d7a25811-66de-4b62-ad27-f01f63f539a1\") " pod="openstack/ovn-controller-metrics-9fmcf"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.674375 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jt64\" (UniqueName: \"kubernetes.io/projected/d7a25811-66de-4b62-ad27-f01f63f539a1-kube-api-access-7jt64\") pod \"ovn-controller-metrics-9fmcf\" (UID: \"d7a25811-66de-4b62-ad27-f01f63f539a1\") " pod="openstack/ovn-controller-metrics-9fmcf"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.674403 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d7a25811-66de-4b62-ad27-f01f63f539a1-ovs-rundir\") pod \"ovn-controller-metrics-9fmcf\" (UID: \"d7a25811-66de-4b62-ad27-f01f63f539a1\") " pod="openstack/ovn-controller-metrics-9fmcf"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.674426 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7a25811-66de-4b62-ad27-f01f63f539a1-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-9fmcf\" (UID: \"d7a25811-66de-4b62-ad27-f01f63f539a1\") " pod="openstack/ovn-controller-metrics-9fmcf"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.674529 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d7a25811-66de-4b62-ad27-f01f63f539a1-ovn-rundir\") pod \"ovn-controller-metrics-9fmcf\" (UID: \"d7a25811-66de-4b62-ad27-f01f63f539a1\") " pod="openstack/ovn-controller-metrics-9fmcf"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.674838 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d7a25811-66de-4b62-ad27-f01f63f539a1-ovn-rundir\") pod \"ovn-controller-metrics-9fmcf\" (UID: \"d7a25811-66de-4b62-ad27-f01f63f539a1\") " pod="openstack/ovn-controller-metrics-9fmcf"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.679996 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7a25811-66de-4b62-ad27-f01f63f539a1-config\") pod \"ovn-controller-metrics-9fmcf\" (UID: \"d7a25811-66de-4b62-ad27-f01f63f539a1\") " pod="openstack/ovn-controller-metrics-9fmcf"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.680073 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d7a25811-66de-4b62-ad27-f01f63f539a1-ovs-rundir\") pod \"ovn-controller-metrics-9fmcf\" (UID: \"d7a25811-66de-4b62-ad27-f01f63f539a1\") " pod="openstack/ovn-controller-metrics-9fmcf"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.690382 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7a25811-66de-4b62-ad27-f01f63f539a1-combined-ca-bundle\") pod \"ovn-controller-metrics-9fmcf\" (UID: \"d7a25811-66de-4b62-ad27-f01f63f539a1\") " pod="openstack/ovn-controller-metrics-9fmcf"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.720455 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f97ccc87-drwm2"]
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.774895 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85cb4fb747-l8slh"]
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.775969 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c54bcb1-cb1f-441d-959a-1dd78857a31f-config\") pod \"dnsmasq-dns-84ccf854ff-lwk9b\" (UID: \"4c54bcb1-cb1f-441d-959a-1dd78857a31f\") " pod="openstack/dnsmasq-dns-84ccf854ff-lwk9b"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.776014 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c54bcb1-cb1f-441d-959a-1dd78857a31f-dns-svc\") pod \"dnsmasq-dns-84ccf854ff-lwk9b\" (UID: \"4c54bcb1-cb1f-441d-959a-1dd78857a31f\") " pod="openstack/dnsmasq-dns-84ccf854ff-lwk9b"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.776066 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzwx7\" (UniqueName: \"kubernetes.io/projected/4c54bcb1-cb1f-441d-959a-1dd78857a31f-kube-api-access-bzwx7\") pod \"dnsmasq-dns-84ccf854ff-lwk9b\" (UID: \"4c54bcb1-cb1f-441d-959a-1dd78857a31f\") " pod="openstack/dnsmasq-dns-84ccf854ff-lwk9b"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.776097 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c54bcb1-cb1f-441d-959a-1dd78857a31f-ovsdbserver-nb\") pod \"dnsmasq-dns-84ccf854ff-lwk9b\" (UID: \"4c54bcb1-cb1f-441d-959a-1dd78857a31f\") " pod="openstack/dnsmasq-dns-84ccf854ff-lwk9b"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.776172 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85cb4fb747-l8slh"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.780902 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.849059 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85cb4fb747-l8slh"]
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.877805 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1e1c0de-8117-4f51-8403-c0d56fcd4fa3-config\") pod \"dnsmasq-dns-85cb4fb747-l8slh\" (UID: \"e1e1c0de-8117-4f51-8403-c0d56fcd4fa3\") " pod="openstack/dnsmasq-dns-85cb4fb747-l8slh"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.877933 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrx7x\" (UniqueName: \"kubernetes.io/projected/e1e1c0de-8117-4f51-8403-c0d56fcd4fa3-kube-api-access-hrx7x\") pod \"dnsmasq-dns-85cb4fb747-l8slh\" (UID: \"e1e1c0de-8117-4f51-8403-c0d56fcd4fa3\") " pod="openstack/dnsmasq-dns-85cb4fb747-l8slh"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.878025 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1e1c0de-8117-4f51-8403-c0d56fcd4fa3-ovsdbserver-sb\") pod \"dnsmasq-dns-85cb4fb747-l8slh\" (UID: \"e1e1c0de-8117-4f51-8403-c0d56fcd4fa3\") " pod="openstack/dnsmasq-dns-85cb4fb747-l8slh"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.878044 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1e1c0de-8117-4f51-8403-c0d56fcd4fa3-ovsdbserver-nb\") pod \"dnsmasq-dns-85cb4fb747-l8slh\" (UID: \"e1e1c0de-8117-4f51-8403-c0d56fcd4fa3\") " pod="openstack/dnsmasq-dns-85cb4fb747-l8slh"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.878183 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c54bcb1-cb1f-441d-959a-1dd78857a31f-config\") pod \"dnsmasq-dns-84ccf854ff-lwk9b\" (UID: \"4c54bcb1-cb1f-441d-959a-1dd78857a31f\") " pod="openstack/dnsmasq-dns-84ccf854ff-lwk9b"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.878208 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c54bcb1-cb1f-441d-959a-1dd78857a31f-dns-svc\") pod \"dnsmasq-dns-84ccf854ff-lwk9b\" (UID: \"4c54bcb1-cb1f-441d-959a-1dd78857a31f\") " pod="openstack/dnsmasq-dns-84ccf854ff-lwk9b"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.878960 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c54bcb1-cb1f-441d-959a-1dd78857a31f-config\") pod \"dnsmasq-dns-84ccf854ff-lwk9b\" (UID: \"4c54bcb1-cb1f-441d-959a-1dd78857a31f\") " pod="openstack/dnsmasq-dns-84ccf854ff-lwk9b"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.879030 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1e1c0de-8117-4f51-8403-c0d56fcd4fa3-dns-svc\") pod \"dnsmasq-dns-85cb4fb747-l8slh\" (UID: \"e1e1c0de-8117-4f51-8403-c0d56fcd4fa3\") " pod="openstack/dnsmasq-dns-85cb4fb747-l8slh"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.879607 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzwx7\" (UniqueName: \"kubernetes.io/projected/4c54bcb1-cb1f-441d-959a-1dd78857a31f-kube-api-access-bzwx7\") pod \"dnsmasq-dns-84ccf854ff-lwk9b\" (UID: \"4c54bcb1-cb1f-441d-959a-1dd78857a31f\") " pod="openstack/dnsmasq-dns-84ccf854ff-lwk9b"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.879933 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c54bcb1-cb1f-441d-959a-1dd78857a31f-ovsdbserver-nb\") pod \"dnsmasq-dns-84ccf854ff-lwk9b\" (UID: \"4c54bcb1-cb1f-441d-959a-1dd78857a31f\") " pod="openstack/dnsmasq-dns-84ccf854ff-lwk9b"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.879527 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c54bcb1-cb1f-441d-959a-1dd78857a31f-dns-svc\") pod \"dnsmasq-dns-84ccf854ff-lwk9b\" (UID: \"4c54bcb1-cb1f-441d-959a-1dd78857a31f\") " pod="openstack/dnsmasq-dns-84ccf854ff-lwk9b"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.880489 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c54bcb1-cb1f-441d-959a-1dd78857a31f-ovsdbserver-nb\") pod \"dnsmasq-dns-84ccf854ff-lwk9b\" (UID: \"4c54bcb1-cb1f-441d-959a-1dd78857a31f\") " pod="openstack/dnsmasq-dns-84ccf854ff-lwk9b"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.896137 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzwx7\" (UniqueName: \"kubernetes.io/projected/4c54bcb1-cb1f-441d-959a-1dd78857a31f-kube-api-access-bzwx7\") pod \"dnsmasq-dns-84ccf854ff-lwk9b\" (UID: \"4c54bcb1-cb1f-441d-959a-1dd78857a31f\") " pod="openstack/dnsmasq-dns-84ccf854ff-lwk9b"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.944036 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84ccf854ff-lwk9b"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.946449 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7a25811-66de-4b62-ad27-f01f63f539a1-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-9fmcf\" (UID: \"d7a25811-66de-4b62-ad27-f01f63f539a1\") " pod="openstack/ovn-controller-metrics-9fmcf"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.948689 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jt64\" (UniqueName: \"kubernetes.io/projected/d7a25811-66de-4b62-ad27-f01f63f539a1-kube-api-access-7jt64\") pod \"ovn-controller-metrics-9fmcf\" (UID: \"d7a25811-66de-4b62-ad27-f01f63f539a1\") " pod="openstack/ovn-controller-metrics-9fmcf"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.986270 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1e1c0de-8117-4f51-8403-c0d56fcd4fa3-config\") pod \"dnsmasq-dns-85cb4fb747-l8slh\" (UID: \"e1e1c0de-8117-4f51-8403-c0d56fcd4fa3\") " pod="openstack/dnsmasq-dns-85cb4fb747-l8slh"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.986325 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrx7x\" (UniqueName: \"kubernetes.io/projected/e1e1c0de-8117-4f51-8403-c0d56fcd4fa3-kube-api-access-hrx7x\") pod \"dnsmasq-dns-85cb4fb747-l8slh\" (UID: \"e1e1c0de-8117-4f51-8403-c0d56fcd4fa3\") " pod="openstack/dnsmasq-dns-85cb4fb747-l8slh"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.986363 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1e1c0de-8117-4f51-8403-c0d56fcd4fa3-ovsdbserver-sb\") pod \"dnsmasq-dns-85cb4fb747-l8slh\" (UID: \"e1e1c0de-8117-4f51-8403-c0d56fcd4fa3\") " pod="openstack/dnsmasq-dns-85cb4fb747-l8slh"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.986380 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1e1c0de-8117-4f51-8403-c0d56fcd4fa3-ovsdbserver-nb\") pod \"dnsmasq-dns-85cb4fb747-l8slh\" (UID: \"e1e1c0de-8117-4f51-8403-c0d56fcd4fa3\") " pod="openstack/dnsmasq-dns-85cb4fb747-l8slh"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.986450 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1e1c0de-8117-4f51-8403-c0d56fcd4fa3-dns-svc\") pod \"dnsmasq-dns-85cb4fb747-l8slh\" (UID: \"e1e1c0de-8117-4f51-8403-c0d56fcd4fa3\") " pod="openstack/dnsmasq-dns-85cb4fb747-l8slh"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.987388 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1e1c0de-8117-4f51-8403-c0d56fcd4fa3-dns-svc\") pod \"dnsmasq-dns-85cb4fb747-l8slh\" (UID: \"e1e1c0de-8117-4f51-8403-c0d56fcd4fa3\") " pod="openstack/dnsmasq-dns-85cb4fb747-l8slh"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.988216 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1e1c0de-8117-4f51-8403-c0d56fcd4fa3-config\") pod \"dnsmasq-dns-85cb4fb747-l8slh\" (UID: \"e1e1c0de-8117-4f51-8403-c0d56fcd4fa3\") " pod="openstack/dnsmasq-dns-85cb4fb747-l8slh"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.988462 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1e1c0de-8117-4f51-8403-c0d56fcd4fa3-ovsdbserver-nb\") pod \"dnsmasq-dns-85cb4fb747-l8slh\" (UID: \"e1e1c0de-8117-4f51-8403-c0d56fcd4fa3\") " pod="openstack/dnsmasq-dns-85cb4fb747-l8slh"
Dec 02 23:15:00 crc kubenswrapper[4903]: I1202 23:15:00.989066 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1e1c0de-8117-4f51-8403-c0d56fcd4fa3-ovsdbserver-sb\") pod \"dnsmasq-dns-85cb4fb747-l8slh\" (UID: \"e1e1c0de-8117-4f51-8403-c0d56fcd4fa3\") " pod="openstack/dnsmasq-dns-85cb4fb747-l8slh"
Dec 02 23:15:01 crc kubenswrapper[4903]: I1202 23:15:01.036025 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrx7x\" (UniqueName: \"kubernetes.io/projected/e1e1c0de-8117-4f51-8403-c0d56fcd4fa3-kube-api-access-hrx7x\") pod \"dnsmasq-dns-85cb4fb747-l8slh\" (UID: \"e1e1c0de-8117-4f51-8403-c0d56fcd4fa3\") " pod="openstack/dnsmasq-dns-85cb4fb747-l8slh"
Dec 02 23:15:01 crc kubenswrapper[4903]: I1202 23:15:01.125928 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-9fmcf"
Dec 02 23:15:01 crc kubenswrapper[4903]: I1202 23:15:01.130317 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85cb4fb747-l8slh"
Dec 02 23:15:01 crc kubenswrapper[4903]: I1202 23:15:01.239563 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3afcfb6b-f7ce-424a-be67-3ef69a367fdb","Type":"ContainerStarted","Data":"904179dd08ee62104cd87da2ebd6abc096ad6bbc2f5839cc47a59381afe08e45"}
Dec 02 23:15:01 crc kubenswrapper[4903]: I1202 23:15:01.244208 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4ab22df5-5c0a-42c6-a881-4529dd331e5f","Type":"ContainerStarted","Data":"03eaff799d2771c70f43a432de018b957e399ad1eb9d476a8f0a036b8484ef09"}
Dec 02 23:15:01 crc kubenswrapper[4903]: I1202 23:15:01.255705 4903 generic.go:334] "Generic (PLEG): container finished" podID="19d3be82-34ff-443d-804a-61dec2277259" containerID="e0833a2a59f736f4ea4ab6952d0c5d4b490ef135befa074e0cec72c85d01fa21" exitCode=0
Dec 02 23:15:01 crc kubenswrapper[4903]: I1202 23:15:01.255769 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65885745f9-jm2v7" event={"ID":"19d3be82-34ff-443d-804a-61dec2277259","Type":"ContainerDied","Data":"e0833a2a59f736f4ea4ab6952d0c5d4b490ef135befa074e0cec72c85d01fa21"}
Dec 02 23:15:01 crc kubenswrapper[4903]: I1202 23:15:01.264760 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1743f362-cc56-4c25-a31d-7a78f269f570","Type":"ContainerStarted","Data":"474d15e8fbcfcc157cc2499a2900124e9055f517115b690a5ff37221baaeeef1"}
Dec 02 23:15:01 crc kubenswrapper[4903]: I1202 23:15:01.305769 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=10.157508545 podStartE2EDuration="22.305752419s" podCreationTimestamp="2025-12-02 23:14:39 +0000 UTC" firstStartedPulling="2025-12-02 23:14:43.36055367 +0000 UTC m=+1022.069107953" lastFinishedPulling="2025-12-02 23:14:55.508797534 +0000 UTC m=+1034.217351827" observedRunningTime="2025-12-02 23:15:01.300203284 +0000 UTC m=+1040.008757587" watchObservedRunningTime="2025-12-02 23:15:01.305752419 +0000 UTC m=+1040.014306702"
Dec 02 23:15:01 crc kubenswrapper[4903]: I1202 23:15:01.327109 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cs6mk" event={"ID":"f72d79d2-cc88-4d82-abb4-c24c823532cb","Type":"ContainerStarted","Data":"235b1faade1db47685296922317d6424b824863a212c8cb8d5e0a7898676ccf1"}
Dec 02 23:15:01 crc kubenswrapper[4903]: I1202 23:15:01.344818 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cf8363b9-18a6-48d7-993e-703ceccc7291","Type":"ContainerStarted","Data":"d04b4c2f88b4b14dbd2841fdde15eed0a268d87c661116dbd9409f753cb7f7b7"}
Dec 02 23:15:01 crc kubenswrapper[4903]: I1202 23:15:01.358790 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f97ccc87-drwm2" podUID="34aa4a0b-387e-41eb-a104-c08208065d85" containerName="dnsmasq-dns" containerID="cri-o://a71e37035a205a07413082706837782a85a6095f4f5015f790df03b2c3b60d5d" gracePeriod=10
Dec 02 23:15:01 crc kubenswrapper[4903]: I1202 23:15:01.358882 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"adbb82a2-c30f-4e59-be9c-9274739caf25","Type":"ContainerStarted","Data":"84be4fb0d0225a528b60ed67e1700242f46cd7db3e750ec446115b074ddcb25f"}
Dec 02 23:15:01 crc kubenswrapper[4903]: E1202 23:15:01.358931 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.2:5001/podified-master-centos10/openstack-ovn-nb-db-server:watcher_latest\\\"\"" pod="openstack/ovsdbserver-nb-0" podUID="635dddd5-1a09-4f9e-b82f-e45eee76b412"
Dec 02 23:15:01 crc kubenswrapper[4903]: I1202 23:15:01.477027 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411955-phxmm"]
Dec 02 23:15:01 crc kubenswrapper[4903]: I1202 23:15:01.554710 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65885745f9-jm2v7"
Dec 02 23:15:01 crc kubenswrapper[4903]: I1202 23:15:01.606980 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7qmc\" (UniqueName: \"kubernetes.io/projected/19d3be82-34ff-443d-804a-61dec2277259-kube-api-access-j7qmc\") pod \"19d3be82-34ff-443d-804a-61dec2277259\" (UID: \"19d3be82-34ff-443d-804a-61dec2277259\") "
Dec 02 23:15:01 crc kubenswrapper[4903]: I1202 23:15:01.607283 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19d3be82-34ff-443d-804a-61dec2277259-dns-svc\") pod \"19d3be82-34ff-443d-804a-61dec2277259\" (UID: \"19d3be82-34ff-443d-804a-61dec2277259\") "
Dec 02 23:15:01 crc kubenswrapper[4903]: I1202 23:15:01.607491 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19d3be82-34ff-443d-804a-61dec2277259-config\") pod \"19d3be82-34ff-443d-804a-61dec2277259\" (UID: \"19d3be82-34ff-443d-804a-61dec2277259\") "
Dec 02 23:15:01 crc kubenswrapper[4903]: I1202 23:15:01.613475 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19d3be82-34ff-443d-804a-61dec2277259-kube-api-access-j7qmc" (OuterVolumeSpecName: "kube-api-access-j7qmc") pod "19d3be82-34ff-443d-804a-61dec2277259" (UID: "19d3be82-34ff-443d-804a-61dec2277259"). InnerVolumeSpecName "kube-api-access-j7qmc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 23:15:01 crc kubenswrapper[4903]: I1202 23:15:01.697037 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84ccf854ff-lwk9b"]
Dec 02 23:15:01 crc kubenswrapper[4903]: I1202 23:15:01.703997 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-9fmcf"]
Dec 02 23:15:01 crc kubenswrapper[4903]: I1202 23:15:01.709207 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7qmc\" (UniqueName: \"kubernetes.io/projected/19d3be82-34ff-443d-804a-61dec2277259-kube-api-access-j7qmc\") on node \"crc\" DevicePath \"\""
Dec 02 23:15:01 crc kubenswrapper[4903]: I1202 23:15:01.763302 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19d3be82-34ff-443d-804a-61dec2277259-config" (OuterVolumeSpecName: "config") pod "19d3be82-34ff-443d-804a-61dec2277259" (UID: "19d3be82-34ff-443d-804a-61dec2277259"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 23:15:01 crc kubenswrapper[4903]: I1202 23:15:01.769734 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19d3be82-34ff-443d-804a-61dec2277259-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "19d3be82-34ff-443d-804a-61dec2277259" (UID: "19d3be82-34ff-443d-804a-61dec2277259"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 23:15:01 crc kubenswrapper[4903]: I1202 23:15:01.810681 4903 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19d3be82-34ff-443d-804a-61dec2277259-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 02 23:15:01 crc kubenswrapper[4903]: I1202 23:15:01.810708 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19d3be82-34ff-443d-804a-61dec2277259-config\") on node \"crc\" DevicePath \"\""
Dec 02 23:15:01 crc kubenswrapper[4903]: I1202 23:15:01.847638 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85cb4fb747-l8slh"]
Dec 02 23:15:01 crc kubenswrapper[4903]: W1202 23:15:01.858126 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1e1c0de_8117_4f51_8403_c0d56fcd4fa3.slice/crio-47b9523324a329b9ff14bc4c5d4d8885b2f2e86c0748a7e354c43bb144b9582c WatchSource:0}: Error finding container 47b9523324a329b9ff14bc4c5d4d8885b2f2e86c0748a7e354c43bb144b9582c: Status 404 returned error can't find the container with id 47b9523324a329b9ff14bc4c5d4d8885b2f2e86c0748a7e354c43bb144b9582c
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.079075 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.089081 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f97ccc87-drwm2"
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.142609 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.243954 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34aa4a0b-387e-41eb-a104-c08208065d85-config\") pod \"34aa4a0b-387e-41eb-a104-c08208065d85\" (UID: \"34aa4a0b-387e-41eb-a104-c08208065d85\") "
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.243997 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5n64\" (UniqueName: \"kubernetes.io/projected/34aa4a0b-387e-41eb-a104-c08208065d85-kube-api-access-f5n64\") pod \"34aa4a0b-387e-41eb-a104-c08208065d85\" (UID: \"34aa4a0b-387e-41eb-a104-c08208065d85\") "
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.244043 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34aa4a0b-387e-41eb-a104-c08208065d85-dns-svc\") pod \"34aa4a0b-387e-41eb-a104-c08208065d85\" (UID: \"34aa4a0b-387e-41eb-a104-c08208065d85\") "
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.248767 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34aa4a0b-387e-41eb-a104-c08208065d85-kube-api-access-f5n64" (OuterVolumeSpecName: "kube-api-access-f5n64") pod "34aa4a0b-387e-41eb-a104-c08208065d85" (UID: "34aa4a0b-387e-41eb-a104-c08208065d85"). InnerVolumeSpecName "kube-api-access-f5n64". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.281618 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34aa4a0b-387e-41eb-a104-c08208065d85-config" (OuterVolumeSpecName: "config") pod "34aa4a0b-387e-41eb-a104-c08208065d85" (UID: "34aa4a0b-387e-41eb-a104-c08208065d85"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.295622 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34aa4a0b-387e-41eb-a104-c08208065d85-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "34aa4a0b-387e-41eb-a104-c08208065d85" (UID: "34aa4a0b-387e-41eb-a104-c08208065d85"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.347018 4903 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34aa4a0b-387e-41eb-a104-c08208065d85-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.347610 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34aa4a0b-387e-41eb-a104-c08208065d85-config\") on node \"crc\" DevicePath \"\""
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.347633 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5n64\" (UniqueName: \"kubernetes.io/projected/34aa4a0b-387e-41eb-a104-c08208065d85-kube-api-access-f5n64\") on node \"crc\" DevicePath \"\""
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.368638 4903 generic.go:334] "Generic (PLEG): container finished" podID="4c54bcb1-cb1f-441d-959a-1dd78857a31f" containerID="430ab0affb66c05e809b294645b8c4b2b2d9fbc8ac9866975e0016ec5ec31077" exitCode=0
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.368735 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84ccf854ff-lwk9b" event={"ID":"4c54bcb1-cb1f-441d-959a-1dd78857a31f","Type":"ContainerDied","Data":"430ab0affb66c05e809b294645b8c4b2b2d9fbc8ac9866975e0016ec5ec31077"}
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.368798 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84ccf854ff-lwk9b" event={"ID":"4c54bcb1-cb1f-441d-959a-1dd78857a31f","Type":"ContainerStarted","Data":"b7b2db52d8a9401867c3612076fedfb831c165fb692d1549bb4336ee6e72312c"}
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.372272 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-9fmcf" event={"ID":"d7a25811-66de-4b62-ad27-f01f63f539a1","Type":"ContainerStarted","Data":"f234d319ebb4c9cf0db17d169e5fdee32ef3aae7ab2e34d220eca7d8ff792aba"}
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.372312 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-9fmcf" event={"ID":"d7a25811-66de-4b62-ad27-f01f63f539a1","Type":"ContainerStarted","Data":"1da8b377bdd409e3d33ec8dd7e37dee8f2a4c7f3abcadb7a8a56a0b31fd8e3af"}
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.381704 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cs6mk" event={"ID":"f72d79d2-cc88-4d82-abb4-c24c823532cb","Type":"ContainerStarted","Data":"66e5d82d39bd4b489dc0816ef74fe952dbb6b6a5140213f70273ef2e09577d90"}
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.381811 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-cs6mk"
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.381945 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-cs6mk"
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.385426 4903 generic.go:334] "Generic (PLEG): container finished" podID="34aa4a0b-387e-41eb-a104-c08208065d85" containerID="a71e37035a205a07413082706837782a85a6095f4f5015f790df03b2c3b60d5d" exitCode=0
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.385579 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f97ccc87-drwm2" event={"ID":"34aa4a0b-387e-41eb-a104-c08208065d85","Type":"ContainerDied","Data":"a71e37035a205a07413082706837782a85a6095f4f5015f790df03b2c3b60d5d"}
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.385599 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f97ccc87-drwm2"
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.385621 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f97ccc87-drwm2" event={"ID":"34aa4a0b-387e-41eb-a104-c08208065d85","Type":"ContainerDied","Data":"9f056f15278d0daecb1091aea6a6f63937acdb683440f5a1cea4f31e71b4f017"}
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.385666 4903 scope.go:117] "RemoveContainer" containerID="a71e37035a205a07413082706837782a85a6095f4f5015f790df03b2c3b60d5d"
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.393911 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65885745f9-jm2v7"
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.393771 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65885745f9-jm2v7" event={"ID":"19d3be82-34ff-443d-804a-61dec2277259","Type":"ContainerDied","Data":"a255543f8041ff9dde7ea414dd3d634ae4fd04c6e30a5eef8524208c21fb1281"}
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.397350 4903 generic.go:334] "Generic (PLEG): container finished" podID="e1e1c0de-8117-4f51-8403-c0d56fcd4fa3" containerID="9be57891e0d9105f6536799654cb3b4764356f1de9f8732ac0815651929436f6" exitCode=0
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.397453 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85cb4fb747-l8slh" event={"ID":"e1e1c0de-8117-4f51-8403-c0d56fcd4fa3","Type":"ContainerDied","Data":"9be57891e0d9105f6536799654cb3b4764356f1de9f8732ac0815651929436f6"}
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.397487 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85cb4fb747-l8slh" event={"ID":"e1e1c0de-8117-4f51-8403-c0d56fcd4fa3","Type":"ContainerStarted","Data":"47b9523324a329b9ff14bc4c5d4d8885b2f2e86c0748a7e354c43bb144b9582c"}
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.403953 4903 generic.go:334] "Generic (PLEG): container finished" podID="a754468c-293c-4429-bbcf-3ecd9d1a87ee" containerID="edb9fe5ba3f738d56f330d22bafef890b67b04bcfcc67bee024fa0a03b655ff6" exitCode=0
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.404254 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411955-phxmm" event={"ID":"a754468c-293c-4429-bbcf-3ecd9d1a87ee","Type":"ContainerDied","Data":"edb9fe5ba3f738d56f330d22bafef890b67b04bcfcc67bee024fa0a03b655ff6"}
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.404324 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411955-phxmm" event={"ID":"a754468c-293c-4429-bbcf-3ecd9d1a87ee","Type":"ContainerStarted","Data":"b364919a1d281fdf113bfc67efed6201c1b810173ab541d3801cbbcebc581607"}
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.404716 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.435288 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-9fmcf" podStartSLOduration=2.435261689 podStartE2EDuration="2.435261689s" podCreationTimestamp="2025-12-02 23:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:15:02.410134036 +0000 UTC m=+1041.118688339" watchObservedRunningTime="2025-12-02 23:15:02.435261689 +0000 UTC m=+1041.143815972"
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.467461 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-cs6mk" podStartSLOduration=15.151255822 podStartE2EDuration="26.467415723s" podCreationTimestamp="2025-12-02 23:14:36 +0000 UTC" firstStartedPulling="2025-12-02 23:14:43.800902645 +0000 UTC m=+1022.509456928" lastFinishedPulling="2025-12-02 23:14:55.117062536 +0000 UTC m=+1033.825616829" observedRunningTime="2025-12-02 23:15:02.452642962 +0000 UTC m=+1041.161197245" watchObservedRunningTime="2025-12-02 23:15:02.467415723 +0000 UTC m=+1041.175970006"
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.548870 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f97ccc87-drwm2"]
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.558273 4903 scope.go:117] "RemoveContainer" containerID="141df112c010b010d6ea9489f023bec2559d66fe76270c431e79f5e4678dd142"
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.566091 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f97ccc87-drwm2"]
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.571891 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65885745f9-jm2v7"]
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.578680 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-65885745f9-jm2v7"]
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.582426 4903 scope.go:117] "RemoveContainer" containerID="a71e37035a205a07413082706837782a85a6095f4f5015f790df03b2c3b60d5d"
Dec 02 23:15:02 crc kubenswrapper[4903]: E1202 23:15:02.583907 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a71e37035a205a07413082706837782a85a6095f4f5015f790df03b2c3b60d5d\": container with ID starting with a71e37035a205a07413082706837782a85a6095f4f5015f790df03b2c3b60d5d not found: ID does not exist" containerID="a71e37035a205a07413082706837782a85a6095f4f5015f790df03b2c3b60d5d"
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.583935 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a71e37035a205a07413082706837782a85a6095f4f5015f790df03b2c3b60d5d"} err="failed to get container status \"a71e37035a205a07413082706837782a85a6095f4f5015f790df03b2c3b60d5d\": rpc error: code = NotFound desc = could not find container \"a71e37035a205a07413082706837782a85a6095f4f5015f790df03b2c3b60d5d\": container with ID starting with a71e37035a205a07413082706837782a85a6095f4f5015f790df03b2c3b60d5d not found: ID does not exist"
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.583957 4903 scope.go:117] "RemoveContainer" containerID="141df112c010b010d6ea9489f023bec2559d66fe76270c431e79f5e4678dd142"
Dec 02 23:15:02 crc kubenswrapper[4903]: E1202 23:15:02.585336 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"141df112c010b010d6ea9489f023bec2559d66fe76270c431e79f5e4678dd142\": container with ID starting with 141df112c010b010d6ea9489f023bec2559d66fe76270c431e79f5e4678dd142 not found: ID does not exist" containerID="141df112c010b010d6ea9489f023bec2559d66fe76270c431e79f5e4678dd142"
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.585469 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"141df112c010b010d6ea9489f023bec2559d66fe76270c431e79f5e4678dd142"} err="failed to get container status \"141df112c010b010d6ea9489f023bec2559d66fe76270c431e79f5e4678dd142\": rpc error: code = NotFound desc = could not find container \"141df112c010b010d6ea9489f023bec2559d66fe76270c431e79f5e4678dd142\": container with ID starting with 141df112c010b010d6ea9489f023bec2559d66fe76270c431e79f5e4678dd142 not found: ID does not exist"
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.585551 4903 scope.go:117] "RemoveContainer" containerID="e0833a2a59f736f4ea4ab6952d0c5d4b490ef135befa074e0cec72c85d01fa21"
Dec 02 23:15:02 crc kubenswrapper[4903]: I1202 23:15:02.616487 4903 scope.go:117] "RemoveContainer" containerID="4ca39b7f2ef69cac9e2ed29f956b06f6bac7985a7657297815619ddfc2988ca3"
Dec 02 23:15:03 crc kubenswrapper[4903]: I1202 23:15:03.419282 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85cb4fb747-l8slh" event={"ID":"e1e1c0de-8117-4f51-8403-c0d56fcd4fa3","Type":"ContainerStarted","Data":"00e1d76c64567fb8fc1c376750919e2235f0bbf1302278418138cdc1935355e5"}
Dec 02 23:15:03 crc kubenswrapper[4903]: I1202 23:15:03.419630 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85cb4fb747-l8slh"
Dec 02 23:15:03 crc kubenswrapper[4903]: I1202 23:15:03.422304 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84ccf854ff-lwk9b" event={"ID":"4c54bcb1-cb1f-441d-959a-1dd78857a31f","Type":"ContainerStarted","Data":"017163d9552d79bb653cb026ad353e1ed889efcddb24998a37769b0f3515ba24"}
Dec 02 23:15:03 crc kubenswrapper[4903]: I1202 23:15:03.422456 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84ccf854ff-lwk9b"
Dec 02 23:15:03 crc kubenswrapper[4903]: I1202 23:15:03.461586 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85cb4fb747-l8slh" podStartSLOduration=3.461553699 podStartE2EDuration="3.461553699s" podCreationTimestamp="2025-12-02 23:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:15:03.446753497 +0000 UTC m=+1042.155307770" watchObservedRunningTime="2025-12-02 23:15:03.461553699 +0000 UTC m=+1042.170108022"
Dec 02 23:15:03 crc kubenswrapper[4903]: I1202 23:15:03.473817 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84ccf854ff-lwk9b" podStartSLOduration=3.473800868 podStartE2EDuration="3.473800868s" podCreationTimestamp="2025-12-02 23:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:15:03.470092057 +0000 UTC m=+1042.178646340" watchObservedRunningTime="2025-12-02 23:15:03.473800868 +0000 UTC m=+1042.182355151"
Dec 02 23:15:03 crc kubenswrapper[4903]: I1202 23:15:03.633844 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19d3be82-34ff-443d-804a-61dec2277259" path="/var/lib/kubelet/pods/19d3be82-34ff-443d-804a-61dec2277259/volumes"
Dec 02 23:15:03 crc kubenswrapper[4903]: I1202 23:15:03.635224 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34aa4a0b-387e-41eb-a104-c08208065d85" path="/var/lib/kubelet/pods/34aa4a0b-387e-41eb-a104-c08208065d85/volumes"
Dec 02 23:15:03 crc kubenswrapper[4903]: I1202 23:15:03.781944 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411955-phxmm"
Dec 02 23:15:03 crc kubenswrapper[4903]: I1202 23:15:03.886491 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a754468c-293c-4429-bbcf-3ecd9d1a87ee-secret-volume\") pod \"a754468c-293c-4429-bbcf-3ecd9d1a87ee\" (UID: \"a754468c-293c-4429-bbcf-3ecd9d1a87ee\") "
Dec 02 23:15:03 crc kubenswrapper[4903]: I1202 23:15:03.886567 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwpp7\" (UniqueName: \"kubernetes.io/projected/a754468c-293c-4429-bbcf-3ecd9d1a87ee-kube-api-access-xwpp7\") pod \"a754468c-293c-4429-bbcf-3ecd9d1a87ee\" (UID: \"a754468c-293c-4429-bbcf-3ecd9d1a87ee\") "
Dec 02 23:15:03 crc kubenswrapper[4903]: I1202 23:15:03.886745 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a754468c-293c-4429-bbcf-3ecd9d1a87ee-config-volume\") pod \"a754468c-293c-4429-bbcf-3ecd9d1a87ee\" (UID: \"a754468c-293c-4429-bbcf-3ecd9d1a87ee\") "
Dec 02 23:15:03 crc kubenswrapper[4903]: I1202 23:15:03.887304 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a754468c-293c-4429-bbcf-3ecd9d1a87ee-config-volume" (OuterVolumeSpecName: "config-volume") pod "a754468c-293c-4429-bbcf-3ecd9d1a87ee" (UID: "a754468c-293c-4429-bbcf-3ecd9d1a87ee"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 23:15:03 crc kubenswrapper[4903]: I1202 23:15:03.891438 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a754468c-293c-4429-bbcf-3ecd9d1a87ee-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a754468c-293c-4429-bbcf-3ecd9d1a87ee" (UID: "a754468c-293c-4429-bbcf-3ecd9d1a87ee"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:15:03 crc kubenswrapper[4903]: I1202 23:15:03.891775 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a754468c-293c-4429-bbcf-3ecd9d1a87ee-kube-api-access-xwpp7" (OuterVolumeSpecName: "kube-api-access-xwpp7") pod "a754468c-293c-4429-bbcf-3ecd9d1a87ee" (UID: "a754468c-293c-4429-bbcf-3ecd9d1a87ee"). InnerVolumeSpecName "kube-api-access-xwpp7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 23:15:03 crc kubenswrapper[4903]: I1202 23:15:03.988316 4903 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a754468c-293c-4429-bbcf-3ecd9d1a87ee-config-volume\") on node \"crc\" DevicePath \"\""
Dec 02 23:15:03 crc kubenswrapper[4903]: I1202 23:15:03.988355 4903 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a754468c-293c-4429-bbcf-3ecd9d1a87ee-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 02 23:15:03 crc kubenswrapper[4903]: I1202 23:15:03.988369 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwpp7\" (UniqueName: \"kubernetes.io/projected/a754468c-293c-4429-bbcf-3ecd9d1a87ee-kube-api-access-xwpp7\") on node \"crc\" DevicePath \"\""
Dec 02 23:15:04 crc kubenswrapper[4903]: I1202 23:15:04.437301 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411955-phxmm" event={"ID":"a754468c-293c-4429-bbcf-3ecd9d1a87ee","Type":"ContainerDied","Data":"b364919a1d281fdf113bfc67efed6201c1b810173ab541d3801cbbcebc581607"}
Dec 02 23:15:04 crc kubenswrapper[4903]: I1202 23:15:04.437389 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b364919a1d281fdf113bfc67efed6201c1b810173ab541d3801cbbcebc581607"
Dec 02 23:15:04 crc kubenswrapper[4903]: I1202 23:15:04.437495 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411955-phxmm"
Dec 02 23:15:05 crc kubenswrapper[4903]: I1202 23:15:05.465844 4903 generic.go:334] "Generic (PLEG): container finished" podID="6eaac3fd-8033-42cd-90c3-5dfac716ae66" containerID="7d701473ea42bdbaaea0b04760cc07684c9628aae4cb64cc4e960545b324ebc4" exitCode=0
Dec 02 23:15:05 crc kubenswrapper[4903]: I1202 23:15:05.465962 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6eaac3fd-8033-42cd-90c3-5dfac716ae66","Type":"ContainerDied","Data":"7d701473ea42bdbaaea0b04760cc07684c9628aae4cb64cc4e960545b324ebc4"}
Dec 02 23:15:05 crc kubenswrapper[4903]: I1202 23:15:05.470276 4903 generic.go:334] "Generic (PLEG): container finished" podID="a3fa7901-a49c-433f-942c-a875c9ecd2ab" containerID="27c41fdd3460acda5e3509c2d4a963a4954f9fbea9888033d4c9fc9cf2b1a3a7" exitCode=0
Dec 02 23:15:05 crc kubenswrapper[4903]: I1202 23:15:05.470326 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a3fa7901-a49c-433f-942c-a875c9ecd2ab","Type":"ContainerDied","Data":"27c41fdd3460acda5e3509c2d4a963a4954f9fbea9888033d4c9fc9cf2b1a3a7"}
Dec 02 23:15:06 crc kubenswrapper[4903]: I1202 23:15:06.140150 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Dec 02 23:15:06 crc kubenswrapper[4903]: I1202 23:15:06.479711 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a3fa7901-a49c-433f-942c-a875c9ecd2ab","Type":"ContainerStarted","Data":"d4281b8243e9ff56ccf9f1632f2141495d60ad173258a7f243b158f17074876d"}
Dec 02 23:15:06 crc kubenswrapper[4903]: I1202 23:15:06.481955 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6eaac3fd-8033-42cd-90c3-5dfac716ae66","Type":"ContainerStarted","Data":"792ae545dc61f3073406a39758379631fbb570359a940f7b60258070025d4b20"}
Dec 02 23:15:06 crc kubenswrapper[4903]: I1202 23:15:06.513563 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=24.654247137 podStartE2EDuration="37.513540985s" podCreationTimestamp="2025-12-02 23:14:29 +0000 UTC" firstStartedPulling="2025-12-02 23:14:42.649579708 +0000 UTC m=+1021.358133991" lastFinishedPulling="2025-12-02 23:14:55.508873546 +0000 UTC m=+1034.217427839" observedRunningTime="2025-12-02 23:15:06.506207705 +0000 UTC m=+1045.214761998" watchObservedRunningTime="2025-12-02 23:15:06.513540985 +0000 UTC m=+1045.222095268"
Dec 02 23:15:06 crc kubenswrapper[4903]: I1202 23:15:06.541119 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=26.654328465 podStartE2EDuration="38.541096516s" podCreationTimestamp="2025-12-02 23:14:28 +0000 UTC" firstStartedPulling="2025-12-02 23:14:43.228622384 +0000 UTC m=+1021.937176667" lastFinishedPulling="2025-12-02 23:14:55.115390435 +0000 UTC m=+1033.823944718" observedRunningTime="2025-12-02 23:15:06.534888705 +0000 UTC m=+1045.243442998" watchObservedRunningTime="2025-12-02 23:15:06.541096516 +0000 UTC m=+1045.249650839"
Dec 02 23:15:07 crc kubenswrapper[4903]: I1202 23:15:07.063615 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Dec 02 23:15:08 crc kubenswrapper[4903]: I1202 23:15:08.506095 4903 generic.go:334] "Generic (PLEG): container finished" podID="cf8363b9-18a6-48d7-993e-703ceccc7291" containerID="d04b4c2f88b4b14dbd2841fdde15eed0a268d87c661116dbd9409f753cb7f7b7" exitCode=0
Dec 02 23:15:08 crc kubenswrapper[4903]: I1202 23:15:08.506202 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cf8363b9-18a6-48d7-993e-703ceccc7291","Type":"ContainerDied","Data":"d04b4c2f88b4b14dbd2841fdde15eed0a268d87c661116dbd9409f753cb7f7b7"}
Dec 02 23:15:09 crc kubenswrapper[4903]: I1202 23:15:09.822213 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Dec 02 23:15:09 crc kubenswrapper[4903]: I1202 23:15:09.822272 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Dec 02 23:15:10 crc kubenswrapper[4903]: I1202 23:15:10.523467 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2372c82e-7656-4307-946c-155ec0d8cb3d","Type":"ContainerStarted","Data":"b4ab52ad181c2185f46d97c6f90fcf70aee227ad20e6a21ab39a6eef31041ca6"}
Dec 02 23:15:10 crc kubenswrapper[4903]: I1202 23:15:10.523941 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Dec 02 23:15:10 crc kubenswrapper[4903]: I1202 23:15:10.539872 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=10.767844148 podStartE2EDuration="37.539853563s" podCreationTimestamp="2025-12-02 23:14:33 +0000 UTC" firstStartedPulling="2025-12-02 23:14:43.212740304 +0000 UTC m=+1021.921294587" lastFinishedPulling="2025-12-02 23:15:09.984749709 +0000 UTC m=+1048.693304002" observedRunningTime="2025-12-02 23:15:10.537028154 +0000 UTC m=+1049.245582447" watchObservedRunningTime="2025-12-02 23:15:10.539853563 +0000 UTC m=+1049.248407846"
Dec 02 23:15:10 crc kubenswrapper[4903]: I1202 23:15:10.946980 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84ccf854ff-lwk9b"
Dec 02 23:15:11 crc kubenswrapper[4903]: I1202 23:15:11.131843 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85cb4fb747-l8slh"
Dec 02 23:15:11 crc kubenswrapper[4903]: I1202 23:15:11.195009 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84ccf854ff-lwk9b"]
Dec 02 23:15:11 crc kubenswrapper[4903]: I1202 23:15:11.532005 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84ccf854ff-lwk9b" podUID="4c54bcb1-cb1f-441d-959a-1dd78857a31f" containerName="dnsmasq-dns" containerID="cri-o://017163d9552d79bb653cb026ad353e1ed889efcddb24998a37769b0f3515ba24" gracePeriod=10
Dec 02 23:15:11 crc kubenswrapper[4903]: I1202 23:15:11.867859 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Dec 02 23:15:11 crc kubenswrapper[4903]: I1202 23:15:11.868808 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Dec 02 23:15:12 crc kubenswrapper[4903]: I1202 23:15:12.013213 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Dec 02 23:15:12 crc kubenswrapper[4903]: I1202 23:15:12.032637 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84ccf854ff-lwk9b"
Dec 02 23:15:12 crc kubenswrapper[4903]: I1202 23:15:12.132806 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzwx7\" (UniqueName: \"kubernetes.io/projected/4c54bcb1-cb1f-441d-959a-1dd78857a31f-kube-api-access-bzwx7\") pod \"4c54bcb1-cb1f-441d-959a-1dd78857a31f\" (UID: \"4c54bcb1-cb1f-441d-959a-1dd78857a31f\") "
Dec 02 23:15:12 crc kubenswrapper[4903]: I1202 23:15:12.132882 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c54bcb1-cb1f-441d-959a-1dd78857a31f-ovsdbserver-nb\") pod \"4c54bcb1-cb1f-441d-959a-1dd78857a31f\" (UID: \"4c54bcb1-cb1f-441d-959a-1dd78857a31f\") "
Dec 02 23:15:12 crc kubenswrapper[4903]: I1202 23:15:12.132952 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c54bcb1-cb1f-441d-959a-1dd78857a31f-dns-svc\") pod \"4c54bcb1-cb1f-441d-959a-1dd78857a31f\" (UID: \"4c54bcb1-cb1f-441d-959a-1dd78857a31f\") "
Dec 02 23:15:12 crc kubenswrapper[4903]: I1202 23:15:12.133025 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c54bcb1-cb1f-441d-959a-1dd78857a31f-config\") pod \"4c54bcb1-cb1f-441d-959a-1dd78857a31f\" (UID: \"4c54bcb1-cb1f-441d-959a-1dd78857a31f\") "
Dec 02 23:15:12 crc kubenswrapper[4903]: I1202 23:15:12.139424 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c54bcb1-cb1f-441d-959a-1dd78857a31f-kube-api-access-bzwx7" (OuterVolumeSpecName: "kube-api-access-bzwx7") pod "4c54bcb1-cb1f-441d-959a-1dd78857a31f" (UID: "4c54bcb1-cb1f-441d-959a-1dd78857a31f"). InnerVolumeSpecName "kube-api-access-bzwx7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 23:15:12 crc kubenswrapper[4903]: I1202 23:15:12.145233 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Dec 02 23:15:12 crc kubenswrapper[4903]: I1202 23:15:12.182675 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c54bcb1-cb1f-441d-959a-1dd78857a31f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4c54bcb1-cb1f-441d-959a-1dd78857a31f" (UID: "4c54bcb1-cb1f-441d-959a-1dd78857a31f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 23:15:12 crc kubenswrapper[4903]: I1202 23:15:12.189447 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c54bcb1-cb1f-441d-959a-1dd78857a31f-config" (OuterVolumeSpecName: "config") pod "4c54bcb1-cb1f-441d-959a-1dd78857a31f" (UID: "4c54bcb1-cb1f-441d-959a-1dd78857a31f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 23:15:12 crc kubenswrapper[4903]: I1202 23:15:12.209002 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c54bcb1-cb1f-441d-959a-1dd78857a31f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4c54bcb1-cb1f-441d-959a-1dd78857a31f" (UID: "4c54bcb1-cb1f-441d-959a-1dd78857a31f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 23:15:12 crc kubenswrapper[4903]: I1202 23:15:12.236094 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzwx7\" (UniqueName: \"kubernetes.io/projected/4c54bcb1-cb1f-441d-959a-1dd78857a31f-kube-api-access-bzwx7\") on node \"crc\" DevicePath \"\""
Dec 02 23:15:12 crc kubenswrapper[4903]: I1202 23:15:12.236122 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c54bcb1-cb1f-441d-959a-1dd78857a31f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 02 23:15:12 crc kubenswrapper[4903]: I1202 23:15:12.236132 4903 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c54bcb1-cb1f-441d-959a-1dd78857a31f-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 02 23:15:12 crc kubenswrapper[4903]: I1202 23:15:12.236142 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c54bcb1-cb1f-441d-959a-1dd78857a31f-config\") on node \"crc\" DevicePath \"\""
Dec 02 23:15:12 crc kubenswrapper[4903]: I1202 23:15:12.539339 4903 generic.go:334] "Generic (PLEG): container finished" podID="4c54bcb1-cb1f-441d-959a-1dd78857a31f" containerID="017163d9552d79bb653cb026ad353e1ed889efcddb24998a37769b0f3515ba24" exitCode=0
Dec 02 23:15:12 crc kubenswrapper[4903]: I1202 23:15:12.539401 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84ccf854ff-lwk9b" event={"ID":"4c54bcb1-cb1f-441d-959a-1dd78857a31f","Type":"ContainerDied","Data":"017163d9552d79bb653cb026ad353e1ed889efcddb24998a37769b0f3515ba24"}
Dec 02 23:15:12 crc kubenswrapper[4903]: I1202 23:15:12.539482 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84ccf854ff-lwk9b" event={"ID":"4c54bcb1-cb1f-441d-959a-1dd78857a31f","Type":"ContainerDied","Data":"b7b2db52d8a9401867c3612076fedfb831c165fb692d1549bb4336ee6e72312c"}
Dec 02 23:15:12 crc kubenswrapper[4903]: I1202 23:15:12.539504 4903 scope.go:117] "RemoveContainer" containerID="017163d9552d79bb653cb026ad353e1ed889efcddb24998a37769b0f3515ba24"
Dec 02 23:15:12 crc kubenswrapper[4903]: I1202 23:15:12.539435 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84ccf854ff-lwk9b"
Dec 02 23:15:12 crc kubenswrapper[4903]: I1202 23:15:12.570107 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84ccf854ff-lwk9b"]
Dec 02 23:15:12 crc kubenswrapper[4903]: I1202 23:15:12.575073 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84ccf854ff-lwk9b"]
Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.439049 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-554c7689cc-ngxvh"]
Dec 02 23:15:13 crc kubenswrapper[4903]: E1202 23:15:13.441105 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d3be82-34ff-443d-804a-61dec2277259" containerName="dnsmasq-dns"
Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.441195 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d3be82-34ff-443d-804a-61dec2277259" containerName="dnsmasq-dns"
Dec 02 23:15:13 crc kubenswrapper[4903]: E1202 23:15:13.441289 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d3be82-34ff-443d-804a-61dec2277259" containerName="init"
Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.441352 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d3be82-34ff-443d-804a-61dec2277259" containerName="init"
Dec 02 23:15:13 crc kubenswrapper[4903]: E1202 23:15:13.441438 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c54bcb1-cb1f-441d-959a-1dd78857a31f" containerName="init"
Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.441494 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c54bcb1-cb1f-441d-959a-1dd78857a31f" containerName="init"
Dec 02 23:15:13 crc kubenswrapper[4903]: E1202 23:15:13.441551 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34aa4a0b-387e-41eb-a104-c08208065d85" containerName="dnsmasq-dns"
Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.441606 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="34aa4a0b-387e-41eb-a104-c08208065d85" containerName="dnsmasq-dns"
Dec 02 23:15:13 crc kubenswrapper[4903]: E1202 23:15:13.441705 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c54bcb1-cb1f-441d-959a-1dd78857a31f" containerName="dnsmasq-dns"
Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.441770 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c54bcb1-cb1f-441d-959a-1dd78857a31f" containerName="dnsmasq-dns"
Dec 02 23:15:13 crc kubenswrapper[4903]: E1202 23:15:13.441841 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34aa4a0b-387e-41eb-a104-c08208065d85" containerName="init"
Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.441900 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="34aa4a0b-387e-41eb-a104-c08208065d85" containerName="init"
Dec 02 23:15:13 crc kubenswrapper[4903]: E1202 23:15:13.441972 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a754468c-293c-4429-bbcf-3ecd9d1a87ee" containerName="collect-profiles"
Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.442027 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="a754468c-293c-4429-bbcf-3ecd9d1a87ee" containerName="collect-profiles"
Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.442259 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="34aa4a0b-387e-41eb-a104-c08208065d85" containerName="dnsmasq-dns"
Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.442333 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="19d3be82-34ff-443d-804a-61dec2277259" containerName="dnsmasq-dns"
Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.442400 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="a754468c-293c-4429-bbcf-3ecd9d1a87ee" containerName="collect-profiles"
Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.442461 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c54bcb1-cb1f-441d-959a-1dd78857a31f" containerName="dnsmasq-dns"
Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.443814 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-554c7689cc-ngxvh"
Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.460347 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-554c7689cc-ngxvh"]
Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.563417 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5621cca-33f2-4de9-a39a-aca977548db7-ovsdbserver-sb\") pod \"dnsmasq-dns-554c7689cc-ngxvh\" (UID: \"c5621cca-33f2-4de9-a39a-aca977548db7\") " pod="openstack/dnsmasq-dns-554c7689cc-ngxvh"
Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.563487 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgs5q\" (UniqueName: \"kubernetes.io/projected/c5621cca-33f2-4de9-a39a-aca977548db7-kube-api-access-lgs5q\") pod \"dnsmasq-dns-554c7689cc-ngxvh\" (UID: \"c5621cca-33f2-4de9-a39a-aca977548db7\") " pod="openstack/dnsmasq-dns-554c7689cc-ngxvh"
Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.563516 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5621cca-33f2-4de9-a39a-aca977548db7-ovsdbserver-nb\") pod \"dnsmasq-dns-554c7689cc-ngxvh\" (UID: \"c5621cca-33f2-4de9-a39a-aca977548db7\") " pod="openstack/dnsmasq-dns-554c7689cc-ngxvh"
Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.563553 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5621cca-33f2-4de9-a39a-aca977548db7-config\") pod \"dnsmasq-dns-554c7689cc-ngxvh\" (UID: \"c5621cca-33f2-4de9-a39a-aca977548db7\") " pod="openstack/dnsmasq-dns-554c7689cc-ngxvh"
Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.563614 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5621cca-33f2-4de9-a39a-aca977548db7-dns-svc\") pod \"dnsmasq-dns-554c7689cc-ngxvh\" (UID: \"c5621cca-33f2-4de9-a39a-aca977548db7\") " pod="openstack/dnsmasq-dns-554c7689cc-ngxvh"
Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.627097 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c54bcb1-cb1f-441d-959a-1dd78857a31f" path="/var/lib/kubelet/pods/4c54bcb1-cb1f-441d-959a-1dd78857a31f/volumes"
Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.658946 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-create-mck9h"]
Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.660466 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-mck9h"
Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.664554 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-6742-account-create-update-8snbz"]
Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.665637 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5621cca-33f2-4de9-a39a-aca977548db7-ovsdbserver-sb\") pod \"dnsmasq-dns-554c7689cc-ngxvh\" (UID: \"c5621cca-33f2-4de9-a39a-aca977548db7\") " pod="openstack/dnsmasq-dns-554c7689cc-ngxvh"
Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.665780 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgs5q\" (UniqueName: \"kubernetes.io/projected/c5621cca-33f2-4de9-a39a-aca977548db7-kube-api-access-lgs5q\") pod \"dnsmasq-dns-554c7689cc-ngxvh\" (UID: \"c5621cca-33f2-4de9-a39a-aca977548db7\") " pod="openstack/dnsmasq-dns-554c7689cc-ngxvh"
Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.665859 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5621cca-33f2-4de9-a39a-aca977548db7-ovsdbserver-nb\") pod \"dnsmasq-dns-554c7689cc-ngxvh\" (UID: \"c5621cca-33f2-4de9-a39a-aca977548db7\") " pod="openstack/dnsmasq-dns-554c7689cc-ngxvh"
Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.665884 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-6742-account-create-update-8snbz"
Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.665895 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5621cca-33f2-4de9-a39a-aca977548db7-config\") pod \"dnsmasq-dns-554c7689cc-ngxvh\" (UID: \"c5621cca-33f2-4de9-a39a-aca977548db7\") " pod="openstack/dnsmasq-dns-554c7689cc-ngxvh"
Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.665938 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5621cca-33f2-4de9-a39a-aca977548db7-dns-svc\") pod \"dnsmasq-dns-554c7689cc-ngxvh\" (UID: \"c5621cca-33f2-4de9-a39a-aca977548db7\") " pod="openstack/dnsmasq-dns-554c7689cc-ngxvh"
Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.666906 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5621cca-33f2-4de9-a39a-aca977548db7-ovsdbserver-nb\") pod \"dnsmasq-dns-554c7689cc-ngxvh\" (UID: \"c5621cca-33f2-4de9-a39a-aca977548db7\") " pod="openstack/dnsmasq-dns-554c7689cc-ngxvh"
Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.667449 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5621cca-33f2-4de9-a39a-aca977548db7-config\") pod \"dnsmasq-dns-554c7689cc-ngxvh\" (UID: \"c5621cca-33f2-4de9-a39a-aca977548db7\") " pod="openstack/dnsmasq-dns-554c7689cc-ngxvh"
Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.667464 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5621cca-33f2-4de9-a39a-aca977548db7-ovsdbserver-sb\") pod \"dnsmasq-dns-554c7689cc-ngxvh\" (UID: \"c5621cca-33f2-4de9-a39a-aca977548db7\") " pod="openstack/dnsmasq-dns-554c7689cc-ngxvh"
Dec 02 
23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.667513 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5621cca-33f2-4de9-a39a-aca977548db7-dns-svc\") pod \"dnsmasq-dns-554c7689cc-ngxvh\" (UID: \"c5621cca-33f2-4de9-a39a-aca977548db7\") " pod="openstack/dnsmasq-dns-554c7689cc-ngxvh" Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.669831 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-db-secret" Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.676720 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-6742-account-create-update-8snbz"] Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.689262 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgs5q\" (UniqueName: \"kubernetes.io/projected/c5621cca-33f2-4de9-a39a-aca977548db7-kube-api-access-lgs5q\") pod \"dnsmasq-dns-554c7689cc-ngxvh\" (UID: \"c5621cca-33f2-4de9-a39a-aca977548db7\") " pod="openstack/dnsmasq-dns-554c7689cc-ngxvh" Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.697164 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-mck9h"] Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.762002 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-554c7689cc-ngxvh" Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.767947 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebba9302-4b1e-4073-83f4-505b43e2309c-operator-scripts\") pod \"watcher-6742-account-create-update-8snbz\" (UID: \"ebba9302-4b1e-4073-83f4-505b43e2309c\") " pod="openstack/watcher-6742-account-create-update-8snbz" Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.768005 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fgs8\" (UniqueName: \"kubernetes.io/projected/435a1343-d272-407f-9329-a6d1f481a22a-kube-api-access-8fgs8\") pod \"watcher-db-create-mck9h\" (UID: \"435a1343-d272-407f-9329-a6d1f481a22a\") " pod="openstack/watcher-db-create-mck9h" Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.768058 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfg7z\" (UniqueName: \"kubernetes.io/projected/ebba9302-4b1e-4073-83f4-505b43e2309c-kube-api-access-xfg7z\") pod \"watcher-6742-account-create-update-8snbz\" (UID: \"ebba9302-4b1e-4073-83f4-505b43e2309c\") " pod="openstack/watcher-6742-account-create-update-8snbz" Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.768080 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/435a1343-d272-407f-9329-a6d1f481a22a-operator-scripts\") pod \"watcher-db-create-mck9h\" (UID: \"435a1343-d272-407f-9329-a6d1f481a22a\") " pod="openstack/watcher-db-create-mck9h" Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.870256 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebba9302-4b1e-4073-83f4-505b43e2309c-operator-scripts\") pod \"watcher-6742-account-create-update-8snbz\" (UID: \"ebba9302-4b1e-4073-83f4-505b43e2309c\") " pod="openstack/watcher-6742-account-create-update-8snbz" Dec 
02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.870339 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fgs8\" (UniqueName: \"kubernetes.io/projected/435a1343-d272-407f-9329-a6d1f481a22a-kube-api-access-8fgs8\") pod \"watcher-db-create-mck9h\" (UID: \"435a1343-d272-407f-9329-a6d1f481a22a\") " pod="openstack/watcher-db-create-mck9h" Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.870376 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfg7z\" (UniqueName: \"kubernetes.io/projected/ebba9302-4b1e-4073-83f4-505b43e2309c-kube-api-access-xfg7z\") pod \"watcher-6742-account-create-update-8snbz\" (UID: \"ebba9302-4b1e-4073-83f4-505b43e2309c\") " pod="openstack/watcher-6742-account-create-update-8snbz" Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.870411 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/435a1343-d272-407f-9329-a6d1f481a22a-operator-scripts\") pod \"watcher-db-create-mck9h\" (UID: \"435a1343-d272-407f-9329-a6d1f481a22a\") " pod="openstack/watcher-db-create-mck9h" Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.871478 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/435a1343-d272-407f-9329-a6d1f481a22a-operator-scripts\") pod \"watcher-db-create-mck9h\" (UID: \"435a1343-d272-407f-9329-a6d1f481a22a\") " pod="openstack/watcher-db-create-mck9h" Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.872188 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebba9302-4b1e-4073-83f4-505b43e2309c-operator-scripts\") pod \"watcher-6742-account-create-update-8snbz\" (UID: \"ebba9302-4b1e-4073-83f4-505b43e2309c\") " pod="openstack/watcher-6742-account-create-update-8snbz" Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.890616 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fgs8\" (UniqueName: \"kubernetes.io/projected/435a1343-d272-407f-9329-a6d1f481a22a-kube-api-access-8fgs8\") pod \"watcher-db-create-mck9h\" (UID: \"435a1343-d272-407f-9329-a6d1f481a22a\") " pod="openstack/watcher-db-create-mck9h" Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.904803 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfg7z\" (UniqueName: \"kubernetes.io/projected/ebba9302-4b1e-4073-83f4-505b43e2309c-kube-api-access-xfg7z\") pod \"watcher-6742-account-create-update-8snbz\" (UID: \"ebba9302-4b1e-4073-83f4-505b43e2309c\") " pod="openstack/watcher-6742-account-create-update-8snbz" Dec 02 23:15:13 crc kubenswrapper[4903]: I1202 23:15:13.979110 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-mck9h" Dec 02 23:15:14 crc kubenswrapper[4903]: I1202 23:15:14.023583 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-6742-account-create-update-8snbz" Dec 02 23:15:14 crc kubenswrapper[4903]: I1202 23:15:14.066870 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 02 23:15:14 crc kubenswrapper[4903]: I1202 23:15:14.193569 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 02 23:15:14 crc kubenswrapper[4903]: I1202 23:15:14.542047 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 02 23:15:14 crc kubenswrapper[4903]: I1202 23:15:14.548404 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 02 23:15:14 crc kubenswrapper[4903]: I1202 23:15:14.552549 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 02 23:15:14 crc kubenswrapper[4903]: I1202 23:15:14.552934 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 02 23:15:14 crc kubenswrapper[4903]: I1202 23:15:14.552970 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-h2m7p" Dec 02 23:15:14 crc kubenswrapper[4903]: I1202 23:15:14.552979 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 02 23:15:14 crc kubenswrapper[4903]: I1202 23:15:14.579638 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 02 23:15:14 crc kubenswrapper[4903]: I1202 23:15:14.584676 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/55e0eb4b-69b7-4845-84aa-77dae4384f32-etc-swift\") pod \"swift-storage-0\" (UID: \"55e0eb4b-69b7-4845-84aa-77dae4384f32\") " pod="openstack/swift-storage-0" Dec 02 23:15:14 crc kubenswrapper[4903]: I1202 23:15:14.584744 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/55e0eb4b-69b7-4845-84aa-77dae4384f32-cache\") pod \"swift-storage-0\" (UID: \"55e0eb4b-69b7-4845-84aa-77dae4384f32\") " pod="openstack/swift-storage-0" Dec 02 23:15:14 crc kubenswrapper[4903]: I1202 23:15:14.584933 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"55e0eb4b-69b7-4845-84aa-77dae4384f32\") " pod="openstack/swift-storage-0" Dec 02 23:15:14 crc kubenswrapper[4903]: I1202 23:15:14.585042 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/55e0eb4b-69b7-4845-84aa-77dae4384f32-lock\") pod \"swift-storage-0\" (UID: \"55e0eb4b-69b7-4845-84aa-77dae4384f32\") " pod="openstack/swift-storage-0" Dec 02 23:15:14 crc kubenswrapper[4903]: I1202 23:15:14.585104 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg55q\" (UniqueName: \"kubernetes.io/projected/55e0eb4b-69b7-4845-84aa-77dae4384f32-kube-api-access-sg55q\") pod \"swift-storage-0\" (UID: \"55e0eb4b-69b7-4845-84aa-77dae4384f32\") " pod="openstack/swift-storage-0" Dec 02 23:15:14 crc kubenswrapper[4903]: I1202 23:15:14.691103 4903 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/55e0eb4b-69b7-4845-84aa-77dae4384f32-cache\") pod \"swift-storage-0\" (UID: \"55e0eb4b-69b7-4845-84aa-77dae4384f32\") " pod="openstack/swift-storage-0" Dec 02 23:15:14 crc kubenswrapper[4903]: I1202 23:15:14.691226 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"55e0eb4b-69b7-4845-84aa-77dae4384f32\") " pod="openstack/swift-storage-0" Dec 02 23:15:14 crc kubenswrapper[4903]: I1202 23:15:14.691304 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/55e0eb4b-69b7-4845-84aa-77dae4384f32-lock\") pod \"swift-storage-0\" (UID: \"55e0eb4b-69b7-4845-84aa-77dae4384f32\") " pod="openstack/swift-storage-0" Dec 02 23:15:14 crc kubenswrapper[4903]: I1202 23:15:14.691355 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg55q\" (UniqueName: \"kubernetes.io/projected/55e0eb4b-69b7-4845-84aa-77dae4384f32-kube-api-access-sg55q\") pod \"swift-storage-0\" (UID: \"55e0eb4b-69b7-4845-84aa-77dae4384f32\") " pod="openstack/swift-storage-0" Dec 02 23:15:14 crc kubenswrapper[4903]: I1202 23:15:14.691567 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/55e0eb4b-69b7-4845-84aa-77dae4384f32-etc-swift\") pod \"swift-storage-0\" (UID: \"55e0eb4b-69b7-4845-84aa-77dae4384f32\") " pod="openstack/swift-storage-0" Dec 02 23:15:14 crc kubenswrapper[4903]: I1202 23:15:14.692904 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/55e0eb4b-69b7-4845-84aa-77dae4384f32-cache\") pod \"swift-storage-0\" (UID: \"55e0eb4b-69b7-4845-84aa-77dae4384f32\") " pod="openstack/swift-storage-0" Dec 02 23:15:14 crc kubenswrapper[4903]: I1202 23:15:14.693724 4903 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"55e0eb4b-69b7-4845-84aa-77dae4384f32\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/swift-storage-0" Dec 02 23:15:14 crc kubenswrapper[4903]: E1202 23:15:14.697923 4903 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 23:15:14 crc kubenswrapper[4903]: E1202 23:15:14.698031 4903 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 23:15:14 crc kubenswrapper[4903]: E1202 23:15:14.698164 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/55e0eb4b-69b7-4845-84aa-77dae4384f32-etc-swift podName:55e0eb4b-69b7-4845-84aa-77dae4384f32 nodeName:}" failed. No retries permitted until 2025-12-02 23:15:15.198141033 +0000 UTC m=+1053.906695316 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/55e0eb4b-69b7-4845-84aa-77dae4384f32-etc-swift") pod "swift-storage-0" (UID: "55e0eb4b-69b7-4845-84aa-77dae4384f32") : configmap "swift-ring-files" not found Dec 02 23:15:14 crc kubenswrapper[4903]: I1202 23:15:14.698355 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/55e0eb4b-69b7-4845-84aa-77dae4384f32-lock\") pod \"swift-storage-0\" (UID: \"55e0eb4b-69b7-4845-84aa-77dae4384f32\") " pod="openstack/swift-storage-0" Dec 02 23:15:14 crc kubenswrapper[4903]: I1202 23:15:14.720759 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg55q\" (UniqueName: \"kubernetes.io/projected/55e0eb4b-69b7-4845-84aa-77dae4384f32-kube-api-access-sg55q\") pod \"swift-storage-0\" (UID: \"55e0eb4b-69b7-4845-84aa-77dae4384f32\") " pod="openstack/swift-storage-0" Dec 02 23:15:14 crc kubenswrapper[4903]: I1202 23:15:14.736039 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"55e0eb4b-69b7-4845-84aa-77dae4384f32\") " pod="openstack/swift-storage-0" Dec 02 23:15:15 crc kubenswrapper[4903]: I1202 23:15:15.116006 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-gkjrc"] Dec 02 23:15:15 crc kubenswrapper[4903]: I1202 23:15:15.117418 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-gkjrc" Dec 02 23:15:15 crc kubenswrapper[4903]: I1202 23:15:15.124697 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 02 23:15:15 crc kubenswrapper[4903]: I1202 23:15:15.124976 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 02 23:15:15 crc kubenswrapper[4903]: I1202 23:15:15.125641 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 02 23:15:15 crc kubenswrapper[4903]: I1202 23:15:15.133771 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-gkjrc"] Dec 02 23:15:15 crc kubenswrapper[4903]: I1202 23:15:15.200493 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm8h6\" (UniqueName: \"kubernetes.io/projected/f16a381a-80d3-4a60-be1b-e782dab1c73c-kube-api-access-qm8h6\") pod \"swift-ring-rebalance-gkjrc\" (UID: \"f16a381a-80d3-4a60-be1b-e782dab1c73c\") " pod="openstack/swift-ring-rebalance-gkjrc" Dec 02 23:15:15 crc kubenswrapper[4903]: I1202 23:15:15.200567 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f16a381a-80d3-4a60-be1b-e782dab1c73c-swiftconf\") pod \"swift-ring-rebalance-gkjrc\" (UID: \"f16a381a-80d3-4a60-be1b-e782dab1c73c\") " pod="openstack/swift-ring-rebalance-gkjrc" Dec 02 23:15:15 crc kubenswrapper[4903]: I1202 23:15:15.200628 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f16a381a-80d3-4a60-be1b-e782dab1c73c-combined-ca-bundle\") pod \"swift-ring-rebalance-gkjrc\" (UID: \"f16a381a-80d3-4a60-be1b-e782dab1c73c\") " pod="openstack/swift-ring-rebalance-gkjrc" Dec 02 23:15:15 crc kubenswrapper[4903]: 
I1202 23:15:15.200680 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f16a381a-80d3-4a60-be1b-e782dab1c73c-dispersionconf\") pod \"swift-ring-rebalance-gkjrc\" (UID: \"f16a381a-80d3-4a60-be1b-e782dab1c73c\") " pod="openstack/swift-ring-rebalance-gkjrc" Dec 02 23:15:15 crc kubenswrapper[4903]: I1202 23:15:15.200739 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/55e0eb4b-69b7-4845-84aa-77dae4384f32-etc-swift\") pod \"swift-storage-0\" (UID: \"55e0eb4b-69b7-4845-84aa-77dae4384f32\") " pod="openstack/swift-storage-0" Dec 02 23:15:15 crc kubenswrapper[4903]: I1202 23:15:15.200777 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f16a381a-80d3-4a60-be1b-e782dab1c73c-etc-swift\") pod \"swift-ring-rebalance-gkjrc\" (UID: \"f16a381a-80d3-4a60-be1b-e782dab1c73c\") " pod="openstack/swift-ring-rebalance-gkjrc" Dec 02 23:15:15 crc kubenswrapper[4903]: I1202 23:15:15.200831 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f16a381a-80d3-4a60-be1b-e782dab1c73c-scripts\") pod \"swift-ring-rebalance-gkjrc\" (UID: \"f16a381a-80d3-4a60-be1b-e782dab1c73c\") " pod="openstack/swift-ring-rebalance-gkjrc" Dec 02 23:15:15 crc kubenswrapper[4903]: I1202 23:15:15.200907 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f16a381a-80d3-4a60-be1b-e782dab1c73c-ring-data-devices\") pod \"swift-ring-rebalance-gkjrc\" (UID: \"f16a381a-80d3-4a60-be1b-e782dab1c73c\") " pod="openstack/swift-ring-rebalance-gkjrc" Dec 02 23:15:15 crc kubenswrapper[4903]: E1202 23:15:15.201095 4903 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 23:15:15 crc kubenswrapper[4903]: E1202 23:15:15.201121 4903 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 23:15:15 crc kubenswrapper[4903]: E1202 23:15:15.201203 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/55e0eb4b-69b7-4845-84aa-77dae4384f32-etc-swift podName:55e0eb4b-69b7-4845-84aa-77dae4384f32 nodeName:}" failed. No retries permitted until 2025-12-02 23:15:16.201150626 +0000 UTC m=+1054.909704909 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/55e0eb4b-69b7-4845-84aa-77dae4384f32-etc-swift") pod "swift-storage-0" (UID: "55e0eb4b-69b7-4845-84aa-77dae4384f32") : configmap "swift-ring-files" not found Dec 02 23:15:15 crc kubenswrapper[4903]: I1202 23:15:15.302355 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f16a381a-80d3-4a60-be1b-e782dab1c73c-etc-swift\") pod \"swift-ring-rebalance-gkjrc\" (UID: \"f16a381a-80d3-4a60-be1b-e782dab1c73c\") " pod="openstack/swift-ring-rebalance-gkjrc" Dec 02 23:15:15 crc kubenswrapper[4903]: I1202 23:15:15.302431 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f16a381a-80d3-4a60-be1b-e782dab1c73c-scripts\") pod \"swift-ring-rebalance-gkjrc\" (UID: \"f16a381a-80d3-4a60-be1b-e782dab1c73c\") " pod="openstack/swift-ring-rebalance-gkjrc" Dec 02 23:15:15 crc kubenswrapper[4903]: I1202 23:15:15.302482 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f16a381a-80d3-4a60-be1b-e782dab1c73c-ring-data-devices\") pod \"swift-ring-rebalance-gkjrc\" (UID: \"f16a381a-80d3-4a60-be1b-e782dab1c73c\") " pod="openstack/swift-ring-rebalance-gkjrc" Dec 02 23:15:15 crc kubenswrapper[4903]: I1202 23:15:15.302507 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm8h6\" (UniqueName: \"kubernetes.io/projected/f16a381a-80d3-4a60-be1b-e782dab1c73c-kube-api-access-qm8h6\") pod \"swift-ring-rebalance-gkjrc\" (UID: \"f16a381a-80d3-4a60-be1b-e782dab1c73c\") " pod="openstack/swift-ring-rebalance-gkjrc" Dec 02 23:15:15 crc kubenswrapper[4903]: I1202 23:15:15.302539 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f16a381a-80d3-4a60-be1b-e782dab1c73c-swiftconf\") pod \"swift-ring-rebalance-gkjrc\" (UID: \"f16a381a-80d3-4a60-be1b-e782dab1c73c\") " pod="openstack/swift-ring-rebalance-gkjrc" Dec 02 23:15:15 crc kubenswrapper[4903]: I1202 23:15:15.302576 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f16a381a-80d3-4a60-be1b-e782dab1c73c-combined-ca-bundle\") pod \"swift-ring-rebalance-gkjrc\" (UID: \"f16a381a-80d3-4a60-be1b-e782dab1c73c\") " pod="openstack/swift-ring-rebalance-gkjrc" Dec 02 23:15:15 crc kubenswrapper[4903]: I1202 23:15:15.302596 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f16a381a-80d3-4a60-be1b-e782dab1c73c-dispersionconf\") pod \"swift-ring-rebalance-gkjrc\" (UID: \"f16a381a-80d3-4a60-be1b-e782dab1c73c\") " pod="openstack/swift-ring-rebalance-gkjrc" Dec 02 23:15:15 crc kubenswrapper[4903]: I1202 23:15:15.303588 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f16a381a-80d3-4a60-be1b-e782dab1c73c-etc-swift\") pod \"swift-ring-rebalance-gkjrc\" (UID: \"f16a381a-80d3-4a60-be1b-e782dab1c73c\") " pod="openstack/swift-ring-rebalance-gkjrc" Dec 02 23:15:15 crc kubenswrapper[4903]: I1202 23:15:15.306147 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f16a381a-80d3-4a60-be1b-e782dab1c73c-dispersionconf\") pod 
\"swift-ring-rebalance-gkjrc\" (UID: \"f16a381a-80d3-4a60-be1b-e782dab1c73c\") " pod="openstack/swift-ring-rebalance-gkjrc" Dec 02 23:15:15 crc kubenswrapper[4903]: I1202 23:15:15.306433 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f16a381a-80d3-4a60-be1b-e782dab1c73c-ring-data-devices\") pod \"swift-ring-rebalance-gkjrc\" (UID: \"f16a381a-80d3-4a60-be1b-e782dab1c73c\") " pod="openstack/swift-ring-rebalance-gkjrc" Dec 02 23:15:15 crc kubenswrapper[4903]: I1202 23:15:15.313107 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f16a381a-80d3-4a60-be1b-e782dab1c73c-scripts\") pod \"swift-ring-rebalance-gkjrc\" (UID: \"f16a381a-80d3-4a60-be1b-e782dab1c73c\") " pod="openstack/swift-ring-rebalance-gkjrc" Dec 02 23:15:15 crc kubenswrapper[4903]: I1202 23:15:15.313235 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f16a381a-80d3-4a60-be1b-e782dab1c73c-swiftconf\") pod \"swift-ring-rebalance-gkjrc\" (UID: \"f16a381a-80d3-4a60-be1b-e782dab1c73c\") " pod="openstack/swift-ring-rebalance-gkjrc" Dec 02 23:15:15 crc kubenswrapper[4903]: I1202 23:15:15.313495 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f16a381a-80d3-4a60-be1b-e782dab1c73c-combined-ca-bundle\") pod \"swift-ring-rebalance-gkjrc\" (UID: \"f16a381a-80d3-4a60-be1b-e782dab1c73c\") " pod="openstack/swift-ring-rebalance-gkjrc" Dec 02 23:15:15 crc kubenswrapper[4903]: I1202 23:15:15.320850 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm8h6\" (UniqueName: \"kubernetes.io/projected/f16a381a-80d3-4a60-be1b-e782dab1c73c-kube-api-access-qm8h6\") pod \"swift-ring-rebalance-gkjrc\" (UID: \"f16a381a-80d3-4a60-be1b-e782dab1c73c\") " pod="openstack/swift-ring-rebalance-gkjrc" Dec 02 23:15:15 crc kubenswrapper[4903]: I1202 23:15:15.463533 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-gkjrc" Dec 02 23:15:15 crc kubenswrapper[4903]: I1202 23:15:15.754116 4903 scope.go:117] "RemoveContainer" containerID="430ab0affb66c05e809b294645b8c4b2b2d9fbc8ac9866975e0016ec5ec31077" Dec 02 23:15:15 crc kubenswrapper[4903]: I1202 23:15:15.856001 4903 scope.go:117] "RemoveContainer" containerID="017163d9552d79bb653cb026ad353e1ed889efcddb24998a37769b0f3515ba24" Dec 02 23:15:15 crc kubenswrapper[4903]: E1202 23:15:15.856362 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"017163d9552d79bb653cb026ad353e1ed889efcddb24998a37769b0f3515ba24\": container with ID starting with 017163d9552d79bb653cb026ad353e1ed889efcddb24998a37769b0f3515ba24 not found: ID does not exist" containerID="017163d9552d79bb653cb026ad353e1ed889efcddb24998a37769b0f3515ba24" Dec 02 23:15:15 crc kubenswrapper[4903]: I1202 23:15:15.856393 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"017163d9552d79bb653cb026ad353e1ed889efcddb24998a37769b0f3515ba24"} err="failed to get container status \"017163d9552d79bb653cb026ad353e1ed889efcddb24998a37769b0f3515ba24\": rpc error: code = NotFound desc = could not find container \"017163d9552d79bb653cb026ad353e1ed889efcddb24998a37769b0f3515ba24\": container with ID starting with 017163d9552d79bb653cb026ad353e1ed889efcddb24998a37769b0f3515ba24 not found: ID does not exist" Dec 02 23:15:15 crc kubenswrapper[4903]: I1202 23:15:15.856416 4903 scope.go:117] "RemoveContainer" containerID="430ab0affb66c05e809b294645b8c4b2b2d9fbc8ac9866975e0016ec5ec31077" Dec 02 23:15:15 crc kubenswrapper[4903]: E1202 23:15:15.857317 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"430ab0affb66c05e809b294645b8c4b2b2d9fbc8ac9866975e0016ec5ec31077\": container with ID starting with 430ab0affb66c05e809b294645b8c4b2b2d9fbc8ac9866975e0016ec5ec31077 not found: ID does not exist" containerID="430ab0affb66c05e809b294645b8c4b2b2d9fbc8ac9866975e0016ec5ec31077" Dec 02 23:15:15 crc kubenswrapper[4903]: I1202 23:15:15.857365 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"430ab0affb66c05e809b294645b8c4b2b2d9fbc8ac9866975e0016ec5ec31077"} err="failed to get container status \"430ab0affb66c05e809b294645b8c4b2b2d9fbc8ac9866975e0016ec5ec31077\": rpc error: code = NotFound desc = could not find container \"430ab0affb66c05e809b294645b8c4b2b2d9fbc8ac9866975e0016ec5ec31077\": container with ID starting with 430ab0affb66c05e809b294645b8c4b2b2d9fbc8ac9866975e0016ec5ec31077 not found: ID does not exist" Dec 02 23:15:16 crc kubenswrapper[4903]: I1202 23:15:16.064705 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-gkjrc"] Dec 02 23:15:16 crc kubenswrapper[4903]: I1202 23:15:16.222295 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/55e0eb4b-69b7-4845-84aa-77dae4384f32-etc-swift\") pod \"swift-storage-0\" (UID: \"55e0eb4b-69b7-4845-84aa-77dae4384f32\") " pod="openstack/swift-storage-0" Dec 02 23:15:16 crc kubenswrapper[4903]: E1202 23:15:16.222486 4903 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 23:15:16 crc kubenswrapper[4903]: E1202 23:15:16.222498 4903 projected.go:194] Error preparing data for projected volume etc-swift for 
pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 23:15:16 crc kubenswrapper[4903]: E1202 23:15:16.222537 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/55e0eb4b-69b7-4845-84aa-77dae4384f32-etc-swift podName:55e0eb4b-69b7-4845-84aa-77dae4384f32 nodeName:}" failed. No retries permitted until 2025-12-02 23:15:18.222524827 +0000 UTC m=+1056.931079110 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/55e0eb4b-69b7-4845-84aa-77dae4384f32-etc-swift") pod "swift-storage-0" (UID: "55e0eb4b-69b7-4845-84aa-77dae4384f32") : configmap "swift-ring-files" not found Dec 02 23:15:16 crc kubenswrapper[4903]: I1202 23:15:16.318460 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-mck9h"] Dec 02 23:15:16 crc kubenswrapper[4903]: W1202 23:15:16.324040 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod435a1343_d272_407f_9329_a6d1f481a22a.slice/crio-d12543d232ff0a303dd29f1cb03861d607843fcfbb800a60198d1bb69cc07792 WatchSource:0}: Error finding container d12543d232ff0a303dd29f1cb03861d607843fcfbb800a60198d1bb69cc07792: Status 404 returned error can't find the container with id d12543d232ff0a303dd29f1cb03861d607843fcfbb800a60198d1bb69cc07792 Dec 02 23:15:16 crc kubenswrapper[4903]: I1202 23:15:16.377533 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-554c7689cc-ngxvh"] Dec 02 23:15:16 crc kubenswrapper[4903]: I1202 23:15:16.388307 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-6742-account-create-update-8snbz"] Dec 02 23:15:16 crc kubenswrapper[4903]: W1202 23:15:16.404912 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5621cca_33f2_4de9_a39a_aca977548db7.slice/crio-e9485965e3f291a5d5ae1098dc0a7b4838a8eca96dd519510c954336946dd51c WatchSource:0}: Error finding container e9485965e3f291a5d5ae1098dc0a7b4838a8eca96dd519510c954336946dd51c: Status 404 returned error can't find the container with id e9485965e3f291a5d5ae1098dc0a7b4838a8eca96dd519510c954336946dd51c Dec 02 23:15:16 crc kubenswrapper[4903]: W1202 23:15:16.406569 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebba9302_4b1e_4073_83f4_505b43e2309c.slice/crio-939c203efb5e8160c29687d7767332a8bc4ed19694fb898610e1277c7f5e9e6a WatchSource:0}: Error finding container 939c203efb5e8160c29687d7767332a8bc4ed19694fb898610e1277c7f5e9e6a: Status 404 returned error can't find the container with id 939c203efb5e8160c29687d7767332a8bc4ed19694fb898610e1277c7f5e9e6a Dec 02 23:15:16 crc kubenswrapper[4903]: I1202 23:15:16.577962 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cf8363b9-18a6-48d7-993e-703ceccc7291","Type":"ContainerStarted","Data":"549f03176e1c9cd6246023d31eea53beaf5978f428985bb4496afa494a359b03"} Dec 02 23:15:16 crc kubenswrapper[4903]: I1202 23:15:16.579695 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-6742-account-create-update-8snbz" event={"ID":"ebba9302-4b1e-4073-83f4-505b43e2309c","Type":"ContainerStarted","Data":"939c203efb5e8160c29687d7767332a8bc4ed19694fb898610e1277c7f5e9e6a"} Dec 02 23:15:16 crc kubenswrapper[4903]: I1202 23:15:16.581834 4903 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"635dddd5-1a09-4f9e-b82f-e45eee76b412","Type":"ContainerStarted","Data":"94ce72e9bcd0159122346131f315bd5d707334090bd23e152ac0e2e81ed1cbfd"} Dec 02 23:15:16 crc kubenswrapper[4903]: I1202 23:15:16.587739 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-gkjrc" event={"ID":"f16a381a-80d3-4a60-be1b-e782dab1c73c","Type":"ContainerStarted","Data":"550f6559425bd2c4b558c9afde4b64c775f0b6f2da774377d44699972d7af21c"} Dec 02 23:15:16 crc kubenswrapper[4903]: I1202 23:15:16.589734 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-554c7689cc-ngxvh" event={"ID":"c5621cca-33f2-4de9-a39a-aca977548db7","Type":"ContainerStarted","Data":"e9485965e3f291a5d5ae1098dc0a7b4838a8eca96dd519510c954336946dd51c"} Dec 02 23:15:16 crc kubenswrapper[4903]: I1202 23:15:16.596297 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-mck9h" event={"ID":"435a1343-d272-407f-9329-a6d1f481a22a","Type":"ContainerStarted","Data":"c21ada723047edb2d29285b3ab0e52418e1b17d40ed45803ee1d7bea6945eaa9"} Dec 02 23:15:16 crc kubenswrapper[4903]: I1202 23:15:16.596348 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-mck9h" event={"ID":"435a1343-d272-407f-9329-a6d1f481a22a","Type":"ContainerStarted","Data":"d12543d232ff0a303dd29f1cb03861d607843fcfbb800a60198d1bb69cc07792"} Dec 02 23:15:16 crc kubenswrapper[4903]: I1202 23:15:16.607236 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=5.445919434 podStartE2EDuration="37.607217684s" podCreationTimestamp="2025-12-02 23:14:39 +0000 UTC" firstStartedPulling="2025-12-02 23:14:43.703606611 +0000 UTC m=+1022.412160894" lastFinishedPulling="2025-12-02 23:15:15.864904851 +0000 UTC m=+1054.573459144" observedRunningTime="2025-12-02 23:15:16.603333079 +0000 UTC m=+1055.311887362" watchObservedRunningTime="2025-12-02 23:15:16.607217684 +0000 UTC m=+1055.315771967" Dec 02 23:15:16 crc kubenswrapper[4903]: I1202 23:15:16.620399 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-create-mck9h" podStartSLOduration=3.620382995 podStartE2EDuration="3.620382995s" podCreationTimestamp="2025-12-02 23:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:15:16.617629588 +0000 UTC m=+1055.326183881" watchObservedRunningTime="2025-12-02 23:15:16.620382995 +0000 UTC m=+1055.328937268" Dec 02 23:15:16 crc kubenswrapper[4903]: I1202 23:15:16.834141 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 02 23:15:17 crc kubenswrapper[4903]: I1202 23:15:17.606408 4903 generic.go:334] "Generic (PLEG): container finished" podID="c5621cca-33f2-4de9-a39a-aca977548db7" containerID="879949e7e5f4b385f07d8da4b2a584ea45329f99a3477578649daa29cd41890a" exitCode=0 Dec 02 23:15:17 crc kubenswrapper[4903]: I1202 23:15:17.606848 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-554c7689cc-ngxvh" event={"ID":"c5621cca-33f2-4de9-a39a-aca977548db7","Type":"ContainerDied","Data":"879949e7e5f4b385f07d8da4b2a584ea45329f99a3477578649daa29cd41890a"} Dec 02 23:15:17 crc kubenswrapper[4903]: I1202 23:15:17.609148 4903 generic.go:334] "Generic (PLEG): container finished" podID="435a1343-d272-407f-9329-a6d1f481a22a" 
containerID="c21ada723047edb2d29285b3ab0e52418e1b17d40ed45803ee1d7bea6945eaa9" exitCode=0 Dec 02 23:15:17 crc kubenswrapper[4903]: I1202 23:15:17.609225 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-mck9h" event={"ID":"435a1343-d272-407f-9329-a6d1f481a22a","Type":"ContainerDied","Data":"c21ada723047edb2d29285b3ab0e52418e1b17d40ed45803ee1d7bea6945eaa9"} Dec 02 23:15:17 crc kubenswrapper[4903]: I1202 23:15:17.612223 4903 generic.go:334] "Generic (PLEG): container finished" podID="ebba9302-4b1e-4073-83f4-505b43e2309c" containerID="bc102bce8b7e6e0204c98f8462ed2631b352ac6bbfab1b9e754bcf308c08885f" exitCode=0 Dec 02 23:15:17 crc kubenswrapper[4903]: I1202 23:15:17.637683 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-6742-account-create-update-8snbz" event={"ID":"ebba9302-4b1e-4073-83f4-505b43e2309c","Type":"ContainerDied","Data":"bc102bce8b7e6e0204c98f8462ed2631b352ac6bbfab1b9e754bcf308c08885f"} Dec 02 23:15:18 crc kubenswrapper[4903]: I1202 23:15:18.263625 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/55e0eb4b-69b7-4845-84aa-77dae4384f32-etc-swift\") pod \"swift-storage-0\" (UID: \"55e0eb4b-69b7-4845-84aa-77dae4384f32\") " pod="openstack/swift-storage-0" Dec 02 23:15:18 crc kubenswrapper[4903]: E1202 23:15:18.263933 4903 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 23:15:18 crc kubenswrapper[4903]: E1202 23:15:18.264143 4903 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 23:15:18 crc kubenswrapper[4903]: E1202 23:15:18.264280 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/55e0eb4b-69b7-4845-84aa-77dae4384f32-etc-swift podName:55e0eb4b-69b7-4845-84aa-77dae4384f32 nodeName:}" failed. No retries permitted until 2025-12-02 23:15:22.264244954 +0000 UTC m=+1060.972799277 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/55e0eb4b-69b7-4845-84aa-77dae4384f32-etc-swift") pod "swift-storage-0" (UID: "55e0eb4b-69b7-4845-84aa-77dae4384f32") : configmap "swift-ring-files" not found Dec 02 23:15:18 crc kubenswrapper[4903]: I1202 23:15:18.623951 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cf8363b9-18a6-48d7-993e-703ceccc7291","Type":"ContainerStarted","Data":"37db9d225f60dc79e872e93e4627d2e3ef7e6cd467c048b8d56b9402afbdeac1"} Dec 02 23:15:19 crc kubenswrapper[4903]: I1202 23:15:19.903254 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 02 23:15:19 crc kubenswrapper[4903]: I1202 23:15:19.903885 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.158085 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-mzmcv"] Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.159536 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-mzmcv"] Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.159614 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-mzmcv" Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.228299 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-843c-account-create-update-kmlgq"] Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.229638 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-843c-account-create-update-kmlgq" Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.232492 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.234901 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-843c-account-create-update-kmlgq"] Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.323043 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8653f5e9-f817-4962-a725-5acc5a161f29-operator-scripts\") pod \"keystone-843c-account-create-update-kmlgq\" (UID: \"8653f5e9-f817-4962-a725-5acc5a161f29\") " pod="openstack/keystone-843c-account-create-update-kmlgq" Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.323167 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/255fbd82-365f-4670-81d4-c173abf6c67b-operator-scripts\") pod \"keystone-db-create-mzmcv\" (UID: \"255fbd82-365f-4670-81d4-c173abf6c67b\") " pod="openstack/keystone-db-create-mzmcv" Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.323235 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x4c2\" (UniqueName: \"kubernetes.io/projected/255fbd82-365f-4670-81d4-c173abf6c67b-kube-api-access-2x4c2\") pod \"keystone-db-create-mzmcv\" (UID: \"255fbd82-365f-4670-81d4-c173abf6c67b\") " pod="openstack/keystone-db-create-mzmcv" Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.323407 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brv5s\" (UniqueName: \"kubernetes.io/projected/8653f5e9-f817-4962-a725-5acc5a161f29-kube-api-access-brv5s\") pod \"keystone-843c-account-create-update-kmlgq\" (UID: \"8653f5e9-f817-4962-a725-5acc5a161f29\") " pod="openstack/keystone-843c-account-create-update-kmlgq" Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.425179 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8653f5e9-f817-4962-a725-5acc5a161f29-operator-scripts\") pod \"keystone-843c-account-create-update-kmlgq\" (UID: \"8653f5e9-f817-4962-a725-5acc5a161f29\") " pod="openstack/keystone-843c-account-create-update-kmlgq" Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.425293 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/255fbd82-365f-4670-81d4-c173abf6c67b-operator-scripts\") pod \"keystone-db-create-mzmcv\" (UID: \"255fbd82-365f-4670-81d4-c173abf6c67b\") " pod="openstack/keystone-db-create-mzmcv" Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.425348 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x4c2\" (UniqueName: 
\"kubernetes.io/projected/255fbd82-365f-4670-81d4-c173abf6c67b-kube-api-access-2x4c2\") pod \"keystone-db-create-mzmcv\" (UID: \"255fbd82-365f-4670-81d4-c173abf6c67b\") " pod="openstack/keystone-db-create-mzmcv" Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.425519 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brv5s\" (UniqueName: \"kubernetes.io/projected/8653f5e9-f817-4962-a725-5acc5a161f29-kube-api-access-brv5s\") pod \"keystone-843c-account-create-update-kmlgq\" (UID: \"8653f5e9-f817-4962-a725-5acc5a161f29\") " pod="openstack/keystone-843c-account-create-update-kmlgq" Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.426459 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8653f5e9-f817-4962-a725-5acc5a161f29-operator-scripts\") pod \"keystone-843c-account-create-update-kmlgq\" (UID: \"8653f5e9-f817-4962-a725-5acc5a161f29\") " pod="openstack/keystone-843c-account-create-update-kmlgq" Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.427589 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/255fbd82-365f-4670-81d4-c173abf6c67b-operator-scripts\") pod \"keystone-db-create-mzmcv\" (UID: \"255fbd82-365f-4670-81d4-c173abf6c67b\") " pod="openstack/keystone-db-create-mzmcv" Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.446156 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x4c2\" (UniqueName: \"kubernetes.io/projected/255fbd82-365f-4670-81d4-c173abf6c67b-kube-api-access-2x4c2\") pod \"keystone-db-create-mzmcv\" (UID: \"255fbd82-365f-4670-81d4-c173abf6c67b\") " pod="openstack/keystone-db-create-mzmcv" Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.454987 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brv5s\" (UniqueName: \"kubernetes.io/projected/8653f5e9-f817-4962-a725-5acc5a161f29-kube-api-access-brv5s\") pod \"keystone-843c-account-create-update-kmlgq\" (UID: \"8653f5e9-f817-4962-a725-5acc5a161f29\") " pod="openstack/keystone-843c-account-create-update-kmlgq" Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.521121 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mzmcv" Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.555154 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-843c-account-create-update-kmlgq" Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.581953 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-s22jn"] Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.583171 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-s22jn" Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.612133 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-s22jn"] Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.686011 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-e73c-account-create-update-twcq7"] Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.687363 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-e73c-account-create-update-twcq7" Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.691383 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.701724 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-e73c-account-create-update-twcq7"] Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.730939 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpwwg\" (UniqueName: \"kubernetes.io/projected/3afc5a61-8a6e-46a3-b593-7b26bcfa855e-kube-api-access-fpwwg\") pod \"placement-db-create-s22jn\" (UID: \"3afc5a61-8a6e-46a3-b593-7b26bcfa855e\") " pod="openstack/placement-db-create-s22jn" Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.731211 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3afc5a61-8a6e-46a3-b593-7b26bcfa855e-operator-scripts\") pod \"placement-db-create-s22jn\" (UID: \"3afc5a61-8a6e-46a3-b593-7b26bcfa855e\") " pod="openstack/placement-db-create-s22jn" Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.834038 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hftf\" (UniqueName: \"kubernetes.io/projected/02d56e27-883f-492d-bb5a-ddf83ea2c78e-kube-api-access-8hftf\") pod \"placement-e73c-account-create-update-twcq7\" (UID: \"02d56e27-883f-492d-bb5a-ddf83ea2c78e\") " pod="openstack/placement-e73c-account-create-update-twcq7" Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.834160 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3afc5a61-8a6e-46a3-b593-7b26bcfa855e-operator-scripts\") pod \"placement-db-create-s22jn\" (UID: \"3afc5a61-8a6e-46a3-b593-7b26bcfa855e\") " pod="openstack/placement-db-create-s22jn" Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.834381 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpwwg\" (UniqueName: \"kubernetes.io/projected/3afc5a61-8a6e-46a3-b593-7b26bcfa855e-kube-api-access-fpwwg\") pod \"placement-db-create-s22jn\" (UID: \"3afc5a61-8a6e-46a3-b593-7b26bcfa855e\") " pod="openstack/placement-db-create-s22jn" Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.834426 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02d56e27-883f-492d-bb5a-ddf83ea2c78e-operator-scripts\") pod \"placement-e73c-account-create-update-twcq7\" (UID: \"02d56e27-883f-492d-bb5a-ddf83ea2c78e\") " pod="openstack/placement-e73c-account-create-update-twcq7" Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.836597 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3afc5a61-8a6e-46a3-b593-7b26bcfa855e-operator-scripts\") pod \"placement-db-create-s22jn\" (UID: \"3afc5a61-8a6e-46a3-b593-7b26bcfa855e\") " pod="openstack/placement-db-create-s22jn" Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.864572 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpwwg\" (UniqueName: 
\"kubernetes.io/projected/3afc5a61-8a6e-46a3-b593-7b26bcfa855e-kube-api-access-fpwwg\") pod \"placement-db-create-s22jn\" (UID: \"3afc5a61-8a6e-46a3-b593-7b26bcfa855e\") " pod="openstack/placement-db-create-s22jn" Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.907378 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-s22jn" Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.936191 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02d56e27-883f-492d-bb5a-ddf83ea2c78e-operator-scripts\") pod \"placement-e73c-account-create-update-twcq7\" (UID: \"02d56e27-883f-492d-bb5a-ddf83ea2c78e\") " pod="openstack/placement-e73c-account-create-update-twcq7" Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.936874 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02d56e27-883f-492d-bb5a-ddf83ea2c78e-operator-scripts\") pod \"placement-e73c-account-create-update-twcq7\" (UID: \"02d56e27-883f-492d-bb5a-ddf83ea2c78e\") " pod="openstack/placement-e73c-account-create-update-twcq7" Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.937454 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hftf\" (UniqueName: \"kubernetes.io/projected/02d56e27-883f-492d-bb5a-ddf83ea2c78e-kube-api-access-8hftf\") pod \"placement-e73c-account-create-update-twcq7\" (UID: \"02d56e27-883f-492d-bb5a-ddf83ea2c78e\") " pod="openstack/placement-e73c-account-create-update-twcq7" Dec 02 23:15:21 crc kubenswrapper[4903]: I1202 23:15:21.955673 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hftf\" (UniqueName: \"kubernetes.io/projected/02d56e27-883f-492d-bb5a-ddf83ea2c78e-kube-api-access-8hftf\") pod \"placement-e73c-account-create-update-twcq7\" (UID: \"02d56e27-883f-492d-bb5a-ddf83ea2c78e\") " pod="openstack/placement-e73c-account-create-update-twcq7" Dec 02 23:15:22 crc kubenswrapper[4903]: I1202 23:15:22.003822 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-e73c-account-create-update-twcq7" Dec 02 23:15:22 crc kubenswrapper[4903]: I1202 23:15:22.344230 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/55e0eb4b-69b7-4845-84aa-77dae4384f32-etc-swift\") pod \"swift-storage-0\" (UID: \"55e0eb4b-69b7-4845-84aa-77dae4384f32\") " pod="openstack/swift-storage-0" Dec 02 23:15:22 crc kubenswrapper[4903]: E1202 23:15:22.344441 4903 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 23:15:22 crc kubenswrapper[4903]: E1202 23:15:22.344669 4903 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 23:15:22 crc kubenswrapper[4903]: E1202 23:15:22.344731 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/55e0eb4b-69b7-4845-84aa-77dae4384f32-etc-swift podName:55e0eb4b-69b7-4845-84aa-77dae4384f32 nodeName:}" failed. No retries permitted until 2025-12-02 23:15:30.344711174 +0000 UTC m=+1069.053265457 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/55e0eb4b-69b7-4845-84aa-77dae4384f32-etc-swift") pod "swift-storage-0" (UID: "55e0eb4b-69b7-4845-84aa-77dae4384f32") : configmap "swift-ring-files" not found Dec 02 23:15:23 crc kubenswrapper[4903]: I1202 23:15:23.472807 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 02 23:15:24 crc kubenswrapper[4903]: I1202 23:15:24.680162 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-mck9h" Dec 02 23:15:24 crc kubenswrapper[4903]: I1202 23:15:24.703594 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-6742-account-create-update-8snbz" event={"ID":"ebba9302-4b1e-4073-83f4-505b43e2309c","Type":"ContainerDied","Data":"939c203efb5e8160c29687d7767332a8bc4ed19694fb898610e1277c7f5e9e6a"} Dec 02 23:15:24 crc kubenswrapper[4903]: I1202 23:15:24.703637 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="939c203efb5e8160c29687d7767332a8bc4ed19694fb898610e1277c7f5e9e6a" Dec 02 23:15:24 crc kubenswrapper[4903]: I1202 23:15:24.703643 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-6742-account-create-update-8snbz" Dec 02 23:15:24 crc kubenswrapper[4903]: I1202 23:15:24.707835 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-mck9h" event={"ID":"435a1343-d272-407f-9329-a6d1f481a22a","Type":"ContainerDied","Data":"d12543d232ff0a303dd29f1cb03861d607843fcfbb800a60198d1bb69cc07792"} Dec 02 23:15:24 crc kubenswrapper[4903]: I1202 23:15:24.707855 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d12543d232ff0a303dd29f1cb03861d607843fcfbb800a60198d1bb69cc07792" Dec 02 23:15:24 crc kubenswrapper[4903]: I1202 23:15:24.707875 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-mck9h" Dec 02 23:15:24 crc kubenswrapper[4903]: I1202 23:15:24.803968 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/435a1343-d272-407f-9329-a6d1f481a22a-operator-scripts\") pod \"435a1343-d272-407f-9329-a6d1f481a22a\" (UID: \"435a1343-d272-407f-9329-a6d1f481a22a\") " Dec 02 23:15:24 crc kubenswrapper[4903]: I1202 23:15:24.804095 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfg7z\" (UniqueName: \"kubernetes.io/projected/ebba9302-4b1e-4073-83f4-505b43e2309c-kube-api-access-xfg7z\") pod \"ebba9302-4b1e-4073-83f4-505b43e2309c\" (UID: \"ebba9302-4b1e-4073-83f4-505b43e2309c\") " Dec 02 23:15:24 crc kubenswrapper[4903]: I1202 23:15:24.804118 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fgs8\" (UniqueName: \"kubernetes.io/projected/435a1343-d272-407f-9329-a6d1f481a22a-kube-api-access-8fgs8\") pod \"435a1343-d272-407f-9329-a6d1f481a22a\" (UID: \"435a1343-d272-407f-9329-a6d1f481a22a\") " Dec 02 23:15:24 crc kubenswrapper[4903]: I1202 23:15:24.804236 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebba9302-4b1e-4073-83f4-505b43e2309c-operator-scripts\") pod \"ebba9302-4b1e-4073-83f4-505b43e2309c\" (UID: \"ebba9302-4b1e-4073-83f4-505b43e2309c\") " Dec 02 23:15:24 crc kubenswrapper[4903]: I1202 23:15:24.805685 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebba9302-4b1e-4073-83f4-505b43e2309c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ebba9302-4b1e-4073-83f4-505b43e2309c" (UID: "ebba9302-4b1e-4073-83f4-505b43e2309c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:15:24 crc kubenswrapper[4903]: I1202 23:15:24.805777 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/435a1343-d272-407f-9329-a6d1f481a22a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "435a1343-d272-407f-9329-a6d1f481a22a" (UID: "435a1343-d272-407f-9329-a6d1f481a22a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:15:24 crc kubenswrapper[4903]: I1202 23:15:24.810848 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebba9302-4b1e-4073-83f4-505b43e2309c-kube-api-access-xfg7z" (OuterVolumeSpecName: "kube-api-access-xfg7z") pod "ebba9302-4b1e-4073-83f4-505b43e2309c" (UID: "ebba9302-4b1e-4073-83f4-505b43e2309c"). InnerVolumeSpecName "kube-api-access-xfg7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:15:24 crc kubenswrapper[4903]: I1202 23:15:24.819846 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/435a1343-d272-407f-9329-a6d1f481a22a-kube-api-access-8fgs8" (OuterVolumeSpecName: "kube-api-access-8fgs8") pod "435a1343-d272-407f-9329-a6d1f481a22a" (UID: "435a1343-d272-407f-9329-a6d1f481a22a"). InnerVolumeSpecName "kube-api-access-8fgs8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:15:24 crc kubenswrapper[4903]: I1202 23:15:24.907960 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebba9302-4b1e-4073-83f4-505b43e2309c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:24 crc kubenswrapper[4903]: I1202 23:15:24.908007 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/435a1343-d272-407f-9329-a6d1f481a22a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:24 crc kubenswrapper[4903]: I1202 23:15:24.908023 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfg7z\" (UniqueName: \"kubernetes.io/projected/ebba9302-4b1e-4073-83f4-505b43e2309c-kube-api-access-xfg7z\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:24 crc kubenswrapper[4903]: I1202 23:15:24.908040 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fgs8\" (UniqueName: \"kubernetes.io/projected/435a1343-d272-407f-9329-a6d1f481a22a-kube-api-access-8fgs8\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:24 crc kubenswrapper[4903]: I1202 23:15:24.923409 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-mzmcv"] Dec 02 23:15:24 crc kubenswrapper[4903]: W1202 23:15:24.928275 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod255fbd82_365f_4670_81d4_c173abf6c67b.slice/crio-e1cd58fe05b6922984515c48e8350c430647e2ab74b2913baeb107ffc75426c0 WatchSource:0}: Error finding container e1cd58fe05b6922984515c48e8350c430647e2ab74b2913baeb107ffc75426c0: Status 404 returned error can't find the container with id e1cd58fe05b6922984515c48e8350c430647e2ab74b2913baeb107ffc75426c0 Dec 02 23:15:25 crc kubenswrapper[4903]: I1202 23:15:25.006612 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-s22jn"] Dec 02 23:15:25 crc kubenswrapper[4903]: W1202 23:15:25.027777 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3afc5a61_8a6e_46a3_b593_7b26bcfa855e.slice/crio-14115cdcf7318dc5ddcd568c5b8be88f137cd41a4899a249b2c1831bcf37a3ac WatchSource:0}: Error finding container 14115cdcf7318dc5ddcd568c5b8be88f137cd41a4899a249b2c1831bcf37a3ac: Status 404 returned error can't find the container with id 14115cdcf7318dc5ddcd568c5b8be88f137cd41a4899a249b2c1831bcf37a3ac Dec 02 23:15:25 crc kubenswrapper[4903]: I1202 23:15:25.057229 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-e73c-account-create-update-twcq7"] Dec 02 23:15:25 crc kubenswrapper[4903]: W1202 23:15:25.065844 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02d56e27_883f_492d_bb5a_ddf83ea2c78e.slice/crio-a5c03c70d40fe06d96c633ec07c784900edd5c6655d576057ce9f02af649f61f WatchSource:0}: Error finding container a5c03c70d40fe06d96c633ec07c784900edd5c6655d576057ce9f02af649f61f: Status 404 returned error can't find the container with id a5c03c70d40fe06d96c633ec07c784900edd5c6655d576057ce9f02af649f61f Dec 02 23:15:25 crc kubenswrapper[4903]: I1202 23:15:25.123261 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-843c-account-create-update-kmlgq"] Dec 02 23:15:25 crc kubenswrapper[4903]: W1202 23:15:25.123345 4903 manager.go:1169] Failed 
to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8653f5e9_f817_4962_a725_5acc5a161f29.slice/crio-c43d0e789d4d2a316902199139594529e8155e06ea2b187778f4140c0cc033ab WatchSource:0}: Error finding container c43d0e789d4d2a316902199139594529e8155e06ea2b187778f4140c0cc033ab: Status 404 returned error can't find the container with id c43d0e789d4d2a316902199139594529e8155e06ea2b187778f4140c0cc033ab Dec 02 23:15:25 crc kubenswrapper[4903]: I1202 23:15:25.719923 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-843c-account-create-update-kmlgq" event={"ID":"8653f5e9-f817-4962-a725-5acc5a161f29","Type":"ContainerStarted","Data":"c43d0e789d4d2a316902199139594529e8155e06ea2b187778f4140c0cc033ab"} Dec 02 23:15:25 crc kubenswrapper[4903]: I1202 23:15:25.722378 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mzmcv" event={"ID":"255fbd82-365f-4670-81d4-c173abf6c67b","Type":"ContainerStarted","Data":"e1cd58fe05b6922984515c48e8350c430647e2ab74b2913baeb107ffc75426c0"} Dec 02 23:15:25 crc kubenswrapper[4903]: I1202 23:15:25.724610 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e73c-account-create-update-twcq7" event={"ID":"02d56e27-883f-492d-bb5a-ddf83ea2c78e","Type":"ContainerStarted","Data":"a5c03c70d40fe06d96c633ec07c784900edd5c6655d576057ce9f02af649f61f"} Dec 02 23:15:25 crc kubenswrapper[4903]: I1202 23:15:25.726074 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-6742-account-create-update-8snbz" Dec 02 23:15:25 crc kubenswrapper[4903]: I1202 23:15:25.726855 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-s22jn" event={"ID":"3afc5a61-8a6e-46a3-b593-7b26bcfa855e","Type":"ContainerStarted","Data":"14115cdcf7318dc5ddcd568c5b8be88f137cd41a4899a249b2c1831bcf37a3ac"} Dec 02 23:15:25 crc kubenswrapper[4903]: I1202 23:15:25.893986 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.165547 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 02 23:15:26 crc kubenswrapper[4903]: E1202 23:15:26.166164 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebba9302-4b1e-4073-83f4-505b43e2309c" containerName="mariadb-account-create-update" Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.166182 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebba9302-4b1e-4073-83f4-505b43e2309c" containerName="mariadb-account-create-update" Dec 02 23:15:26 crc kubenswrapper[4903]: E1202 23:15:26.166206 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="435a1343-d272-407f-9329-a6d1f481a22a" containerName="mariadb-database-create" Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.166215 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="435a1343-d272-407f-9329-a6d1f481a22a" containerName="mariadb-database-create" Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.166412 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="435a1343-d272-407f-9329-a6d1f481a22a" containerName="mariadb-database-create" Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.166433 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebba9302-4b1e-4073-83f4-505b43e2309c" containerName="mariadb-account-create-update" Dec 02 23:15:26 crc kubenswrapper[4903]: 
I1202 23:15:26.167278 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.175228 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.175239 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.178545 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.178769 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-xxk8r" Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.190557 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.333143 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/adcf8345-41bb-495c-a006-573f6afe5af9-config\") pod \"ovn-northd-0\" (UID: \"adcf8345-41bb-495c-a006-573f6afe5af9\") " pod="openstack/ovn-northd-0" Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.333326 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/adcf8345-41bb-495c-a006-573f6afe5af9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"adcf8345-41bb-495c-a006-573f6afe5af9\") " pod="openstack/ovn-northd-0" Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.333615 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adcf8345-41bb-495c-a006-573f6afe5af9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"adcf8345-41bb-495c-a006-573f6afe5af9\") " pod="openstack/ovn-northd-0" Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.333733 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/adcf8345-41bb-495c-a006-573f6afe5af9-scripts\") pod \"ovn-northd-0\" (UID: \"adcf8345-41bb-495c-a006-573f6afe5af9\") " pod="openstack/ovn-northd-0" Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.333785 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/adcf8345-41bb-495c-a006-573f6afe5af9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"adcf8345-41bb-495c-a006-573f6afe5af9\") " pod="openstack/ovn-northd-0" Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.333818 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76q92\" (UniqueName: \"kubernetes.io/projected/adcf8345-41bb-495c-a006-573f6afe5af9-kube-api-access-76q92\") pod \"ovn-northd-0\" (UID: \"adcf8345-41bb-495c-a006-573f6afe5af9\") " pod="openstack/ovn-northd-0" Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.333846 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/adcf8345-41bb-495c-a006-573f6afe5af9-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"adcf8345-41bb-495c-a006-573f6afe5af9\") " pod="openstack/ovn-northd-0" Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.434833 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/adcf8345-41bb-495c-a006-573f6afe5af9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"adcf8345-41bb-495c-a006-573f6afe5af9\") " pod="openstack/ovn-northd-0" Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.434875 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adcf8345-41bb-495c-a006-573f6afe5af9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"adcf8345-41bb-495c-a006-573f6afe5af9\") " pod="openstack/ovn-northd-0" Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.434957 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/adcf8345-41bb-495c-a006-573f6afe5af9-scripts\") pod \"ovn-northd-0\" (UID: \"adcf8345-41bb-495c-a006-573f6afe5af9\") " pod="openstack/ovn-northd-0" Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.434985 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/adcf8345-41bb-495c-a006-573f6afe5af9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"adcf8345-41bb-495c-a006-573f6afe5af9\") " pod="openstack/ovn-northd-0" Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.435005 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76q92\" (UniqueName: \"kubernetes.io/projected/adcf8345-41bb-495c-a006-573f6afe5af9-kube-api-access-76q92\") pod \"ovn-northd-0\" (UID: \"adcf8345-41bb-495c-a006-573f6afe5af9\") " pod="openstack/ovn-northd-0" Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.435026 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/adcf8345-41bb-495c-a006-573f6afe5af9-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"adcf8345-41bb-495c-a006-573f6afe5af9\") " pod="openstack/ovn-northd-0" Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.435093 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/adcf8345-41bb-495c-a006-573f6afe5af9-config\") pod \"ovn-northd-0\" (UID: \"adcf8345-41bb-495c-a006-573f6afe5af9\") " pod="openstack/ovn-northd-0" Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.435778 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/adcf8345-41bb-495c-a006-573f6afe5af9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"adcf8345-41bb-495c-a006-573f6afe5af9\") " pod="openstack/ovn-northd-0" Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.435995 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/adcf8345-41bb-495c-a006-573f6afe5af9-config\") pod \"ovn-northd-0\" (UID: \"adcf8345-41bb-495c-a006-573f6afe5af9\") " pod="openstack/ovn-northd-0" Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.436255 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/adcf8345-41bb-495c-a006-573f6afe5af9-scripts\") pod \"ovn-northd-0\" (UID: \"adcf8345-41bb-495c-a006-573f6afe5af9\") 
" pod="openstack/ovn-northd-0" Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.441133 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adcf8345-41bb-495c-a006-573f6afe5af9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"adcf8345-41bb-495c-a006-573f6afe5af9\") " pod="openstack/ovn-northd-0" Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.442216 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/adcf8345-41bb-495c-a006-573f6afe5af9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"adcf8345-41bb-495c-a006-573f6afe5af9\") " pod="openstack/ovn-northd-0" Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.442687 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/adcf8345-41bb-495c-a006-573f6afe5af9-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"adcf8345-41bb-495c-a006-573f6afe5af9\") " pod="openstack/ovn-northd-0" Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.455201 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76q92\" (UniqueName: \"kubernetes.io/projected/adcf8345-41bb-495c-a006-573f6afe5af9-kube-api-access-76q92\") pod \"ovn-northd-0\" (UID: \"adcf8345-41bb-495c-a006-573f6afe5af9\") " pod="openstack/ovn-northd-0" Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.545229 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.765490 4903 generic.go:334] "Generic (PLEG): container finished" podID="255fbd82-365f-4670-81d4-c173abf6c67b" containerID="f0f79efabf05b2c52cf1849f53aedfa9f23a87ddab1faf71a8b1e847c3b42862" exitCode=0 Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.765561 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mzmcv" event={"ID":"255fbd82-365f-4670-81d4-c173abf6c67b","Type":"ContainerDied","Data":"f0f79efabf05b2c52cf1849f53aedfa9f23a87ddab1faf71a8b1e847c3b42862"} Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.777334 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-554c7689cc-ngxvh" event={"ID":"c5621cca-33f2-4de9-a39a-aca977548db7","Type":"ContainerStarted","Data":"28de3ccf8e190207ab0a7ce8feb947bd1a78d0e8e04db6490ca9cac4bb68094c"} Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.777694 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-554c7689cc-ngxvh" Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.780601 4903 generic.go:334] "Generic (PLEG): container finished" podID="02d56e27-883f-492d-bb5a-ddf83ea2c78e" containerID="f0be7d97cf17faf4250f3dac4cf157bafcbe8291fe459bbb3e3629a37c1f3b80" exitCode=0 Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.780755 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e73c-account-create-update-twcq7" event={"ID":"02d56e27-883f-492d-bb5a-ddf83ea2c78e","Type":"ContainerDied","Data":"f0be7d97cf17faf4250f3dac4cf157bafcbe8291fe459bbb3e3629a37c1f3b80"} Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.787310 4903 generic.go:334] "Generic (PLEG): container finished" podID="3afc5a61-8a6e-46a3-b593-7b26bcfa855e" containerID="d2545e57af6266debf47fc4fcd1f2dae0a1d473c62ce84334fc9d25f7dd4a4c7" exitCode=0 Dec 02 23:15:26 crc 
kubenswrapper[4903]: I1202 23:15:26.787359 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-s22jn" event={"ID":"3afc5a61-8a6e-46a3-b593-7b26bcfa855e","Type":"ContainerDied","Data":"d2545e57af6266debf47fc4fcd1f2dae0a1d473c62ce84334fc9d25f7dd4a4c7"} Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.791093 4903 generic.go:334] "Generic (PLEG): container finished" podID="8653f5e9-f817-4962-a725-5acc5a161f29" containerID="2c417e13ece4ddd066f0bbe8eacb021511a8d8f6d51ba126f6f651eb2dbdb85d" exitCode=0 Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.791148 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-843c-account-create-update-kmlgq" event={"ID":"8653f5e9-f817-4962-a725-5acc5a161f29","Type":"ContainerDied","Data":"2c417e13ece4ddd066f0bbe8eacb021511a8d8f6d51ba126f6f651eb2dbdb85d"} Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.793466 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-gkjrc" event={"ID":"f16a381a-80d3-4a60-be1b-e782dab1c73c","Type":"ContainerStarted","Data":"fa5d425ded9e1af5f34a7b25c7d8cd2258f1e824b0467a48c778700902386e02"} Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.808736 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-554c7689cc-ngxvh" podStartSLOduration=13.808720044 podStartE2EDuration="13.808720044s" podCreationTimestamp="2025-12-02 23:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:15:26.808103068 +0000 UTC m=+1065.516657351" watchObservedRunningTime="2025-12-02 23:15:26.808720044 +0000 UTC m=+1065.517274327" Dec 02 23:15:26 crc kubenswrapper[4903]: I1202 23:15:26.871687 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-gkjrc" podStartSLOduration=1.918605691 podStartE2EDuration="11.87167138s" podCreationTimestamp="2025-12-02 23:15:15 +0000 UTC" firstStartedPulling="2025-12-02 23:15:16.071567854 +0000 UTC m=+1054.780122137" lastFinishedPulling="2025-12-02 23:15:26.024633523 +0000 UTC m=+1064.733187826" observedRunningTime="2025-12-02 23:15:26.864689 +0000 UTC m=+1065.573243283" watchObservedRunningTime="2025-12-02 23:15:26.87167138 +0000 UTC m=+1065.580225663" Dec 02 23:15:27 crc kubenswrapper[4903]: I1202 23:15:27.020640 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 02 23:15:27 crc kubenswrapper[4903]: W1202 23:15:27.040847 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadcf8345_41bb_495c_a006_573f6afe5af9.slice/crio-712185a26ad5925419b3fad23f03d2fce2c15864c1f1e7ac4fc267068fafbf15 WatchSource:0}: Error finding container 712185a26ad5925419b3fad23f03d2fce2c15864c1f1e7ac4fc267068fafbf15: Status 404 returned error can't find the container with id 712185a26ad5925419b3fad23f03d2fce2c15864c1f1e7ac4fc267068fafbf15 Dec 02 23:15:27 crc kubenswrapper[4903]: I1202 23:15:27.806149 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"adcf8345-41bb-495c-a006-573f6afe5af9","Type":"ContainerStarted","Data":"712185a26ad5925419b3fad23f03d2fce2c15864c1f1e7ac4fc267068fafbf15"} Dec 02 23:15:28 crc kubenswrapper[4903]: I1202 23:15:28.816209 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-843c-account-create-update-kmlgq" 
event={"ID":"8653f5e9-f817-4962-a725-5acc5a161f29","Type":"ContainerDied","Data":"c43d0e789d4d2a316902199139594529e8155e06ea2b187778f4140c0cc033ab"} Dec 02 23:15:28 crc kubenswrapper[4903]: I1202 23:15:28.816533 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c43d0e789d4d2a316902199139594529e8155e06ea2b187778f4140c0cc033ab" Dec 02 23:15:28 crc kubenswrapper[4903]: I1202 23:15:28.819506 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e73c-account-create-update-twcq7" event={"ID":"02d56e27-883f-492d-bb5a-ddf83ea2c78e","Type":"ContainerDied","Data":"a5c03c70d40fe06d96c633ec07c784900edd5c6655d576057ce9f02af649f61f"} Dec 02 23:15:28 crc kubenswrapper[4903]: I1202 23:15:28.819558 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5c03c70d40fe06d96c633ec07c784900edd5c6655d576057ce9f02af649f61f" Dec 02 23:15:28 crc kubenswrapper[4903]: I1202 23:15:28.958081 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-e73c-account-create-update-twcq7" Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.002839 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-843c-account-create-update-kmlgq" Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.018549 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-s22jn" Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.025265 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mzmcv" Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.090784 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02d56e27-883f-492d-bb5a-ddf83ea2c78e-operator-scripts\") pod \"02d56e27-883f-492d-bb5a-ddf83ea2c78e\" (UID: \"02d56e27-883f-492d-bb5a-ddf83ea2c78e\") " Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.091073 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hftf\" (UniqueName: \"kubernetes.io/projected/02d56e27-883f-492d-bb5a-ddf83ea2c78e-kube-api-access-8hftf\") pod \"02d56e27-883f-492d-bb5a-ddf83ea2c78e\" (UID: \"02d56e27-883f-492d-bb5a-ddf83ea2c78e\") " Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.093320 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02d56e27-883f-492d-bb5a-ddf83ea2c78e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "02d56e27-883f-492d-bb5a-ddf83ea2c78e" (UID: "02d56e27-883f-492d-bb5a-ddf83ea2c78e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.097000 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02d56e27-883f-492d-bb5a-ddf83ea2c78e-kube-api-access-8hftf" (OuterVolumeSpecName: "kube-api-access-8hftf") pod "02d56e27-883f-492d-bb5a-ddf83ea2c78e" (UID: "02d56e27-883f-492d-bb5a-ddf83ea2c78e"). InnerVolumeSpecName "kube-api-access-8hftf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.192292 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8653f5e9-f817-4962-a725-5acc5a161f29-operator-scripts\") pod \"8653f5e9-f817-4962-a725-5acc5a161f29\" (UID: \"8653f5e9-f817-4962-a725-5acc5a161f29\") " Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.192379 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpwwg\" (UniqueName: \"kubernetes.io/projected/3afc5a61-8a6e-46a3-b593-7b26bcfa855e-kube-api-access-fpwwg\") pod \"3afc5a61-8a6e-46a3-b593-7b26bcfa855e\" (UID: \"3afc5a61-8a6e-46a3-b593-7b26bcfa855e\") " Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.192441 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3afc5a61-8a6e-46a3-b593-7b26bcfa855e-operator-scripts\") pod \"3afc5a61-8a6e-46a3-b593-7b26bcfa855e\" (UID: \"3afc5a61-8a6e-46a3-b593-7b26bcfa855e\") " Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.192497 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brv5s\" (UniqueName: \"kubernetes.io/projected/8653f5e9-f817-4962-a725-5acc5a161f29-kube-api-access-brv5s\") pod \"8653f5e9-f817-4962-a725-5acc5a161f29\" (UID: \"8653f5e9-f817-4962-a725-5acc5a161f29\") " Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.192745 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/255fbd82-365f-4670-81d4-c173abf6c67b-operator-scripts\") pod \"255fbd82-365f-4670-81d4-c173abf6c67b\" (UID: \"255fbd82-365f-4670-81d4-c173abf6c67b\") " Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.192792 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2x4c2\" (UniqueName: \"kubernetes.io/projected/255fbd82-365f-4670-81d4-c173abf6c67b-kube-api-access-2x4c2\") pod \"255fbd82-365f-4670-81d4-c173abf6c67b\" (UID: \"255fbd82-365f-4670-81d4-c173abf6c67b\") " Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.193114 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3afc5a61-8a6e-46a3-b593-7b26bcfa855e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3afc5a61-8a6e-46a3-b593-7b26bcfa855e" (UID: "3afc5a61-8a6e-46a3-b593-7b26bcfa855e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.193147 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8653f5e9-f817-4962-a725-5acc5a161f29-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8653f5e9-f817-4962-a725-5acc5a161f29" (UID: "8653f5e9-f817-4962-a725-5acc5a161f29"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.193435 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/255fbd82-365f-4670-81d4-c173abf6c67b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "255fbd82-365f-4670-81d4-c173abf6c67b" (UID: "255fbd82-365f-4670-81d4-c173abf6c67b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.193494 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8653f5e9-f817-4962-a725-5acc5a161f29-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.193517 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02d56e27-883f-492d-bb5a-ddf83ea2c78e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.193529 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3afc5a61-8a6e-46a3-b593-7b26bcfa855e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.193540 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hftf\" (UniqueName: \"kubernetes.io/projected/02d56e27-883f-492d-bb5a-ddf83ea2c78e-kube-api-access-8hftf\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.200007 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8653f5e9-f817-4962-a725-5acc5a161f29-kube-api-access-brv5s" (OuterVolumeSpecName: "kube-api-access-brv5s") pod "8653f5e9-f817-4962-a725-5acc5a161f29" (UID: "8653f5e9-f817-4962-a725-5acc5a161f29"). InnerVolumeSpecName "kube-api-access-brv5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.200316 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3afc5a61-8a6e-46a3-b593-7b26bcfa855e-kube-api-access-fpwwg" (OuterVolumeSpecName: "kube-api-access-fpwwg") pod "3afc5a61-8a6e-46a3-b593-7b26bcfa855e" (UID: "3afc5a61-8a6e-46a3-b593-7b26bcfa855e"). InnerVolumeSpecName "kube-api-access-fpwwg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.200550 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/255fbd82-365f-4670-81d4-c173abf6c67b-kube-api-access-2x4c2" (OuterVolumeSpecName: "kube-api-access-2x4c2") pod "255fbd82-365f-4670-81d4-c173abf6c67b" (UID: "255fbd82-365f-4670-81d4-c173abf6c67b"). InnerVolumeSpecName "kube-api-access-2x4c2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.295328 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpwwg\" (UniqueName: \"kubernetes.io/projected/3afc5a61-8a6e-46a3-b593-7b26bcfa855e-kube-api-access-fpwwg\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.295397 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brv5s\" (UniqueName: \"kubernetes.io/projected/8653f5e9-f817-4962-a725-5acc5a161f29-kube-api-access-brv5s\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.295411 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/255fbd82-365f-4670-81d4-c173abf6c67b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.295448 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2x4c2\" (UniqueName: \"kubernetes.io/projected/255fbd82-365f-4670-81d4-c173abf6c67b-kube-api-access-2x4c2\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.829071 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mzmcv" event={"ID":"255fbd82-365f-4670-81d4-c173abf6c67b","Type":"ContainerDied","Data":"e1cd58fe05b6922984515c48e8350c430647e2ab74b2913baeb107ffc75426c0"} Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.829097 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mzmcv" Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.829116 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1cd58fe05b6922984515c48e8350c430647e2ab74b2913baeb107ffc75426c0" Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.832076 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"adcf8345-41bb-495c-a006-573f6afe5af9","Type":"ContainerStarted","Data":"a6e2d8dec4c08d1535911ddca629f3c7f4899dbb79c8e1eda6c46bd318aa0ed2"} Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.832109 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"adcf8345-41bb-495c-a006-573f6afe5af9","Type":"ContainerStarted","Data":"d1e7b51fada68a9db631c6062df92f3c721a0355df8c772a5d7b1d51e30b607a"} Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.836914 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-s22jn" event={"ID":"3afc5a61-8a6e-46a3-b593-7b26bcfa855e","Type":"ContainerDied","Data":"14115cdcf7318dc5ddcd568c5b8be88f137cd41a4899a249b2c1831bcf37a3ac"} Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.836957 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14115cdcf7318dc5ddcd568c5b8be88f137cd41a4899a249b2c1831bcf37a3ac" Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.837047 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-s22jn" Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.848175 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-e73c-account-create-update-twcq7" Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.848737 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cf8363b9-18a6-48d7-993e-703ceccc7291","Type":"ContainerStarted","Data":"b60c0804d0b56b47cfb1bf0f283521aa6346766a6e8d5087480304c445ab63cb"} Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.848890 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-843c-account-create-update-kmlgq" Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.852845 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.04507307 podStartE2EDuration="3.852827758s" podCreationTimestamp="2025-12-02 23:15:26 +0000 UTC" firstStartedPulling="2025-12-02 23:15:27.047558371 +0000 UTC m=+1065.756112654" lastFinishedPulling="2025-12-02 23:15:28.855313039 +0000 UTC m=+1067.563867342" observedRunningTime="2025-12-02 23:15:29.852279584 +0000 UTC m=+1068.560833887" watchObservedRunningTime="2025-12-02 23:15:29.852827758 +0000 UTC m=+1068.561382041" Dec 02 23:15:29 crc kubenswrapper[4903]: I1202 23:15:29.883736 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=11.727770361 podStartE2EDuration="56.883716632s" podCreationTimestamp="2025-12-02 23:14:33 +0000 UTC" firstStartedPulling="2025-12-02 23:14:43.708556613 +0000 UTC m=+1022.417110896" lastFinishedPulling="2025-12-02 23:15:28.864502844 +0000 UTC m=+1067.573057167" observedRunningTime="2025-12-02 23:15:29.878144496 +0000 UTC m=+1068.586698789" watchObservedRunningTime="2025-12-02 23:15:29.883716632 +0000 UTC m=+1068.592270915" Dec 02 23:15:30 crc kubenswrapper[4903]: I1202 23:15:30.416265 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/55e0eb4b-69b7-4845-84aa-77dae4384f32-etc-swift\") pod \"swift-storage-0\" (UID: \"55e0eb4b-69b7-4845-84aa-77dae4384f32\") " pod="openstack/swift-storage-0" Dec 02 23:15:30 crc kubenswrapper[4903]: E1202 23:15:30.416560 4903 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 23:15:30 crc kubenswrapper[4903]: E1202 23:15:30.416614 4903 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 23:15:30 crc kubenswrapper[4903]: E1202 23:15:30.416749 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/55e0eb4b-69b7-4845-84aa-77dae4384f32-etc-swift podName:55e0eb4b-69b7-4845-84aa-77dae4384f32 nodeName:}" failed. No retries permitted until 2025-12-02 23:15:46.416716767 +0000 UTC m=+1085.125271090 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/55e0eb4b-69b7-4845-84aa-77dae4384f32-etc-swift") pod "swift-storage-0" (UID: "55e0eb4b-69b7-4845-84aa-77dae4384f32") : configmap "swift-ring-files" not found Dec 02 23:15:30 crc kubenswrapper[4903]: I1202 23:15:30.858955 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 02 23:15:32 crc kubenswrapper[4903]: I1202 23:15:32.608888 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-998648c74-2rx8r" podUID="fc491fc5-9e88-4e1d-9848-ea8846acd82b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.89:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.177396 4903 generic.go:334] "Generic (PLEG): container finished" podID="adbb82a2-c30f-4e59-be9c-9274739caf25" containerID="84be4fb0d0225a528b60ed67e1700242f46cd7db3e750ec446115b074ddcb25f" exitCode=0 Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.180393 4903 generic.go:334] "Generic (PLEG): container finished" podID="3afcfb6b-f7ce-424a-be67-3ef69a367fdb" containerID="904179dd08ee62104cd87da2ebd6abc096ad6bbc2f5839cc47a59381afe08e45" exitCode=0 Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.319189 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"adbb82a2-c30f-4e59-be9c-9274739caf25","Type":"ContainerDied","Data":"84be4fb0d0225a528b60ed67e1700242f46cd7db3e750ec446115b074ddcb25f"} Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.319243 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3afcfb6b-f7ce-424a-be67-3ef69a367fdb","Type":"ContainerDied","Data":"904179dd08ee62104cd87da2ebd6abc096ad6bbc2f5839cc47a59381afe08e45"} Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.336219 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-lkt78" podUID="d72fba58-af32-4b1a-a883-4e76ec6dc3f4" containerName="ovn-controller" probeResult="failure" output=< Dec 02 23:15:33 crc kubenswrapper[4903]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 02 23:15:33 crc kubenswrapper[4903]: > Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.338269 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-cs6mk" Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.353111 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-cs6mk" Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.741180 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-lkt78-config-67rjd"] Dec 02 23:15:33 crc kubenswrapper[4903]: E1202 23:15:33.741502 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3afc5a61-8a6e-46a3-b593-7b26bcfa855e" containerName="mariadb-database-create" Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.741513 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="3afc5a61-8a6e-46a3-b593-7b26bcfa855e" containerName="mariadb-database-create" Dec 02 23:15:33 crc kubenswrapper[4903]: E1202 23:15:33.741529 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8653f5e9-f817-4962-a725-5acc5a161f29" containerName="mariadb-account-create-update" 
Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.741537 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="8653f5e9-f817-4962-a725-5acc5a161f29" containerName="mariadb-account-create-update" Dec 02 23:15:33 crc kubenswrapper[4903]: E1202 23:15:33.741557 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="255fbd82-365f-4670-81d4-c173abf6c67b" containerName="mariadb-database-create" Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.741563 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="255fbd82-365f-4670-81d4-c173abf6c67b" containerName="mariadb-database-create" Dec 02 23:15:33 crc kubenswrapper[4903]: E1202 23:15:33.741571 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02d56e27-883f-492d-bb5a-ddf83ea2c78e" containerName="mariadb-account-create-update" Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.741577 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="02d56e27-883f-492d-bb5a-ddf83ea2c78e" containerName="mariadb-account-create-update" Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.741753 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="8653f5e9-f817-4962-a725-5acc5a161f29" containerName="mariadb-account-create-update" Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.741773 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="3afc5a61-8a6e-46a3-b593-7b26bcfa855e" containerName="mariadb-database-create" Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.741788 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="02d56e27-883f-492d-bb5a-ddf83ea2c78e" containerName="mariadb-account-create-update" Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.741799 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="255fbd82-365f-4670-81d4-c173abf6c67b" containerName="mariadb-database-create" Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.742313 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-lkt78-config-67rjd" Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.745260 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.753530 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lkt78-config-67rjd"] Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.763841 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-554c7689cc-ngxvh" Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.822634 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85cb4fb747-l8slh"] Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.822927 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85cb4fb747-l8slh" podUID="e1e1c0de-8117-4f51-8403-c0d56fcd4fa3" containerName="dnsmasq-dns" containerID="cri-o://00e1d76c64567fb8fc1c376750919e2235f0bbf1302278418138cdc1935355e5" gracePeriod=10 Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.857419 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/436c20dd-8382-412b-b817-68dbb4890994-additional-scripts\") pod \"ovn-controller-lkt78-config-67rjd\" (UID: \"436c20dd-8382-412b-b817-68dbb4890994\") " pod="openstack/ovn-controller-lkt78-config-67rjd" Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.857473 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/436c20dd-8382-412b-b817-68dbb4890994-var-run\") pod \"ovn-controller-lkt78-config-67rjd\" (UID: \"436c20dd-8382-412b-b817-68dbb4890994\") " pod="openstack/ovn-controller-lkt78-config-67rjd" Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.857499 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/436c20dd-8382-412b-b817-68dbb4890994-scripts\") pod \"ovn-controller-lkt78-config-67rjd\" (UID: \"436c20dd-8382-412b-b817-68dbb4890994\") " pod="openstack/ovn-controller-lkt78-config-67rjd" Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.857569 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/436c20dd-8382-412b-b817-68dbb4890994-var-run-ovn\") pod \"ovn-controller-lkt78-config-67rjd\" (UID: \"436c20dd-8382-412b-b817-68dbb4890994\") " pod="openstack/ovn-controller-lkt78-config-67rjd" Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.857692 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/436c20dd-8382-412b-b817-68dbb4890994-var-log-ovn\") pod \"ovn-controller-lkt78-config-67rjd\" (UID: \"436c20dd-8382-412b-b817-68dbb4890994\") " pod="openstack/ovn-controller-lkt78-config-67rjd" Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.857776 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mx64\" (UniqueName: \"kubernetes.io/projected/436c20dd-8382-412b-b817-68dbb4890994-kube-api-access-6mx64\") pod \"ovn-controller-lkt78-config-67rjd\" (UID: 
\"436c20dd-8382-412b-b817-68dbb4890994\") " pod="openstack/ovn-controller-lkt78-config-67rjd" Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.969878 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/436c20dd-8382-412b-b817-68dbb4890994-var-log-ovn\") pod \"ovn-controller-lkt78-config-67rjd\" (UID: \"436c20dd-8382-412b-b817-68dbb4890994\") " pod="openstack/ovn-controller-lkt78-config-67rjd" Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.970223 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mx64\" (UniqueName: \"kubernetes.io/projected/436c20dd-8382-412b-b817-68dbb4890994-kube-api-access-6mx64\") pod \"ovn-controller-lkt78-config-67rjd\" (UID: \"436c20dd-8382-412b-b817-68dbb4890994\") " pod="openstack/ovn-controller-lkt78-config-67rjd" Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.970231 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/436c20dd-8382-412b-b817-68dbb4890994-var-log-ovn\") pod \"ovn-controller-lkt78-config-67rjd\" (UID: \"436c20dd-8382-412b-b817-68dbb4890994\") " pod="openstack/ovn-controller-lkt78-config-67rjd" Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.970252 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/436c20dd-8382-412b-b817-68dbb4890994-additional-scripts\") pod \"ovn-controller-lkt78-config-67rjd\" (UID: \"436c20dd-8382-412b-b817-68dbb4890994\") " pod="openstack/ovn-controller-lkt78-config-67rjd" Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.970279 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/436c20dd-8382-412b-b817-68dbb4890994-var-run\") pod \"ovn-controller-lkt78-config-67rjd\" (UID: \"436c20dd-8382-412b-b817-68dbb4890994\") " pod="openstack/ovn-controller-lkt78-config-67rjd" Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.970305 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/436c20dd-8382-412b-b817-68dbb4890994-scripts\") pod \"ovn-controller-lkt78-config-67rjd\" (UID: \"436c20dd-8382-412b-b817-68dbb4890994\") " pod="openstack/ovn-controller-lkt78-config-67rjd" Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.970371 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/436c20dd-8382-412b-b817-68dbb4890994-var-run-ovn\") pod \"ovn-controller-lkt78-config-67rjd\" (UID: \"436c20dd-8382-412b-b817-68dbb4890994\") " pod="openstack/ovn-controller-lkt78-config-67rjd" Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.970525 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/436c20dd-8382-412b-b817-68dbb4890994-var-run-ovn\") pod \"ovn-controller-lkt78-config-67rjd\" (UID: \"436c20dd-8382-412b-b817-68dbb4890994\") " pod="openstack/ovn-controller-lkt78-config-67rjd" Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.970620 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/436c20dd-8382-412b-b817-68dbb4890994-var-run\") pod \"ovn-controller-lkt78-config-67rjd\" (UID: 
\"436c20dd-8382-412b-b817-68dbb4890994\") " pod="openstack/ovn-controller-lkt78-config-67rjd" Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.971310 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/436c20dd-8382-412b-b817-68dbb4890994-additional-scripts\") pod \"ovn-controller-lkt78-config-67rjd\" (UID: \"436c20dd-8382-412b-b817-68dbb4890994\") " pod="openstack/ovn-controller-lkt78-config-67rjd" Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.973702 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/436c20dd-8382-412b-b817-68dbb4890994-scripts\") pod \"ovn-controller-lkt78-config-67rjd\" (UID: \"436c20dd-8382-412b-b817-68dbb4890994\") " pod="openstack/ovn-controller-lkt78-config-67rjd" Dec 02 23:15:33 crc kubenswrapper[4903]: I1202 23:15:33.991245 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mx64\" (UniqueName: \"kubernetes.io/projected/436c20dd-8382-412b-b817-68dbb4890994-kube-api-access-6mx64\") pod \"ovn-controller-lkt78-config-67rjd\" (UID: \"436c20dd-8382-412b-b817-68dbb4890994\") " pod="openstack/ovn-controller-lkt78-config-67rjd" Dec 02 23:15:34 crc kubenswrapper[4903]: I1202 23:15:34.057168 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lkt78-config-67rjd" Dec 02 23:15:34 crc kubenswrapper[4903]: I1202 23:15:34.197292 4903 generic.go:334] "Generic (PLEG): container finished" podID="e1e1c0de-8117-4f51-8403-c0d56fcd4fa3" containerID="00e1d76c64567fb8fc1c376750919e2235f0bbf1302278418138cdc1935355e5" exitCode=0 Dec 02 23:15:34 crc kubenswrapper[4903]: I1202 23:15:34.197362 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85cb4fb747-l8slh" event={"ID":"e1e1c0de-8117-4f51-8403-c0d56fcd4fa3","Type":"ContainerDied","Data":"00e1d76c64567fb8fc1c376750919e2235f0bbf1302278418138cdc1935355e5"} Dec 02 23:15:34 crc kubenswrapper[4903]: I1202 23:15:34.198370 4903 generic.go:334] "Generic (PLEG): container finished" podID="1743f362-cc56-4c25-a31d-7a78f269f570" containerID="474d15e8fbcfcc157cc2499a2900124e9055f517115b690a5ff37221baaeeef1" exitCode=0 Dec 02 23:15:34 crc kubenswrapper[4903]: I1202 23:15:34.198424 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1743f362-cc56-4c25-a31d-7a78f269f570","Type":"ContainerDied","Data":"474d15e8fbcfcc157cc2499a2900124e9055f517115b690a5ff37221baaeeef1"} Dec 02 23:15:34 crc kubenswrapper[4903]: I1202 23:15:34.201457 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"adbb82a2-c30f-4e59-be9c-9274739caf25","Type":"ContainerStarted","Data":"26e33164ac38c932bf78ebb7472913872f77ba23de6d92272ba0ffe7b173ef97"} Dec 02 23:15:34 crc kubenswrapper[4903]: I1202 23:15:34.201709 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-notifications-server-0" Dec 02 23:15:34 crc kubenswrapper[4903]: I1202 23:15:34.207413 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3afcfb6b-f7ce-424a-be67-3ef69a367fdb","Type":"ContainerStarted","Data":"b52ebe9146c4378e8ec57e9c80e0d8b8f1dd2eab2895018f02feec819c9625be"} Dec 02 23:15:34 crc kubenswrapper[4903]: I1202 23:15:34.269328 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/rabbitmq-notifications-server-0" podStartSLOduration=56.482313613 podStartE2EDuration="1m8.269313118s" podCreationTimestamp="2025-12-02 23:14:26 +0000 UTC" firstStartedPulling="2025-12-02 23:14:42.981503775 +0000 UTC m=+1021.690058058" lastFinishedPulling="2025-12-02 23:14:54.76850328 +0000 UTC m=+1033.477057563" observedRunningTime="2025-12-02 23:15:34.265988437 +0000 UTC m=+1072.974542740" watchObservedRunningTime="2025-12-02 23:15:34.269313118 +0000 UTC m=+1072.977867401" Dec 02 23:15:34 crc kubenswrapper[4903]: I1202 23:15:34.294407 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=55.578470609 podStartE2EDuration="1m8.294360469s" podCreationTimestamp="2025-12-02 23:14:26 +0000 UTC" firstStartedPulling="2025-12-02 23:14:42.638783962 +0000 UTC m=+1021.347338255" lastFinishedPulling="2025-12-02 23:14:55.354673832 +0000 UTC m=+1034.063228115" observedRunningTime="2025-12-02 23:15:34.28619825 +0000 UTC m=+1072.994752543" watchObservedRunningTime="2025-12-02 23:15:34.294360469 +0000 UTC m=+1073.002914752" Dec 02 23:15:34 crc kubenswrapper[4903]: I1202 23:15:34.332340 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85cb4fb747-l8slh" Dec 02 23:15:34 crc kubenswrapper[4903]: I1202 23:15:34.477597 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrx7x\" (UniqueName: \"kubernetes.io/projected/e1e1c0de-8117-4f51-8403-c0d56fcd4fa3-kube-api-access-hrx7x\") pod \"e1e1c0de-8117-4f51-8403-c0d56fcd4fa3\" (UID: \"e1e1c0de-8117-4f51-8403-c0d56fcd4fa3\") " Dec 02 23:15:34 crc kubenswrapper[4903]: I1202 23:15:34.477748 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1e1c0de-8117-4f51-8403-c0d56fcd4fa3-dns-svc\") pod \"e1e1c0de-8117-4f51-8403-c0d56fcd4fa3\" (UID: \"e1e1c0de-8117-4f51-8403-c0d56fcd4fa3\") " Dec 02 23:15:34 crc kubenswrapper[4903]: I1202 23:15:34.477802 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1e1c0de-8117-4f51-8403-c0d56fcd4fa3-ovsdbserver-nb\") pod \"e1e1c0de-8117-4f51-8403-c0d56fcd4fa3\" (UID: \"e1e1c0de-8117-4f51-8403-c0d56fcd4fa3\") " Dec 02 23:15:34 crc kubenswrapper[4903]: I1202 23:15:34.477847 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1e1c0de-8117-4f51-8403-c0d56fcd4fa3-config\") pod \"e1e1c0de-8117-4f51-8403-c0d56fcd4fa3\" (UID: \"e1e1c0de-8117-4f51-8403-c0d56fcd4fa3\") " Dec 02 23:15:34 crc kubenswrapper[4903]: I1202 23:15:34.477872 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1e1c0de-8117-4f51-8403-c0d56fcd4fa3-ovsdbserver-sb\") pod \"e1e1c0de-8117-4f51-8403-c0d56fcd4fa3\" (UID: \"e1e1c0de-8117-4f51-8403-c0d56fcd4fa3\") " Dec 02 23:15:34 crc kubenswrapper[4903]: I1202 23:15:34.497481 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1e1c0de-8117-4f51-8403-c0d56fcd4fa3-kube-api-access-hrx7x" (OuterVolumeSpecName: "kube-api-access-hrx7x") pod "e1e1c0de-8117-4f51-8403-c0d56fcd4fa3" (UID: "e1e1c0de-8117-4f51-8403-c0d56fcd4fa3"). InnerVolumeSpecName "kube-api-access-hrx7x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:15:34 crc kubenswrapper[4903]: I1202 23:15:34.515498 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1e1c0de-8117-4f51-8403-c0d56fcd4fa3-config" (OuterVolumeSpecName: "config") pod "e1e1c0de-8117-4f51-8403-c0d56fcd4fa3" (UID: "e1e1c0de-8117-4f51-8403-c0d56fcd4fa3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:15:34 crc kubenswrapper[4903]: I1202 23:15:34.535863 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1e1c0de-8117-4f51-8403-c0d56fcd4fa3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e1e1c0de-8117-4f51-8403-c0d56fcd4fa3" (UID: "e1e1c0de-8117-4f51-8403-c0d56fcd4fa3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:15:34 crc kubenswrapper[4903]: I1202 23:15:34.542873 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1e1c0de-8117-4f51-8403-c0d56fcd4fa3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e1e1c0de-8117-4f51-8403-c0d56fcd4fa3" (UID: "e1e1c0de-8117-4f51-8403-c0d56fcd4fa3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:15:34 crc kubenswrapper[4903]: I1202 23:15:34.559859 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lkt78-config-67rjd"] Dec 02 23:15:34 crc kubenswrapper[4903]: W1202 23:15:34.562256 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod436c20dd_8382_412b_b817_68dbb4890994.slice/crio-3adc817f85644e47b10f50c5bad3a14c9e4973d557f52d602aab2b98124c4110 WatchSource:0}: Error finding container 3adc817f85644e47b10f50c5bad3a14c9e4973d557f52d602aab2b98124c4110: Status 404 returned error can't find the container with id 3adc817f85644e47b10f50c5bad3a14c9e4973d557f52d602aab2b98124c4110 Dec 02 23:15:34 crc kubenswrapper[4903]: I1202 23:15:34.569982 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1e1c0de-8117-4f51-8403-c0d56fcd4fa3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e1e1c0de-8117-4f51-8403-c0d56fcd4fa3" (UID: "e1e1c0de-8117-4f51-8403-c0d56fcd4fa3"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:15:34 crc kubenswrapper[4903]: I1202 23:15:34.579698 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1e1c0de-8117-4f51-8403-c0d56fcd4fa3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:34 crc kubenswrapper[4903]: I1202 23:15:34.579729 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrx7x\" (UniqueName: \"kubernetes.io/projected/e1e1c0de-8117-4f51-8403-c0d56fcd4fa3-kube-api-access-hrx7x\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:34 crc kubenswrapper[4903]: I1202 23:15:34.579743 4903 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1e1c0de-8117-4f51-8403-c0d56fcd4fa3-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:34 crc kubenswrapper[4903]: I1202 23:15:34.579755 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1e1c0de-8117-4f51-8403-c0d56fcd4fa3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:34 crc kubenswrapper[4903]: I1202 23:15:34.579766 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1e1c0de-8117-4f51-8403-c0d56fcd4fa3-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:34 crc kubenswrapper[4903]: I1202 23:15:34.847159 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:34 crc kubenswrapper[4903]: I1202 23:15:34.847457 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:34 crc kubenswrapper[4903]: I1202 23:15:34.849355 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:35 crc kubenswrapper[4903]: I1202 23:15:35.227341 4903 generic.go:334] "Generic (PLEG): container finished" podID="436c20dd-8382-412b-b817-68dbb4890994" containerID="c05fac882c1e44a20dc8568975b81f635728484946b4852a3f4c993c9f60acca" exitCode=0 Dec 02 23:15:35 crc kubenswrapper[4903]: I1202 23:15:35.227416 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lkt78-config-67rjd" event={"ID":"436c20dd-8382-412b-b817-68dbb4890994","Type":"ContainerDied","Data":"c05fac882c1e44a20dc8568975b81f635728484946b4852a3f4c993c9f60acca"} Dec 02 23:15:35 crc kubenswrapper[4903]: I1202 23:15:35.227668 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lkt78-config-67rjd" event={"ID":"436c20dd-8382-412b-b817-68dbb4890994","Type":"ContainerStarted","Data":"3adc817f85644e47b10f50c5bad3a14c9e4973d557f52d602aab2b98124c4110"} Dec 02 23:15:35 crc kubenswrapper[4903]: I1202 23:15:35.230575 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85cb4fb747-l8slh" event={"ID":"e1e1c0de-8117-4f51-8403-c0d56fcd4fa3","Type":"ContainerDied","Data":"47b9523324a329b9ff14bc4c5d4d8885b2f2e86c0748a7e354c43bb144b9582c"} Dec 02 23:15:35 crc kubenswrapper[4903]: I1202 23:15:35.230605 4903 scope.go:117] "RemoveContainer" containerID="00e1d76c64567fb8fc1c376750919e2235f0bbf1302278418138cdc1935355e5" Dec 02 23:15:35 crc kubenswrapper[4903]: I1202 23:15:35.230746 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85cb4fb747-l8slh" Dec 02 23:15:35 crc kubenswrapper[4903]: I1202 23:15:35.240012 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1743f362-cc56-4c25-a31d-7a78f269f570","Type":"ContainerStarted","Data":"9b91e61570f202fd2ee2c7c0c68382ef2a9304589bdb5be1b280ca88289d7b63"} Dec 02 23:15:35 crc kubenswrapper[4903]: I1202 23:15:35.240752 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 02 23:15:35 crc kubenswrapper[4903]: I1202 23:15:35.249406 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:35 crc kubenswrapper[4903]: I1202 23:15:35.254197 4903 scope.go:117] "RemoveContainer" containerID="9be57891e0d9105f6536799654cb3b4764356f1de9f8732ac0815651929436f6" Dec 02 23:15:35 crc kubenswrapper[4903]: I1202 23:15:35.286522 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=56.902238126 podStartE2EDuration="1m9.286499037s" podCreationTimestamp="2025-12-02 23:14:26 +0000 UTC" firstStartedPulling="2025-12-02 23:14:42.970438402 +0000 UTC m=+1021.678992685" lastFinishedPulling="2025-12-02 23:14:55.354699273 +0000 UTC m=+1034.063253596" observedRunningTime="2025-12-02 23:15:35.280250814 +0000 UTC m=+1073.988805107" watchObservedRunningTime="2025-12-02 23:15:35.286499037 +0000 UTC m=+1073.995053320" Dec 02 23:15:35 crc kubenswrapper[4903]: I1202 23:15:35.350428 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85cb4fb747-l8slh"] Dec 02 23:15:35 crc kubenswrapper[4903]: I1202 23:15:35.366759 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85cb4fb747-l8slh"] Dec 02 23:15:35 crc kubenswrapper[4903]: I1202 23:15:35.640494 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1e1c0de-8117-4f51-8403-c0d56fcd4fa3" path="/var/lib/kubelet/pods/e1e1c0de-8117-4f51-8403-c0d56fcd4fa3/volumes" Dec 02 23:15:36 crc kubenswrapper[4903]: I1202 23:15:36.249762 4903 generic.go:334] "Generic (PLEG): container finished" podID="f16a381a-80d3-4a60-be1b-e782dab1c73c" containerID="fa5d425ded9e1af5f34a7b25c7d8cd2258f1e824b0467a48c778700902386e02" exitCode=0 Dec 02 23:15:36 crc kubenswrapper[4903]: I1202 23:15:36.249854 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-gkjrc" event={"ID":"f16a381a-80d3-4a60-be1b-e782dab1c73c","Type":"ContainerDied","Data":"fa5d425ded9e1af5f34a7b25c7d8cd2258f1e824b0467a48c778700902386e02"} Dec 02 23:15:36 crc kubenswrapper[4903]: I1202 23:15:36.603972 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-lkt78-config-67rjd" Dec 02 23:15:36 crc kubenswrapper[4903]: I1202 23:15:36.719152 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/436c20dd-8382-412b-b817-68dbb4890994-var-run\") pod \"436c20dd-8382-412b-b817-68dbb4890994\" (UID: \"436c20dd-8382-412b-b817-68dbb4890994\") " Dec 02 23:15:36 crc kubenswrapper[4903]: I1202 23:15:36.719235 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/436c20dd-8382-412b-b817-68dbb4890994-var-log-ovn\") pod \"436c20dd-8382-412b-b817-68dbb4890994\" (UID: \"436c20dd-8382-412b-b817-68dbb4890994\") " Dec 02 23:15:36 crc kubenswrapper[4903]: I1202 23:15:36.719300 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/436c20dd-8382-412b-b817-68dbb4890994-scripts\") pod \"436c20dd-8382-412b-b817-68dbb4890994\" (UID: \"436c20dd-8382-412b-b817-68dbb4890994\") " Dec 02 23:15:36 crc kubenswrapper[4903]: I1202 23:15:36.719319 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/436c20dd-8382-412b-b817-68dbb4890994-var-run-ovn\") pod \"436c20dd-8382-412b-b817-68dbb4890994\" (UID: \"436c20dd-8382-412b-b817-68dbb4890994\") " Dec 02 23:15:36 crc kubenswrapper[4903]: I1202 23:15:36.719360 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mx64\" (UniqueName: \"kubernetes.io/projected/436c20dd-8382-412b-b817-68dbb4890994-kube-api-access-6mx64\") pod \"436c20dd-8382-412b-b817-68dbb4890994\" (UID: \"436c20dd-8382-412b-b817-68dbb4890994\") " Dec 02 23:15:36 crc kubenswrapper[4903]: I1202 23:15:36.719358 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/436c20dd-8382-412b-b817-68dbb4890994-var-run" (OuterVolumeSpecName: "var-run") pod "436c20dd-8382-412b-b817-68dbb4890994" (UID: "436c20dd-8382-412b-b817-68dbb4890994"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:15:36 crc kubenswrapper[4903]: I1202 23:15:36.719382 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/436c20dd-8382-412b-b817-68dbb4890994-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "436c20dd-8382-412b-b817-68dbb4890994" (UID: "436c20dd-8382-412b-b817-68dbb4890994"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:15:36 crc kubenswrapper[4903]: I1202 23:15:36.719383 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/436c20dd-8382-412b-b817-68dbb4890994-additional-scripts\") pod \"436c20dd-8382-412b-b817-68dbb4890994\" (UID: \"436c20dd-8382-412b-b817-68dbb4890994\") " Dec 02 23:15:36 crc kubenswrapper[4903]: I1202 23:15:36.719456 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/436c20dd-8382-412b-b817-68dbb4890994-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "436c20dd-8382-412b-b817-68dbb4890994" (UID: "436c20dd-8382-412b-b817-68dbb4890994"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:15:36 crc kubenswrapper[4903]: I1202 23:15:36.719981 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/436c20dd-8382-412b-b817-68dbb4890994-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "436c20dd-8382-412b-b817-68dbb4890994" (UID: "436c20dd-8382-412b-b817-68dbb4890994"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:15:36 crc kubenswrapper[4903]: I1202 23:15:36.720397 4903 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/436c20dd-8382-412b-b817-68dbb4890994-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:36 crc kubenswrapper[4903]: I1202 23:15:36.720444 4903 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/436c20dd-8382-412b-b817-68dbb4890994-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:36 crc kubenswrapper[4903]: I1202 23:15:36.720461 4903 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/436c20dd-8382-412b-b817-68dbb4890994-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:36 crc kubenswrapper[4903]: I1202 23:15:36.720481 4903 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/436c20dd-8382-412b-b817-68dbb4890994-var-run\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:36 crc kubenswrapper[4903]: I1202 23:15:36.720784 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/436c20dd-8382-412b-b817-68dbb4890994-scripts" (OuterVolumeSpecName: "scripts") pod "436c20dd-8382-412b-b817-68dbb4890994" (UID: "436c20dd-8382-412b-b817-68dbb4890994"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:15:36 crc kubenswrapper[4903]: I1202 23:15:36.740836 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/436c20dd-8382-412b-b817-68dbb4890994-kube-api-access-6mx64" (OuterVolumeSpecName: "kube-api-access-6mx64") pod "436c20dd-8382-412b-b817-68dbb4890994" (UID: "436c20dd-8382-412b-b817-68dbb4890994"). InnerVolumeSpecName "kube-api-access-6mx64". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:15:36 crc kubenswrapper[4903]: I1202 23:15:36.822377 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/436c20dd-8382-412b-b817-68dbb4890994-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:36 crc kubenswrapper[4903]: I1202 23:15:36.822407 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mx64\" (UniqueName: \"kubernetes.io/projected/436c20dd-8382-412b-b817-68dbb4890994-kube-api-access-6mx64\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.214603 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-lkt78" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.265426 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lkt78-config-67rjd" event={"ID":"436c20dd-8382-412b-b817-68dbb4890994","Type":"ContainerDied","Data":"3adc817f85644e47b10f50c5bad3a14c9e4973d557f52d602aab2b98124c4110"} Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.265473 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3adc817f85644e47b10f50c5bad3a14c9e4973d557f52d602aab2b98124c4110" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.265489 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lkt78-config-67rjd" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.575582 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-gkjrc" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.635878 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f16a381a-80d3-4a60-be1b-e782dab1c73c-dispersionconf\") pod \"f16a381a-80d3-4a60-be1b-e782dab1c73c\" (UID: \"f16a381a-80d3-4a60-be1b-e782dab1c73c\") " Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.635945 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f16a381a-80d3-4a60-be1b-e782dab1c73c-etc-swift\") pod \"f16a381a-80d3-4a60-be1b-e782dab1c73c\" (UID: \"f16a381a-80d3-4a60-be1b-e782dab1c73c\") " Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.636043 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f16a381a-80d3-4a60-be1b-e782dab1c73c-ring-data-devices\") pod \"f16a381a-80d3-4a60-be1b-e782dab1c73c\" (UID: \"f16a381a-80d3-4a60-be1b-e782dab1c73c\") " Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.636081 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f16a381a-80d3-4a60-be1b-e782dab1c73c-combined-ca-bundle\") pod \"f16a381a-80d3-4a60-be1b-e782dab1c73c\" (UID: \"f16a381a-80d3-4a60-be1b-e782dab1c73c\") " Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.636138 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qm8h6\" (UniqueName: \"kubernetes.io/projected/f16a381a-80d3-4a60-be1b-e782dab1c73c-kube-api-access-qm8h6\") pod \"f16a381a-80d3-4a60-be1b-e782dab1c73c\" (UID: \"f16a381a-80d3-4a60-be1b-e782dab1c73c\") " Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.636156 4903 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f16a381a-80d3-4a60-be1b-e782dab1c73c-swiftconf\") pod \"f16a381a-80d3-4a60-be1b-e782dab1c73c\" (UID: \"f16a381a-80d3-4a60-be1b-e782dab1c73c\") " Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.636242 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f16a381a-80d3-4a60-be1b-e782dab1c73c-scripts\") pod \"f16a381a-80d3-4a60-be1b-e782dab1c73c\" (UID: \"f16a381a-80d3-4a60-be1b-e782dab1c73c\") " Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.642168 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f16a381a-80d3-4a60-be1b-e782dab1c73c-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f16a381a-80d3-4a60-be1b-e782dab1c73c" (UID: "f16a381a-80d3-4a60-be1b-e782dab1c73c"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.642485 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f16a381a-80d3-4a60-be1b-e782dab1c73c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f16a381a-80d3-4a60-be1b-e782dab1c73c" (UID: "f16a381a-80d3-4a60-be1b-e782dab1c73c"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.643800 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f16a381a-80d3-4a60-be1b-e782dab1c73c-kube-api-access-qm8h6" (OuterVolumeSpecName: "kube-api-access-qm8h6") pod "f16a381a-80d3-4a60-be1b-e782dab1c73c" (UID: "f16a381a-80d3-4a60-be1b-e782dab1c73c"). InnerVolumeSpecName "kube-api-access-qm8h6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.656544 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f16a381a-80d3-4a60-be1b-e782dab1c73c-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f16a381a-80d3-4a60-be1b-e782dab1c73c" (UID: "f16a381a-80d3-4a60-be1b-e782dab1c73c"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.666822 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f16a381a-80d3-4a60-be1b-e782dab1c73c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f16a381a-80d3-4a60-be1b-e782dab1c73c" (UID: "f16a381a-80d3-4a60-be1b-e782dab1c73c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.700345 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f16a381a-80d3-4a60-be1b-e782dab1c73c-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f16a381a-80d3-4a60-be1b-e782dab1c73c" (UID: "f16a381a-80d3-4a60-be1b-e782dab1c73c"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.702370 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f16a381a-80d3-4a60-be1b-e782dab1c73c-scripts" (OuterVolumeSpecName: "scripts") pod "f16a381a-80d3-4a60-be1b-e782dab1c73c" (UID: "f16a381a-80d3-4a60-be1b-e782dab1c73c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.713073 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.724103 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-lkt78-config-67rjd"] Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.741406 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-lkt78-config-67rjd"] Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.743347 4903 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f16a381a-80d3-4a60-be1b-e782dab1c73c-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.743384 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f16a381a-80d3-4a60-be1b-e782dab1c73c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.743397 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qm8h6\" (UniqueName: \"kubernetes.io/projected/f16a381a-80d3-4a60-be1b-e782dab1c73c-kube-api-access-qm8h6\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.743411 4903 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f16a381a-80d3-4a60-be1b-e782dab1c73c-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.743422 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f16a381a-80d3-4a60-be1b-e782dab1c73c-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.743434 4903 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f16a381a-80d3-4a60-be1b-e782dab1c73c-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.743444 4903 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f16a381a-80d3-4a60-be1b-e782dab1c73c-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.749763 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.821189 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-lkt78-config-kwrpt"] Dec 02 23:15:37 crc kubenswrapper[4903]: E1202 23:15:37.821530 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f16a381a-80d3-4a60-be1b-e782dab1c73c" containerName="swift-ring-rebalance" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.821547 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="f16a381a-80d3-4a60-be1b-e782dab1c73c" 
containerName="swift-ring-rebalance" Dec 02 23:15:37 crc kubenswrapper[4903]: E1202 23:15:37.821564 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="436c20dd-8382-412b-b817-68dbb4890994" containerName="ovn-config" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.821570 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="436c20dd-8382-412b-b817-68dbb4890994" containerName="ovn-config" Dec 02 23:15:37 crc kubenswrapper[4903]: E1202 23:15:37.821589 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1e1c0de-8117-4f51-8403-c0d56fcd4fa3" containerName="init" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.821596 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1e1c0de-8117-4f51-8403-c0d56fcd4fa3" containerName="init" Dec 02 23:15:37 crc kubenswrapper[4903]: E1202 23:15:37.821608 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1e1c0de-8117-4f51-8403-c0d56fcd4fa3" containerName="dnsmasq-dns" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.821616 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1e1c0de-8117-4f51-8403-c0d56fcd4fa3" containerName="dnsmasq-dns" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.821770 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1e1c0de-8117-4f51-8403-c0d56fcd4fa3" containerName="dnsmasq-dns" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.821792 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="436c20dd-8382-412b-b817-68dbb4890994" containerName="ovn-config" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.821804 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="f16a381a-80d3-4a60-be1b-e782dab1c73c" containerName="swift-ring-rebalance" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.822303 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-lkt78-config-kwrpt" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.830942 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.862462 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lkt78-config-kwrpt"] Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.868960 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/137750c8-0fef-43a4-9a51-3798e2fa8b53-additional-scripts\") pod \"ovn-controller-lkt78-config-kwrpt\" (UID: \"137750c8-0fef-43a4-9a51-3798e2fa8b53\") " pod="openstack/ovn-controller-lkt78-config-kwrpt" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.869013 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv8w6\" (UniqueName: \"kubernetes.io/projected/137750c8-0fef-43a4-9a51-3798e2fa8b53-kube-api-access-jv8w6\") pod \"ovn-controller-lkt78-config-kwrpt\" (UID: \"137750c8-0fef-43a4-9a51-3798e2fa8b53\") " pod="openstack/ovn-controller-lkt78-config-kwrpt" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.869052 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/137750c8-0fef-43a4-9a51-3798e2fa8b53-var-run-ovn\") pod \"ovn-controller-lkt78-config-kwrpt\" (UID: \"137750c8-0fef-43a4-9a51-3798e2fa8b53\") " pod="openstack/ovn-controller-lkt78-config-kwrpt" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.869110 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/137750c8-0fef-43a4-9a51-3798e2fa8b53-scripts\") pod \"ovn-controller-lkt78-config-kwrpt\" (UID: \"137750c8-0fef-43a4-9a51-3798e2fa8b53\") " pod="openstack/ovn-controller-lkt78-config-kwrpt" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.869144 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/137750c8-0fef-43a4-9a51-3798e2fa8b53-var-log-ovn\") pod \"ovn-controller-lkt78-config-kwrpt\" (UID: \"137750c8-0fef-43a4-9a51-3798e2fa8b53\") " pod="openstack/ovn-controller-lkt78-config-kwrpt" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.869198 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/137750c8-0fef-43a4-9a51-3798e2fa8b53-var-run\") pod \"ovn-controller-lkt78-config-kwrpt\" (UID: \"137750c8-0fef-43a4-9a51-3798e2fa8b53\") " pod="openstack/ovn-controller-lkt78-config-kwrpt" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.970958 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/137750c8-0fef-43a4-9a51-3798e2fa8b53-scripts\") pod \"ovn-controller-lkt78-config-kwrpt\" (UID: \"137750c8-0fef-43a4-9a51-3798e2fa8b53\") " pod="openstack/ovn-controller-lkt78-config-kwrpt" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.971016 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/137750c8-0fef-43a4-9a51-3798e2fa8b53-var-log-ovn\") pod 
\"ovn-controller-lkt78-config-kwrpt\" (UID: \"137750c8-0fef-43a4-9a51-3798e2fa8b53\") " pod="openstack/ovn-controller-lkt78-config-kwrpt" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.971057 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/137750c8-0fef-43a4-9a51-3798e2fa8b53-var-run\") pod \"ovn-controller-lkt78-config-kwrpt\" (UID: \"137750c8-0fef-43a4-9a51-3798e2fa8b53\") " pod="openstack/ovn-controller-lkt78-config-kwrpt" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.971120 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/137750c8-0fef-43a4-9a51-3798e2fa8b53-additional-scripts\") pod \"ovn-controller-lkt78-config-kwrpt\" (UID: \"137750c8-0fef-43a4-9a51-3798e2fa8b53\") " pod="openstack/ovn-controller-lkt78-config-kwrpt" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.971158 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jv8w6\" (UniqueName: \"kubernetes.io/projected/137750c8-0fef-43a4-9a51-3798e2fa8b53-kube-api-access-jv8w6\") pod \"ovn-controller-lkt78-config-kwrpt\" (UID: \"137750c8-0fef-43a4-9a51-3798e2fa8b53\") " pod="openstack/ovn-controller-lkt78-config-kwrpt" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.971194 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/137750c8-0fef-43a4-9a51-3798e2fa8b53-var-run-ovn\") pod \"ovn-controller-lkt78-config-kwrpt\" (UID: \"137750c8-0fef-43a4-9a51-3798e2fa8b53\") " pod="openstack/ovn-controller-lkt78-config-kwrpt" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.971467 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/137750c8-0fef-43a4-9a51-3798e2fa8b53-var-run-ovn\") pod \"ovn-controller-lkt78-config-kwrpt\" (UID: \"137750c8-0fef-43a4-9a51-3798e2fa8b53\") " pod="openstack/ovn-controller-lkt78-config-kwrpt" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.971548 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/137750c8-0fef-43a4-9a51-3798e2fa8b53-var-log-ovn\") pod \"ovn-controller-lkt78-config-kwrpt\" (UID: \"137750c8-0fef-43a4-9a51-3798e2fa8b53\") " pod="openstack/ovn-controller-lkt78-config-kwrpt" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.971594 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/137750c8-0fef-43a4-9a51-3798e2fa8b53-var-run\") pod \"ovn-controller-lkt78-config-kwrpt\" (UID: \"137750c8-0fef-43a4-9a51-3798e2fa8b53\") " pod="openstack/ovn-controller-lkt78-config-kwrpt" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.972348 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/137750c8-0fef-43a4-9a51-3798e2fa8b53-additional-scripts\") pod \"ovn-controller-lkt78-config-kwrpt\" (UID: \"137750c8-0fef-43a4-9a51-3798e2fa8b53\") " pod="openstack/ovn-controller-lkt78-config-kwrpt" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.973013 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/137750c8-0fef-43a4-9a51-3798e2fa8b53-scripts\") pod 
\"ovn-controller-lkt78-config-kwrpt\" (UID: \"137750c8-0fef-43a4-9a51-3798e2fa8b53\") " pod="openstack/ovn-controller-lkt78-config-kwrpt" Dec 02 23:15:37 crc kubenswrapper[4903]: I1202 23:15:37.990882 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv8w6\" (UniqueName: \"kubernetes.io/projected/137750c8-0fef-43a4-9a51-3798e2fa8b53-kube-api-access-jv8w6\") pod \"ovn-controller-lkt78-config-kwrpt\" (UID: \"137750c8-0fef-43a4-9a51-3798e2fa8b53\") " pod="openstack/ovn-controller-lkt78-config-kwrpt" Dec 02 23:15:38 crc kubenswrapper[4903]: I1202 23:15:38.138104 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lkt78-config-kwrpt" Dec 02 23:15:38 crc kubenswrapper[4903]: I1202 23:15:38.283166 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="cf8363b9-18a6-48d7-993e-703ceccc7291" containerName="prometheus" containerID="cri-o://549f03176e1c9cd6246023d31eea53beaf5978f428985bb4496afa494a359b03" gracePeriod=600 Dec 02 23:15:38 crc kubenswrapper[4903]: I1202 23:15:38.283538 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-gkjrc" Dec 02 23:15:38 crc kubenswrapper[4903]: I1202 23:15:38.283834 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-gkjrc" event={"ID":"f16a381a-80d3-4a60-be1b-e782dab1c73c","Type":"ContainerDied","Data":"550f6559425bd2c4b558c9afde4b64c775f0b6f2da774377d44699972d7af21c"} Dec 02 23:15:38 crc kubenswrapper[4903]: I1202 23:15:38.283844 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="cf8363b9-18a6-48d7-993e-703ceccc7291" containerName="thanos-sidecar" containerID="cri-o://b60c0804d0b56b47cfb1bf0f283521aa6346766a6e8d5087480304c445ab63cb" gracePeriod=600 Dec 02 23:15:38 crc kubenswrapper[4903]: I1202 23:15:38.283908 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="550f6559425bd2c4b558c9afde4b64c775f0b6f2da774377d44699972d7af21c" Dec 02 23:15:38 crc kubenswrapper[4903]: I1202 23:15:38.283970 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="cf8363b9-18a6-48d7-993e-703ceccc7291" containerName="config-reloader" containerID="cri-o://37db9d225f60dc79e872e93e4627d2e3ef7e6cd467c048b8d56b9402afbdeac1" gracePeriod=600 Dec 02 23:15:38 crc kubenswrapper[4903]: I1202 23:15:38.597534 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lkt78-config-kwrpt"] Dec 02 23:15:38 crc kubenswrapper[4903]: W1202 23:15:38.608135 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod137750c8_0fef_43a4_9a51_3798e2fa8b53.slice/crio-c3470066ae4f4548f95a0249c687ad027bb2eb05738e0f02b5370c350da5f4ed WatchSource:0}: Error finding container c3470066ae4f4548f95a0249c687ad027bb2eb05738e0f02b5370c350da5f4ed: Status 404 returned error can't find the container with id c3470066ae4f4548f95a0249c687ad027bb2eb05738e0f02b5370c350da5f4ed Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.293797 4903 generic.go:334] "Generic (PLEG): container finished" podID="cf8363b9-18a6-48d7-993e-703ceccc7291" containerID="b60c0804d0b56b47cfb1bf0f283521aa6346766a6e8d5087480304c445ab63cb" exitCode=0 Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 
23:15:39.294107 4903 generic.go:334] "Generic (PLEG): container finished" podID="cf8363b9-18a6-48d7-993e-703ceccc7291" containerID="37db9d225f60dc79e872e93e4627d2e3ef7e6cd467c048b8d56b9402afbdeac1" exitCode=0 Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.294117 4903 generic.go:334] "Generic (PLEG): container finished" podID="cf8363b9-18a6-48d7-993e-703ceccc7291" containerID="549f03176e1c9cd6246023d31eea53beaf5978f428985bb4496afa494a359b03" exitCode=0 Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.293867 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cf8363b9-18a6-48d7-993e-703ceccc7291","Type":"ContainerDied","Data":"b60c0804d0b56b47cfb1bf0f283521aa6346766a6e8d5087480304c445ab63cb"} Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.294172 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cf8363b9-18a6-48d7-993e-703ceccc7291","Type":"ContainerDied","Data":"37db9d225f60dc79e872e93e4627d2e3ef7e6cd467c048b8d56b9402afbdeac1"} Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.294207 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cf8363b9-18a6-48d7-993e-703ceccc7291","Type":"ContainerDied","Data":"549f03176e1c9cd6246023d31eea53beaf5978f428985bb4496afa494a359b03"} Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.294228 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cf8363b9-18a6-48d7-993e-703ceccc7291","Type":"ContainerDied","Data":"45de5075c0cb1935854cf147cab7f1ec730b37d308b1da78c7c5cc58ee5a1d65"} Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.294246 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45de5075c0cb1935854cf147cab7f1ec730b37d308b1da78c7c5cc58ee5a1d65" Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.295953 4903 generic.go:334] "Generic (PLEG): container finished" podID="137750c8-0fef-43a4-9a51-3798e2fa8b53" containerID="505f1f8a030d5d6e67eda39f677de2f9d0b7674e20f4223fe6103b4dd6833d1b" exitCode=0 Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.296004 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lkt78-config-kwrpt" event={"ID":"137750c8-0fef-43a4-9a51-3798e2fa8b53","Type":"ContainerDied","Data":"505f1f8a030d5d6e67eda39f677de2f9d0b7674e20f4223fe6103b4dd6833d1b"} Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.296035 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lkt78-config-kwrpt" event={"ID":"137750c8-0fef-43a4-9a51-3798e2fa8b53","Type":"ContainerStarted","Data":"c3470066ae4f4548f95a0249c687ad027bb2eb05738e0f02b5370c350da5f4ed"} Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.321329 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.395293 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cf8363b9-18a6-48d7-993e-703ceccc7291-thanos-prometheus-http-client-file\") pod \"cf8363b9-18a6-48d7-993e-703ceccc7291\" (UID: \"cf8363b9-18a6-48d7-993e-703ceccc7291\") " Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.395517 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-791bde7a-5990-4917-baab-d6fca61a913e\") pod \"cf8363b9-18a6-48d7-993e-703ceccc7291\" (UID: \"cf8363b9-18a6-48d7-993e-703ceccc7291\") " Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.395551 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf8363b9-18a6-48d7-993e-703ceccc7291-config\") pod \"cf8363b9-18a6-48d7-993e-703ceccc7291\" (UID: \"cf8363b9-18a6-48d7-993e-703ceccc7291\") " Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.395721 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtgzp\" (UniqueName: \"kubernetes.io/projected/cf8363b9-18a6-48d7-993e-703ceccc7291-kube-api-access-vtgzp\") pod \"cf8363b9-18a6-48d7-993e-703ceccc7291\" (UID: \"cf8363b9-18a6-48d7-993e-703ceccc7291\") " Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.395760 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cf8363b9-18a6-48d7-993e-703ceccc7291-web-config\") pod \"cf8363b9-18a6-48d7-993e-703ceccc7291\" (UID: \"cf8363b9-18a6-48d7-993e-703ceccc7291\") " Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.395786 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cf8363b9-18a6-48d7-993e-703ceccc7291-config-out\") pod \"cf8363b9-18a6-48d7-993e-703ceccc7291\" (UID: \"cf8363b9-18a6-48d7-993e-703ceccc7291\") " Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.395829 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cf8363b9-18a6-48d7-993e-703ceccc7291-prometheus-metric-storage-rulefiles-0\") pod \"cf8363b9-18a6-48d7-993e-703ceccc7291\" (UID: \"cf8363b9-18a6-48d7-993e-703ceccc7291\") " Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.395866 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cf8363b9-18a6-48d7-993e-703ceccc7291-tls-assets\") pod \"cf8363b9-18a6-48d7-993e-703ceccc7291\" (UID: \"cf8363b9-18a6-48d7-993e-703ceccc7291\") " Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.405144 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf8363b9-18a6-48d7-993e-703ceccc7291-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "cf8363b9-18a6-48d7-993e-703ceccc7291" (UID: "cf8363b9-18a6-48d7-993e-703ceccc7291"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.405335 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf8363b9-18a6-48d7-993e-703ceccc7291-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "cf8363b9-18a6-48d7-993e-703ceccc7291" (UID: "cf8363b9-18a6-48d7-993e-703ceccc7291"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.416841 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf8363b9-18a6-48d7-993e-703ceccc7291-config" (OuterVolumeSpecName: "config") pod "cf8363b9-18a6-48d7-993e-703ceccc7291" (UID: "cf8363b9-18a6-48d7-993e-703ceccc7291"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.422854 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf8363b9-18a6-48d7-993e-703ceccc7291-config-out" (OuterVolumeSpecName: "config-out") pod "cf8363b9-18a6-48d7-993e-703ceccc7291" (UID: "cf8363b9-18a6-48d7-993e-703ceccc7291"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.423845 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf8363b9-18a6-48d7-993e-703ceccc7291-kube-api-access-vtgzp" (OuterVolumeSpecName: "kube-api-access-vtgzp") pod "cf8363b9-18a6-48d7-993e-703ceccc7291" (UID: "cf8363b9-18a6-48d7-993e-703ceccc7291"). InnerVolumeSpecName "kube-api-access-vtgzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.426846 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf8363b9-18a6-48d7-993e-703ceccc7291-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "cf8363b9-18a6-48d7-993e-703ceccc7291" (UID: "cf8363b9-18a6-48d7-993e-703ceccc7291"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.460895 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf8363b9-18a6-48d7-993e-703ceccc7291-web-config" (OuterVolumeSpecName: "web-config") pod "cf8363b9-18a6-48d7-993e-703ceccc7291" (UID: "cf8363b9-18a6-48d7-993e-703ceccc7291"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.499607 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtgzp\" (UniqueName: \"kubernetes.io/projected/cf8363b9-18a6-48d7-993e-703ceccc7291-kube-api-access-vtgzp\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.499636 4903 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cf8363b9-18a6-48d7-993e-703ceccc7291-web-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.499646 4903 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cf8363b9-18a6-48d7-993e-703ceccc7291-config-out\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.499666 4903 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cf8363b9-18a6-48d7-993e-703ceccc7291-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.499675 4903 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cf8363b9-18a6-48d7-993e-703ceccc7291-tls-assets\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.499686 4903 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cf8363b9-18a6-48d7-993e-703ceccc7291-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.499695 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf8363b9-18a6-48d7-993e-703ceccc7291-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.595203 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-791bde7a-5990-4917-baab-d6fca61a913e" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "cf8363b9-18a6-48d7-993e-703ceccc7291" (UID: "cf8363b9-18a6-48d7-993e-703ceccc7291"). InnerVolumeSpecName "pvc-791bde7a-5990-4917-baab-d6fca61a913e". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.601269 4903 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-791bde7a-5990-4917-baab-d6fca61a913e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-791bde7a-5990-4917-baab-d6fca61a913e\") on node \"crc\" " Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.619587 4903 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.619776 4903 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-791bde7a-5990-4917-baab-d6fca61a913e" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-791bde7a-5990-4917-baab-d6fca61a913e") on node "crc" Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.623348 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="436c20dd-8382-412b-b817-68dbb4890994" path="/var/lib/kubelet/pods/436c20dd-8382-412b-b817-68dbb4890994/volumes" Dec 02 23:15:39 crc kubenswrapper[4903]: I1202 23:15:39.702343 4903 reconciler_common.go:293] "Volume detached for volume \"pvc-791bde7a-5990-4917-baab-d6fca61a913e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-791bde7a-5990-4917-baab-d6fca61a913e\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.305830 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.331217 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.338391 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.363276 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 23:15:40 crc kubenswrapper[4903]: E1202 23:15:40.363731 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf8363b9-18a6-48d7-993e-703ceccc7291" containerName="init-config-reloader" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.363757 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf8363b9-18a6-48d7-993e-703ceccc7291" containerName="init-config-reloader" Dec 02 23:15:40 crc kubenswrapper[4903]: E1202 23:15:40.363771 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf8363b9-18a6-48d7-993e-703ceccc7291" containerName="prometheus" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.363782 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf8363b9-18a6-48d7-993e-703ceccc7291" containerName="prometheus" Dec 02 23:15:40 crc kubenswrapper[4903]: E1202 23:15:40.363813 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf8363b9-18a6-48d7-993e-703ceccc7291" containerName="thanos-sidecar" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.363822 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf8363b9-18a6-48d7-993e-703ceccc7291" containerName="thanos-sidecar" Dec 02 23:15:40 crc kubenswrapper[4903]: E1202 23:15:40.363833 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf8363b9-18a6-48d7-993e-703ceccc7291" containerName="config-reloader" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.363842 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf8363b9-18a6-48d7-993e-703ceccc7291" containerName="config-reloader" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.364038 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf8363b9-18a6-48d7-993e-703ceccc7291" containerName="thanos-sidecar" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.364052 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf8363b9-18a6-48d7-993e-703ceccc7291" containerName="config-reloader" Dec 02 23:15:40 crc 
kubenswrapper[4903]: I1202 23:15:40.364070 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf8363b9-18a6-48d7-993e-703ceccc7291" containerName="prometheus" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.365949 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.370378 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.372826 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.372862 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.372937 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.373038 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-5p4fs" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.380697 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.383963 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.393174 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.414575 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.414646 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-config\") pod \"prometheus-metric-storage-0\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.414730 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.414756 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:40 crc 
kubenswrapper[4903]: I1202 23:15:40.414848 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.414881 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.414921 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.414958 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-791bde7a-5990-4917-baab-d6fca61a913e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-791bde7a-5990-4917-baab-d6fca61a913e\") pod \"prometheus-metric-storage-0\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.414981 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf5qq\" (UniqueName: \"kubernetes.io/projected/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-kube-api-access-jf5qq\") pod \"prometheus-metric-storage-0\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.415022 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.415051 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.516326 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf5qq\" (UniqueName: \"kubernetes.io/projected/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-kube-api-access-jf5qq\") pod \"prometheus-metric-storage-0\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.516385 4903 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.516415 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.516462 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.516505 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-config\") pod \"prometheus-metric-storage-0\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.516587 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.516614 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.516712 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.516744 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.516774 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: 
\"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.516808 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-791bde7a-5990-4917-baab-d6fca61a913e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-791bde7a-5990-4917-baab-d6fca61a913e\") pod \"prometheus-metric-storage-0\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.523978 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.525476 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.526228 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.530001 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-config\") pod \"prometheus-metric-storage-0\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.530600 4903 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.530637 4903 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-791bde7a-5990-4917-baab-d6fca61a913e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-791bde7a-5990-4917-baab-d6fca61a913e\") pod \"prometheus-metric-storage-0\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/389fc60ed9b89584c09faa75d07c0667b0d3839786e48ead64fa3957a7dc98cb/globalmount\"" pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.532614 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.533165 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.537442 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.538506 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.539317 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.546772 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf5qq\" (UniqueName: \"kubernetes.io/projected/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-kube-api-access-jf5qq\") pod \"prometheus-metric-storage-0\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.589554 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-791bde7a-5990-4917-baab-d6fca61a913e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-791bde7a-5990-4917-baab-d6fca61a913e\") pod \"prometheus-metric-storage-0\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.685504 4903 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.704274 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lkt78-config-kwrpt" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.733993 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/137750c8-0fef-43a4-9a51-3798e2fa8b53-var-run-ovn\") pod \"137750c8-0fef-43a4-9a51-3798e2fa8b53\" (UID: \"137750c8-0fef-43a4-9a51-3798e2fa8b53\") " Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.734102 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/137750c8-0fef-43a4-9a51-3798e2fa8b53-var-run\") pod \"137750c8-0fef-43a4-9a51-3798e2fa8b53\" (UID: \"137750c8-0fef-43a4-9a51-3798e2fa8b53\") " Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.734164 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/137750c8-0fef-43a4-9a51-3798e2fa8b53-scripts\") pod \"137750c8-0fef-43a4-9a51-3798e2fa8b53\" (UID: \"137750c8-0fef-43a4-9a51-3798e2fa8b53\") " Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.734232 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jv8w6\" (UniqueName: \"kubernetes.io/projected/137750c8-0fef-43a4-9a51-3798e2fa8b53-kube-api-access-jv8w6\") pod \"137750c8-0fef-43a4-9a51-3798e2fa8b53\" (UID: \"137750c8-0fef-43a4-9a51-3798e2fa8b53\") " Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.734376 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/137750c8-0fef-43a4-9a51-3798e2fa8b53-additional-scripts\") pod \"137750c8-0fef-43a4-9a51-3798e2fa8b53\" (UID: \"137750c8-0fef-43a4-9a51-3798e2fa8b53\") " Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.734418 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/137750c8-0fef-43a4-9a51-3798e2fa8b53-var-log-ovn\") pod \"137750c8-0fef-43a4-9a51-3798e2fa8b53\" (UID: \"137750c8-0fef-43a4-9a51-3798e2fa8b53\") " Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.735489 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/137750c8-0fef-43a4-9a51-3798e2fa8b53-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "137750c8-0fef-43a4-9a51-3798e2fa8b53" (UID: "137750c8-0fef-43a4-9a51-3798e2fa8b53"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.738551 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/137750c8-0fef-43a4-9a51-3798e2fa8b53-scripts" (OuterVolumeSpecName: "scripts") pod "137750c8-0fef-43a4-9a51-3798e2fa8b53" (UID: "137750c8-0fef-43a4-9a51-3798e2fa8b53"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.738640 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/137750c8-0fef-43a4-9a51-3798e2fa8b53-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "137750c8-0fef-43a4-9a51-3798e2fa8b53" (UID: "137750c8-0fef-43a4-9a51-3798e2fa8b53"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.738687 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/137750c8-0fef-43a4-9a51-3798e2fa8b53-var-run" (OuterVolumeSpecName: "var-run") pod "137750c8-0fef-43a4-9a51-3798e2fa8b53" (UID: "137750c8-0fef-43a4-9a51-3798e2fa8b53"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.742091 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/137750c8-0fef-43a4-9a51-3798e2fa8b53-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "137750c8-0fef-43a4-9a51-3798e2fa8b53" (UID: "137750c8-0fef-43a4-9a51-3798e2fa8b53"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.746264 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/137750c8-0fef-43a4-9a51-3798e2fa8b53-kube-api-access-jv8w6" (OuterVolumeSpecName: "kube-api-access-jv8w6") pod "137750c8-0fef-43a4-9a51-3798e2fa8b53" (UID: "137750c8-0fef-43a4-9a51-3798e2fa8b53"). InnerVolumeSpecName "kube-api-access-jv8w6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.836311 4903 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/137750c8-0fef-43a4-9a51-3798e2fa8b53-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.836343 4903 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/137750c8-0fef-43a4-9a51-3798e2fa8b53-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.836353 4903 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/137750c8-0fef-43a4-9a51-3798e2fa8b53-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.836361 4903 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/137750c8-0fef-43a4-9a51-3798e2fa8b53-var-run\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.836369 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/137750c8-0fef-43a4-9a51-3798e2fa8b53-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:40 crc kubenswrapper[4903]: I1202 23:15:40.836378 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jv8w6\" (UniqueName: \"kubernetes.io/projected/137750c8-0fef-43a4-9a51-3798e2fa8b53-kube-api-access-jv8w6\") on node \"crc\" DevicePath \"\"" Dec 02 23:15:41 crc kubenswrapper[4903]: I1202 23:15:41.174385 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/prometheus-metric-storage-0"] Dec 02 23:15:41 crc kubenswrapper[4903]: W1202 23:15:41.185594 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cbfa4bd_d10d_4cd7_9208_fe8e1af2b545.slice/crio-5805e491e5e822df3dbaafd155c8ca1195f35a6c04a0983274b30fcf7ab033a0 WatchSource:0}: Error finding container 5805e491e5e822df3dbaafd155c8ca1195f35a6c04a0983274b30fcf7ab033a0: Status 404 returned error can't find the container with id 5805e491e5e822df3dbaafd155c8ca1195f35a6c04a0983274b30fcf7ab033a0 Dec 02 23:15:41 crc kubenswrapper[4903]: I1202 23:15:41.316779 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lkt78-config-kwrpt" event={"ID":"137750c8-0fef-43a4-9a51-3798e2fa8b53","Type":"ContainerDied","Data":"c3470066ae4f4548f95a0249c687ad027bb2eb05738e0f02b5370c350da5f4ed"} Dec 02 23:15:41 crc kubenswrapper[4903]: I1202 23:15:41.316832 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3470066ae4f4548f95a0249c687ad027bb2eb05738e0f02b5370c350da5f4ed" Dec 02 23:15:41 crc kubenswrapper[4903]: I1202 23:15:41.316797 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lkt78-config-kwrpt" Dec 02 23:15:41 crc kubenswrapper[4903]: I1202 23:15:41.318337 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545","Type":"ContainerStarted","Data":"5805e491e5e822df3dbaafd155c8ca1195f35a6c04a0983274b30fcf7ab033a0"} Dec 02 23:15:41 crc kubenswrapper[4903]: I1202 23:15:41.625516 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf8363b9-18a6-48d7-993e-703ceccc7291" path="/var/lib/kubelet/pods/cf8363b9-18a6-48d7-993e-703ceccc7291/volumes" Dec 02 23:15:41 crc kubenswrapper[4903]: I1202 23:15:41.657976 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 02 23:15:41 crc kubenswrapper[4903]: I1202 23:15:41.791508 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-lkt78-config-kwrpt"] Dec 02 23:15:41 crc kubenswrapper[4903]: I1202 23:15:41.796901 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-lkt78-config-kwrpt"] Dec 02 23:15:43 crc kubenswrapper[4903]: I1202 23:15:43.624139 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="137750c8-0fef-43a4-9a51-3798e2fa8b53" path="/var/lib/kubelet/pods/137750c8-0fef-43a4-9a51-3798e2fa8b53/volumes" Dec 02 23:15:44 crc kubenswrapper[4903]: I1202 23:15:44.351060 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545","Type":"ContainerStarted","Data":"b276be0eecc997bb7261d664fd7a40971a28064d93bb92a533e28680e43fb6ba"} Dec 02 23:15:46 crc kubenswrapper[4903]: I1202 23:15:46.455525 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/55e0eb4b-69b7-4845-84aa-77dae4384f32-etc-swift\") pod \"swift-storage-0\" (UID: \"55e0eb4b-69b7-4845-84aa-77dae4384f32\") " pod="openstack/swift-storage-0" Dec 02 23:15:46 crc kubenswrapper[4903]: I1202 23:15:46.465154 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/55e0eb4b-69b7-4845-84aa-77dae4384f32-etc-swift\") pod 
\"swift-storage-0\" (UID: \"55e0eb4b-69b7-4845-84aa-77dae4384f32\") " pod="openstack/swift-storage-0" Dec 02 23:15:46 crc kubenswrapper[4903]: I1202 23:15:46.674003 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 02 23:15:47 crc kubenswrapper[4903]: I1202 23:15:47.243502 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 02 23:15:47 crc kubenswrapper[4903]: W1202 23:15:47.253220 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55e0eb4b_69b7_4845_84aa_77dae4384f32.slice/crio-e53ebea675aa00d104bdfa835e2e56b65506b830da8e6c2ccd4d99f83a702d16 WatchSource:0}: Error finding container e53ebea675aa00d104bdfa835e2e56b65506b830da8e6c2ccd4d99f83a702d16: Status 404 returned error can't find the container with id e53ebea675aa00d104bdfa835e2e56b65506b830da8e6c2ccd4d99f83a702d16 Dec 02 23:15:47 crc kubenswrapper[4903]: I1202 23:15:47.374699 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"55e0eb4b-69b7-4845-84aa-77dae4384f32","Type":"ContainerStarted","Data":"e53ebea675aa00d104bdfa835e2e56b65506b830da8e6c2ccd4d99f83a702d16"} Dec 02 23:15:47 crc kubenswrapper[4903]: I1202 23:15:47.504585 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="1743f362-cc56-4c25-a31d-7a78f269f570" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.105:5671: connect: connection refused" Dec 02 23:15:47 crc kubenswrapper[4903]: I1202 23:15:47.714917 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="3afcfb6b-f7ce-424a-be67-3ef69a367fdb" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.106:5671: connect: connection refused" Dec 02 23:15:48 crc kubenswrapper[4903]: I1202 23:15:48.101191 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-notifications-server-0" podUID="adbb82a2-c30f-4e59-be9c-9274739caf25" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.107:5671: connect: connection refused" Dec 02 23:15:48 crc kubenswrapper[4903]: I1202 23:15:48.385391 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"55e0eb4b-69b7-4845-84aa-77dae4384f32","Type":"ContainerStarted","Data":"e0922b621b2ea03c4c00f8ae1da92784dd164189e6f90a949642416ad9e80cfe"} Dec 02 23:15:49 crc kubenswrapper[4903]: I1202 23:15:49.398741 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"55e0eb4b-69b7-4845-84aa-77dae4384f32","Type":"ContainerStarted","Data":"dba41ec41b27780409c998384b6903994070152a030607d3c93771d9aa2a7a0b"} Dec 02 23:15:49 crc kubenswrapper[4903]: I1202 23:15:49.399178 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"55e0eb4b-69b7-4845-84aa-77dae4384f32","Type":"ContainerStarted","Data":"3985360c26ecba4f5156f93a8818ad3e8fa3ca564c8d59b82be2097798496a51"} Dec 02 23:15:49 crc kubenswrapper[4903]: I1202 23:15:49.399199 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"55e0eb4b-69b7-4845-84aa-77dae4384f32","Type":"ContainerStarted","Data":"44776100d66696411e744510d5a91dc6fb86d0d1e7b955776c029c42b57fbe70"} Dec 02 23:15:50 crc kubenswrapper[4903]: I1202 23:15:50.418431 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"55e0eb4b-69b7-4845-84aa-77dae4384f32","Type":"ContainerStarted","Data":"b45f1e62fa3f2b2e040c482e582e17ca4bf6efdb574a99897a745abd85a54362"} Dec 02 23:15:50 crc kubenswrapper[4903]: I1202 23:15:50.418494 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"55e0eb4b-69b7-4845-84aa-77dae4384f32","Type":"ContainerStarted","Data":"499f80692d1b367877fbc799eac64341c414c004f765a9aae19b86b343bbd070"} Dec 02 23:15:50 crc kubenswrapper[4903]: I1202 23:15:50.418513 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"55e0eb4b-69b7-4845-84aa-77dae4384f32","Type":"ContainerStarted","Data":"66024a84a6be55ed0dc9bddce148e63b59ec7d3cfd7d3c6458b0070375590516"} Dec 02 23:15:51 crc kubenswrapper[4903]: I1202 23:15:51.433804 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"55e0eb4b-69b7-4845-84aa-77dae4384f32","Type":"ContainerStarted","Data":"d6ebaf9e35c772829709800d15914e29e99bea80440f318c6e5e421cb39e0014"} Dec 02 23:15:51 crc kubenswrapper[4903]: I1202 23:15:51.434165 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"55e0eb4b-69b7-4845-84aa-77dae4384f32","Type":"ContainerStarted","Data":"d3b3187aa80e1e460ee15aea6f1b0c9f8a5b604332678dbacf25c340bb3a055d"} Dec 02 23:15:51 crc kubenswrapper[4903]: I1202 23:15:51.434179 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"55e0eb4b-69b7-4845-84aa-77dae4384f32","Type":"ContainerStarted","Data":"98f9987ec4fff629d5722d9a8259807fe0b2cce825bd4d5b4b121941a231417b"} Dec 02 23:15:51 crc kubenswrapper[4903]: I1202 23:15:51.434190 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"55e0eb4b-69b7-4845-84aa-77dae4384f32","Type":"ContainerStarted","Data":"8cb4bee7a73b9170873546203d846aa3ac1f3968d41a288e243f832c1bfe194b"} Dec 02 23:15:52 crc kubenswrapper[4903]: I1202 23:15:52.449744 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"55e0eb4b-69b7-4845-84aa-77dae4384f32","Type":"ContainerStarted","Data":"901df30c7f6267b6f813a3b5e188dcd053b16f9cf3419bca3eaab9d82a8f1950"} Dec 02 23:15:52 crc kubenswrapper[4903]: I1202 23:15:52.449796 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"55e0eb4b-69b7-4845-84aa-77dae4384f32","Type":"ContainerStarted","Data":"551be643f62faac1c44862b9b21644ba8fdb1d4a65b440694be6d931c46fd41b"} Dec 02 23:15:52 crc kubenswrapper[4903]: I1202 23:15:52.449807 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"55e0eb4b-69b7-4845-84aa-77dae4384f32","Type":"ContainerStarted","Data":"c79cf3954a446a5cbaa3f9cf858db75d7b84ed6c573aabf4a1dcedc2a0f625a1"} Dec 02 23:15:52 crc kubenswrapper[4903]: I1202 23:15:52.449818 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"55e0eb4b-69b7-4845-84aa-77dae4384f32","Type":"ContainerStarted","Data":"739e21ca4b733916b6c4919324c15db6dd0340c0d8084911b3db01b31ca316fc"} Dec 02 23:15:52 crc kubenswrapper[4903]: I1202 23:15:52.451800 4903 generic.go:334] "Generic (PLEG): container finished" podID="2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545" containerID="b276be0eecc997bb7261d664fd7a40971a28064d93bb92a533e28680e43fb6ba" exitCode=0 Dec 02 23:15:52 crc kubenswrapper[4903]: I1202 23:15:52.451855 4903 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545","Type":"ContainerDied","Data":"b276be0eecc997bb7261d664fd7a40971a28064d93bb92a533e28680e43fb6ba"} Dec 02 23:15:52 crc kubenswrapper[4903]: I1202 23:15:52.493018 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=35.87724449 podStartE2EDuration="39.492996393s" podCreationTimestamp="2025-12-02 23:15:13 +0000 UTC" firstStartedPulling="2025-12-02 23:15:47.257315335 +0000 UTC m=+1085.965869638" lastFinishedPulling="2025-12-02 23:15:50.873067258 +0000 UTC m=+1089.581621541" observedRunningTime="2025-12-02 23:15:52.491466796 +0000 UTC m=+1091.200021119" watchObservedRunningTime="2025-12-02 23:15:52.492996393 +0000 UTC m=+1091.201550686"
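
The pod_startup_latency_tracker record above checks out arithmetically: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp (23:15:52.492996393 - 23:15:13 = 39.492996393s), and podStartSLOduration is that figure minus the image-pull window taken from the monotonic (m=+...) readings, 1089.581621541 - 1085.965869638 = 3.615751903s, so 39.492996393 - 3.615751903 = 35.877244490s, reproducing podStartSLOduration=35.87724449. A tiny Go sketch of the same arithmetic, with the constants copied from the entry:

    package main

    import "fmt"

    func main() {
        const (
            e2e          = 39.492996393   // watchObservedRunningTime - podCreationTimestamp (s)
            firstPullMon = 1085.965869638 // firstStartedPulling, monotonic m=+ reading
            lastPullMon  = 1089.581621541 // lastFinishedPulling, monotonic m=+ reading
        )
        pull := lastPullMon - firstPullMon
        fmt.Printf("image pulls: %.9fs\n", pull)     // ~3.615751903s
        fmt.Printf("SLO:         %.9fs\n", e2e-pull) // ~35.877244490s = podStartSLOduration
    }

Pods whose pull timestamps are the zero time ("0001-01-01 00:00:00 +0000 UTC"), like dnsmasq-dns-7d7cc4fbd9-qf9dk and prometheus-metric-storage-0 further down, had no image pulls to discount, so their podStartSLOduration equals podStartE2EDuration.
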
\"kubernetes.io/configmap/8f7e4993-0dd2-413f-bd55-631ee71946c4-dns-svc\") pod \"dnsmasq-dns-7d7cc4fbd9-qf9dk\" (UID: \"8f7e4993-0dd2-413f-bd55-631ee71946c4\") " pod="openstack/dnsmasq-dns-7d7cc4fbd9-qf9dk" Dec 02 23:15:52 crc kubenswrapper[4903]: I1202 23:15:52.968063 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8f7e4993-0dd2-413f-bd55-631ee71946c4-ovsdbserver-nb\") pod \"dnsmasq-dns-7d7cc4fbd9-qf9dk\" (UID: \"8f7e4993-0dd2-413f-bd55-631ee71946c4\") " pod="openstack/dnsmasq-dns-7d7cc4fbd9-qf9dk" Dec 02 23:15:52 crc kubenswrapper[4903]: I1202 23:15:52.968090 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwndx\" (UniqueName: \"kubernetes.io/projected/8f7e4993-0dd2-413f-bd55-631ee71946c4-kube-api-access-mwndx\") pod \"dnsmasq-dns-7d7cc4fbd9-qf9dk\" (UID: \"8f7e4993-0dd2-413f-bd55-631ee71946c4\") " pod="openstack/dnsmasq-dns-7d7cc4fbd9-qf9dk" Dec 02 23:15:53 crc kubenswrapper[4903]: I1202 23:15:53.069555 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8f7e4993-0dd2-413f-bd55-631ee71946c4-ovsdbserver-sb\") pod \"dnsmasq-dns-7d7cc4fbd9-qf9dk\" (UID: \"8f7e4993-0dd2-413f-bd55-631ee71946c4\") " pod="openstack/dnsmasq-dns-7d7cc4fbd9-qf9dk" Dec 02 23:15:53 crc kubenswrapper[4903]: I1202 23:15:53.069762 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f7e4993-0dd2-413f-bd55-631ee71946c4-dns-svc\") pod \"dnsmasq-dns-7d7cc4fbd9-qf9dk\" (UID: \"8f7e4993-0dd2-413f-bd55-631ee71946c4\") " pod="openstack/dnsmasq-dns-7d7cc4fbd9-qf9dk" Dec 02 23:15:53 crc kubenswrapper[4903]: I1202 23:15:53.069800 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8f7e4993-0dd2-413f-bd55-631ee71946c4-ovsdbserver-nb\") pod \"dnsmasq-dns-7d7cc4fbd9-qf9dk\" (UID: \"8f7e4993-0dd2-413f-bd55-631ee71946c4\") " pod="openstack/dnsmasq-dns-7d7cc4fbd9-qf9dk" Dec 02 23:15:53 crc kubenswrapper[4903]: I1202 23:15:53.069832 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwndx\" (UniqueName: \"kubernetes.io/projected/8f7e4993-0dd2-413f-bd55-631ee71946c4-kube-api-access-mwndx\") pod \"dnsmasq-dns-7d7cc4fbd9-qf9dk\" (UID: \"8f7e4993-0dd2-413f-bd55-631ee71946c4\") " pod="openstack/dnsmasq-dns-7d7cc4fbd9-qf9dk" Dec 02 23:15:53 crc kubenswrapper[4903]: I1202 23:15:53.069893 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8f7e4993-0dd2-413f-bd55-631ee71946c4-dns-swift-storage-0\") pod \"dnsmasq-dns-7d7cc4fbd9-qf9dk\" (UID: \"8f7e4993-0dd2-413f-bd55-631ee71946c4\") " pod="openstack/dnsmasq-dns-7d7cc4fbd9-qf9dk" Dec 02 23:15:53 crc kubenswrapper[4903]: I1202 23:15:53.070220 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:15:53 crc kubenswrapper[4903]: I1202 23:15:53.070306 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8f7e4993-0dd2-413f-bd55-631ee71946c4-config\") pod \"dnsmasq-dns-7d7cc4fbd9-qf9dk\" (UID: \"8f7e4993-0dd2-413f-bd55-631ee71946c4\") " pod="openstack/dnsmasq-dns-7d7cc4fbd9-qf9dk" Dec 02 23:15:53 crc kubenswrapper[4903]: I1202 23:15:53.070791 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8f7e4993-0dd2-413f-bd55-631ee71946c4-ovsdbserver-nb\") pod \"dnsmasq-dns-7d7cc4fbd9-qf9dk\" (UID: \"8f7e4993-0dd2-413f-bd55-631ee71946c4\") " pod="openstack/dnsmasq-dns-7d7cc4fbd9-qf9dk" Dec 02 23:15:53 crc kubenswrapper[4903]: I1202 23:15:53.070881 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f7e4993-0dd2-413f-bd55-631ee71946c4-dns-svc\") pod \"dnsmasq-dns-7d7cc4fbd9-qf9dk\" (UID: \"8f7e4993-0dd2-413f-bd55-631ee71946c4\") " pod="openstack/dnsmasq-dns-7d7cc4fbd9-qf9dk" Dec 02 23:15:53 crc kubenswrapper[4903]: I1202 23:15:53.070989 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8f7e4993-0dd2-413f-bd55-631ee71946c4-dns-swift-storage-0\") pod \"dnsmasq-dns-7d7cc4fbd9-qf9dk\" (UID: \"8f7e4993-0dd2-413f-bd55-631ee71946c4\") " pod="openstack/dnsmasq-dns-7d7cc4fbd9-qf9dk" Dec 02 23:15:53 crc kubenswrapper[4903]: I1202 23:15:53.070391 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:15:53 crc kubenswrapper[4903]: I1202 23:15:53.071325 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f7e4993-0dd2-413f-bd55-631ee71946c4-config\") pod \"dnsmasq-dns-7d7cc4fbd9-qf9dk\" (UID: \"8f7e4993-0dd2-413f-bd55-631ee71946c4\") " pod="openstack/dnsmasq-dns-7d7cc4fbd9-qf9dk" Dec 02 23:15:53 crc kubenswrapper[4903]: I1202 23:15:53.071619 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8f7e4993-0dd2-413f-bd55-631ee71946c4-ovsdbserver-sb\") pod \"dnsmasq-dns-7d7cc4fbd9-qf9dk\" (UID: \"8f7e4993-0dd2-413f-bd55-631ee71946c4\") " pod="openstack/dnsmasq-dns-7d7cc4fbd9-qf9dk" Dec 02 23:15:53 crc kubenswrapper[4903]: I1202 23:15:53.097290 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwndx\" (UniqueName: \"kubernetes.io/projected/8f7e4993-0dd2-413f-bd55-631ee71946c4-kube-api-access-mwndx\") pod \"dnsmasq-dns-7d7cc4fbd9-qf9dk\" (UID: \"8f7e4993-0dd2-413f-bd55-631ee71946c4\") " pod="openstack/dnsmasq-dns-7d7cc4fbd9-qf9dk" Dec 02 23:15:53 crc kubenswrapper[4903]: I1202 23:15:53.138123 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d7cc4fbd9-qf9dk" Dec 02 23:15:53 crc kubenswrapper[4903]: I1202 23:15:53.461930 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545","Type":"ContainerStarted","Data":"e5762bb66e03348701d2a45ed223e84ee7c6590aa82a31ad97eebe991e8fd2fe"} Dec 02 23:15:53 crc kubenswrapper[4903]: I1202 23:15:53.589785 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d7cc4fbd9-qf9dk"] Dec 02 23:15:53 crc kubenswrapper[4903]: W1202 23:15:53.595340 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f7e4993_0dd2_413f_bd55_631ee71946c4.slice/crio-c799b5ccd4ac2748e403f7dc781e36656dce5f6d86361e434d5d1c58910683d3 WatchSource:0}: Error finding container c799b5ccd4ac2748e403f7dc781e36656dce5f6d86361e434d5d1c58910683d3: Status 404 returned error can't find the container with id c799b5ccd4ac2748e403f7dc781e36656dce5f6d86361e434d5d1c58910683d3 Dec 02 23:15:54 crc kubenswrapper[4903]: I1202 23:15:54.475325 4903 generic.go:334] "Generic (PLEG): container finished" podID="8f7e4993-0dd2-413f-bd55-631ee71946c4" containerID="bb45ac70d3a4787c1f5ead0fa22243cc6dde0eb12bd8b6194ca868905447a0bd" exitCode=0 Dec 02 23:15:54 crc kubenswrapper[4903]: I1202 23:15:54.475459 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d7cc4fbd9-qf9dk" event={"ID":"8f7e4993-0dd2-413f-bd55-631ee71946c4","Type":"ContainerDied","Data":"bb45ac70d3a4787c1f5ead0fa22243cc6dde0eb12bd8b6194ca868905447a0bd"} Dec 02 23:15:54 crc kubenswrapper[4903]: I1202 23:15:54.475699 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d7cc4fbd9-qf9dk" event={"ID":"8f7e4993-0dd2-413f-bd55-631ee71946c4","Type":"ContainerStarted","Data":"c799b5ccd4ac2748e403f7dc781e36656dce5f6d86361e434d5d1c58910683d3"} Dec 02 23:15:55 crc kubenswrapper[4903]: I1202 23:15:55.487353 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d7cc4fbd9-qf9dk" event={"ID":"8f7e4993-0dd2-413f-bd55-631ee71946c4","Type":"ContainerStarted","Data":"5415f53c54ee851d0c151c81707ee43d45c23b497ae231fcc5eb477aa9a61b99"} Dec 02 23:15:55 crc kubenswrapper[4903]: I1202 23:15:55.488463 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d7cc4fbd9-qf9dk" Dec 02 23:15:55 crc kubenswrapper[4903]: I1202 23:15:55.516892 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d7cc4fbd9-qf9dk" podStartSLOduration=3.516866903 podStartE2EDuration="3.516866903s" podCreationTimestamp="2025-12-02 23:15:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:15:55.51265006 +0000 UTC m=+1094.221204383" watchObservedRunningTime="2025-12-02 23:15:55.516866903 +0000 UTC m=+1094.225421226" Dec 02 23:15:56 crc kubenswrapper[4903]: I1202 23:15:56.499343 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545","Type":"ContainerStarted","Data":"39be476150955e285e007798c55b0bbe2128a59297efd1e8a254d8e30ceb94ef"} Dec 02 23:15:56 crc kubenswrapper[4903]: I1202 23:15:56.499660 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545","Type":"ContainerStarted","Data":"8358bd55dd83527eb8684510873e5c25372509a4b34c6580e1ae13a32f83d446"} Dec 02 23:15:56 crc kubenswrapper[4903]: I1202 23:15:56.533677 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=16.533637603 podStartE2EDuration="16.533637603s" podCreationTimestamp="2025-12-02 23:15:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:15:56.527729158 +0000 UTC m=+1095.236283441" watchObservedRunningTime="2025-12-02 23:15:56.533637603 +0000 UTC m=+1095.242191886" Dec 02 23:15:57 crc kubenswrapper[4903]: I1202 23:15:57.503926 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 02 23:15:57 crc kubenswrapper[4903]: I1202 23:15:57.714910 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:15:57 crc kubenswrapper[4903]: I1202 23:15:57.913869 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-5jpq8"] Dec 02 23:15:57 crc kubenswrapper[4903]: I1202 23:15:57.914949 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-5jpq8" Dec 02 23:15:57 crc kubenswrapper[4903]: I1202 23:15:57.931356 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-5jpq8"] Dec 02 23:15:57 crc kubenswrapper[4903]: I1202 23:15:57.956566 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-c12e-account-create-update-jtpzh"] Dec 02 23:15:57 crc kubenswrapper[4903]: I1202 23:15:57.957867 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c12e-account-create-update-jtpzh" Dec 02 23:15:57 crc kubenswrapper[4903]: I1202 23:15:57.959836 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 02 23:15:57 crc kubenswrapper[4903]: I1202 23:15:57.965558 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-c12e-account-create-update-jtpzh"] Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.012850 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j87j\" (UniqueName: \"kubernetes.io/projected/d9bc1f3f-1d84-4d5f-9e8f-bfec26bcd965-kube-api-access-6j87j\") pod \"barbican-db-create-5jpq8\" (UID: \"d9bc1f3f-1d84-4d5f-9e8f-bfec26bcd965\") " pod="openstack/barbican-db-create-5jpq8" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.012997 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9bc1f3f-1d84-4d5f-9e8f-bfec26bcd965-operator-scripts\") pod \"barbican-db-create-5jpq8\" (UID: \"d9bc1f3f-1d84-4d5f-9e8f-bfec26bcd965\") " pod="openstack/barbican-db-create-5jpq8" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.103883 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-notifications-server-0" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.131626 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-gxh6v"] Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.133386 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-gxh6v" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.150158 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b051f79-dd17-4446-8316-8de5216d958f-operator-scripts\") pod \"barbican-c12e-account-create-update-jtpzh\" (UID: \"1b051f79-dd17-4446-8316-8de5216d958f\") " pod="openstack/barbican-c12e-account-create-update-jtpzh" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.150317 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9bc1f3f-1d84-4d5f-9e8f-bfec26bcd965-operator-scripts\") pod \"barbican-db-create-5jpq8\" (UID: \"d9bc1f3f-1d84-4d5f-9e8f-bfec26bcd965\") " pod="openstack/barbican-db-create-5jpq8" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.150353 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd2f4\" (UniqueName: \"kubernetes.io/projected/1b051f79-dd17-4446-8316-8de5216d958f-kube-api-access-qd2f4\") pod \"barbican-c12e-account-create-update-jtpzh\" (UID: \"1b051f79-dd17-4446-8316-8de5216d958f\") " pod="openstack/barbican-c12e-account-create-update-jtpzh" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.150439 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk6ks\" (UniqueName: \"kubernetes.io/projected/769c6602-40ba-4f02-8f65-47ea4be08be4-kube-api-access-qk6ks\") pod \"cinder-db-create-gxh6v\" (UID: \"769c6602-40ba-4f02-8f65-47ea4be08be4\") " pod="openstack/cinder-db-create-gxh6v" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.150538 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j87j\" (UniqueName: \"kubernetes.io/projected/d9bc1f3f-1d84-4d5f-9e8f-bfec26bcd965-kube-api-access-6j87j\") pod \"barbican-db-create-5jpq8\" (UID: \"d9bc1f3f-1d84-4d5f-9e8f-bfec26bcd965\") " pod="openstack/barbican-db-create-5jpq8" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.150571 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/769c6602-40ba-4f02-8f65-47ea4be08be4-operator-scripts\") pod \"cinder-db-create-gxh6v\" (UID: \"769c6602-40ba-4f02-8f65-47ea4be08be4\") " pod="openstack/cinder-db-create-gxh6v" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.155033 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9bc1f3f-1d84-4d5f-9e8f-bfec26bcd965-operator-scripts\") pod \"barbican-db-create-5jpq8\" (UID: \"d9bc1f3f-1d84-4d5f-9e8f-bfec26bcd965\") " pod="openstack/barbican-db-create-5jpq8" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.155756 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-gxh6v"] Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.190162 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j87j\" (UniqueName: \"kubernetes.io/projected/d9bc1f3f-1d84-4d5f-9e8f-bfec26bcd965-kube-api-access-6j87j\") pod \"barbican-db-create-5jpq8\" (UID: \"d9bc1f3f-1d84-4d5f-9e8f-bfec26bcd965\") " pod="openstack/barbican-db-create-5jpq8" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.218271 4903 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/cinder-9c87-account-create-update-n69dm"] Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.219789 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9c87-account-create-update-n69dm" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.226172 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.229251 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-5jpq8" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.251858 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk6ks\" (UniqueName: \"kubernetes.io/projected/769c6602-40ba-4f02-8f65-47ea4be08be4-kube-api-access-qk6ks\") pod \"cinder-db-create-gxh6v\" (UID: \"769c6602-40ba-4f02-8f65-47ea4be08be4\") " pod="openstack/cinder-db-create-gxh6v" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.252158 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/769c6602-40ba-4f02-8f65-47ea4be08be4-operator-scripts\") pod \"cinder-db-create-gxh6v\" (UID: \"769c6602-40ba-4f02-8f65-47ea4be08be4\") " pod="openstack/cinder-db-create-gxh6v" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.252293 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b051f79-dd17-4446-8316-8de5216d958f-operator-scripts\") pod \"barbican-c12e-account-create-update-jtpzh\" (UID: \"1b051f79-dd17-4446-8316-8de5216d958f\") " pod="openstack/barbican-c12e-account-create-update-jtpzh" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.252427 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qd2f4\" (UniqueName: \"kubernetes.io/projected/1b051f79-dd17-4446-8316-8de5216d958f-kube-api-access-qd2f4\") pod \"barbican-c12e-account-create-update-jtpzh\" (UID: \"1b051f79-dd17-4446-8316-8de5216d958f\") " pod="openstack/barbican-c12e-account-create-update-jtpzh" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.253398 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b051f79-dd17-4446-8316-8de5216d958f-operator-scripts\") pod \"barbican-c12e-account-create-update-jtpzh\" (UID: \"1b051f79-dd17-4446-8316-8de5216d958f\") " pod="openstack/barbican-c12e-account-create-update-jtpzh" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.253629 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/769c6602-40ba-4f02-8f65-47ea4be08be4-operator-scripts\") pod \"cinder-db-create-gxh6v\" (UID: \"769c6602-40ba-4f02-8f65-47ea4be08be4\") " pod="openstack/cinder-db-create-gxh6v" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.269095 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-9c87-account-create-update-n69dm"] Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.269987 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk6ks\" (UniqueName: \"kubernetes.io/projected/769c6602-40ba-4f02-8f65-47ea4be08be4-kube-api-access-qk6ks\") pod \"cinder-db-create-gxh6v\" (UID: \"769c6602-40ba-4f02-8f65-47ea4be08be4\") " 
pod="openstack/cinder-db-create-gxh6v" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.274979 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd2f4\" (UniqueName: \"kubernetes.io/projected/1b051f79-dd17-4446-8316-8de5216d958f-kube-api-access-qd2f4\") pod \"barbican-c12e-account-create-update-jtpzh\" (UID: \"1b051f79-dd17-4446-8316-8de5216d958f\") " pod="openstack/barbican-c12e-account-create-update-jtpzh" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.290025 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c12e-account-create-update-jtpzh" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.302813 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-45qgr"] Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.309775 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-45qgr" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.317027 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bzv6f" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.317644 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.317780 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.317877 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.327341 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-45qgr"] Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.353764 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgp5r\" (UniqueName: \"kubernetes.io/projected/15f1bc13-509f-4bcb-85bd-3af265b8ef01-kube-api-access-bgp5r\") pod \"cinder-9c87-account-create-update-n69dm\" (UID: \"15f1bc13-509f-4bcb-85bd-3af265b8ef01\") " pod="openstack/cinder-9c87-account-create-update-n69dm" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.353995 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15f1bc13-509f-4bcb-85bd-3af265b8ef01-operator-scripts\") pod \"cinder-9c87-account-create-update-n69dm\" (UID: \"15f1bc13-509f-4bcb-85bd-3af265b8ef01\") " pod="openstack/cinder-9c87-account-create-update-n69dm" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.458668 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4h4x\" (UniqueName: \"kubernetes.io/projected/e4ba384c-5066-4a2d-a1d6-dbb7090b32c4-kube-api-access-s4h4x\") pod \"keystone-db-sync-45qgr\" (UID: \"e4ba384c-5066-4a2d-a1d6-dbb7090b32c4\") " pod="openstack/keystone-db-sync-45qgr" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.459014 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ba384c-5066-4a2d-a1d6-dbb7090b32c4-combined-ca-bundle\") pod \"keystone-db-sync-45qgr\" (UID: \"e4ba384c-5066-4a2d-a1d6-dbb7090b32c4\") " pod="openstack/keystone-db-sync-45qgr" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 
23:15:58.459114 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgp5r\" (UniqueName: \"kubernetes.io/projected/15f1bc13-509f-4bcb-85bd-3af265b8ef01-kube-api-access-bgp5r\") pod \"cinder-9c87-account-create-update-n69dm\" (UID: \"15f1bc13-509f-4bcb-85bd-3af265b8ef01\") " pod="openstack/cinder-9c87-account-create-update-n69dm" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.459146 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4ba384c-5066-4a2d-a1d6-dbb7090b32c4-config-data\") pod \"keystone-db-sync-45qgr\" (UID: \"e4ba384c-5066-4a2d-a1d6-dbb7090b32c4\") " pod="openstack/keystone-db-sync-45qgr" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.459174 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15f1bc13-509f-4bcb-85bd-3af265b8ef01-operator-scripts\") pod \"cinder-9c87-account-create-update-n69dm\" (UID: \"15f1bc13-509f-4bcb-85bd-3af265b8ef01\") " pod="openstack/cinder-9c87-account-create-update-n69dm" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.459873 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15f1bc13-509f-4bcb-85bd-3af265b8ef01-operator-scripts\") pod \"cinder-9c87-account-create-update-n69dm\" (UID: \"15f1bc13-509f-4bcb-85bd-3af265b8ef01\") " pod="openstack/cinder-9c87-account-create-update-n69dm" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.464578 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-gxh6v" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.486743 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgp5r\" (UniqueName: \"kubernetes.io/projected/15f1bc13-509f-4bcb-85bd-3af265b8ef01-kube-api-access-bgp5r\") pod \"cinder-9c87-account-create-update-n69dm\" (UID: \"15f1bc13-509f-4bcb-85bd-3af265b8ef01\") " pod="openstack/cinder-9c87-account-create-update-n69dm" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.560383 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4ba384c-5066-4a2d-a1d6-dbb7090b32c4-config-data\") pod \"keystone-db-sync-45qgr\" (UID: \"e4ba384c-5066-4a2d-a1d6-dbb7090b32c4\") " pod="openstack/keystone-db-sync-45qgr" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.560705 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4h4x\" (UniqueName: \"kubernetes.io/projected/e4ba384c-5066-4a2d-a1d6-dbb7090b32c4-kube-api-access-s4h4x\") pod \"keystone-db-sync-45qgr\" (UID: \"e4ba384c-5066-4a2d-a1d6-dbb7090b32c4\") " pod="openstack/keystone-db-sync-45qgr" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.560810 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ba384c-5066-4a2d-a1d6-dbb7090b32c4-combined-ca-bundle\") pod \"keystone-db-sync-45qgr\" (UID: \"e4ba384c-5066-4a2d-a1d6-dbb7090b32c4\") " pod="openstack/keystone-db-sync-45qgr" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.571335 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e4ba384c-5066-4a2d-a1d6-dbb7090b32c4-config-data\") pod \"keystone-db-sync-45qgr\" (UID: \"e4ba384c-5066-4a2d-a1d6-dbb7090b32c4\") " pod="openstack/keystone-db-sync-45qgr" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.589381 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ba384c-5066-4a2d-a1d6-dbb7090b32c4-combined-ca-bundle\") pod \"keystone-db-sync-45qgr\" (UID: \"e4ba384c-5066-4a2d-a1d6-dbb7090b32c4\") " pod="openstack/keystone-db-sync-45qgr" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.592807 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4h4x\" (UniqueName: \"kubernetes.io/projected/e4ba384c-5066-4a2d-a1d6-dbb7090b32c4-kube-api-access-s4h4x\") pod \"keystone-db-sync-45qgr\" (UID: \"e4ba384c-5066-4a2d-a1d6-dbb7090b32c4\") " pod="openstack/keystone-db-sync-45qgr" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.653275 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9c87-account-create-update-n69dm" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.666252 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-45qgr" Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.893869 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-c12e-account-create-update-jtpzh"] Dec 02 23:15:58 crc kubenswrapper[4903]: I1202 23:15:58.949884 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-5jpq8"] Dec 02 23:15:58 crc kubenswrapper[4903]: W1202 23:15:58.954470 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9bc1f3f_1d84_4d5f_9e8f_bfec26bcd965.slice/crio-00280cf515b1ceb20186b1bf0e984e3428c38854aad22e00074078b39f5cbbe7 WatchSource:0}: Error finding container 00280cf515b1ceb20186b1bf0e984e3428c38854aad22e00074078b39f5cbbe7: Status 404 returned error can't find the container with id 00280cf515b1ceb20186b1bf0e984e3428c38854aad22e00074078b39f5cbbe7 Dec 02 23:15:59 crc kubenswrapper[4903]: I1202 23:15:59.042795 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-gxh6v"] Dec 02 23:15:59 crc kubenswrapper[4903]: W1202 23:15:59.045760 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod769c6602_40ba_4f02_8f65_47ea4be08be4.slice/crio-2f2e67066616ffef4a286a854a525bf944c90d85ee0f65e687dac897025b9ffb WatchSource:0}: Error finding container 2f2e67066616ffef4a286a854a525bf944c90d85ee0f65e687dac897025b9ffb: Status 404 returned error can't find the container with id 2f2e67066616ffef4a286a854a525bf944c90d85ee0f65e687dac897025b9ffb Dec 02 23:15:59 crc kubenswrapper[4903]: I1202 23:15:59.186880 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-9c87-account-create-update-n69dm"] Dec 02 23:15:59 crc kubenswrapper[4903]: W1202 23:15:59.188005 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15f1bc13_509f_4bcb_85bd_3af265b8ef01.slice/crio-96e6525c1ed0fba200006d39c59134eeffa706ef87f3c4ebaeaea23020bf9839 WatchSource:0}: Error finding container 96e6525c1ed0fba200006d39c59134eeffa706ef87f3c4ebaeaea23020bf9839: Status 404 returned error can't find the container with id 
96e6525c1ed0fba200006d39c59134eeffa706ef87f3c4ebaeaea23020bf9839 Dec 02 23:15:59 crc kubenswrapper[4903]: I1202 23:15:59.267433 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-45qgr"] Dec 02 23:15:59 crc kubenswrapper[4903]: W1202 23:15:59.290005 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4ba384c_5066_4a2d_a1d6_dbb7090b32c4.slice/crio-957cfed7a099f18fcefe52f7969899b1bd44e5d14b6a5428f98e343176b88dbc WatchSource:0}: Error finding container 957cfed7a099f18fcefe52f7969899b1bd44e5d14b6a5428f98e343176b88dbc: Status 404 returned error can't find the container with id 957cfed7a099f18fcefe52f7969899b1bd44e5d14b6a5428f98e343176b88dbc Dec 02 23:15:59 crc kubenswrapper[4903]: I1202 23:15:59.544681 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-45qgr" event={"ID":"e4ba384c-5066-4a2d-a1d6-dbb7090b32c4","Type":"ContainerStarted","Data":"957cfed7a099f18fcefe52f7969899b1bd44e5d14b6a5428f98e343176b88dbc"} Dec 02 23:15:59 crc kubenswrapper[4903]: I1202 23:15:59.545677 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gxh6v" event={"ID":"769c6602-40ba-4f02-8f65-47ea4be08be4","Type":"ContainerStarted","Data":"dfea6a6a7493885dee015f1130f08f7899b8736900a063101bd2c4227798af70"} Dec 02 23:15:59 crc kubenswrapper[4903]: I1202 23:15:59.545701 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gxh6v" event={"ID":"769c6602-40ba-4f02-8f65-47ea4be08be4","Type":"ContainerStarted","Data":"2f2e67066616ffef4a286a854a525bf944c90d85ee0f65e687dac897025b9ffb"} Dec 02 23:15:59 crc kubenswrapper[4903]: I1202 23:15:59.558456 4903 generic.go:334] "Generic (PLEG): container finished" podID="1b051f79-dd17-4446-8316-8de5216d958f" containerID="dbaaa350be87bf9a255c33156d8f6c761507fdc3f77a2883b6a3beb0d68c9e47" exitCode=0 Dec 02 23:15:59 crc kubenswrapper[4903]: I1202 23:15:59.558543 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c12e-account-create-update-jtpzh" event={"ID":"1b051f79-dd17-4446-8316-8de5216d958f","Type":"ContainerDied","Data":"dbaaa350be87bf9a255c33156d8f6c761507fdc3f77a2883b6a3beb0d68c9e47"} Dec 02 23:15:59 crc kubenswrapper[4903]: I1202 23:15:59.558833 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c12e-account-create-update-jtpzh" event={"ID":"1b051f79-dd17-4446-8316-8de5216d958f","Type":"ContainerStarted","Data":"a88658a081a1460374a57076bde93a14a8086624257b5d1c77cad3fa93958f11"} Dec 02 23:15:59 crc kubenswrapper[4903]: I1202 23:15:59.562007 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-gxh6v" podStartSLOduration=1.561988612 podStartE2EDuration="1.561988612s" podCreationTimestamp="2025-12-02 23:15:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:15:59.55986123 +0000 UTC m=+1098.268415513" watchObservedRunningTime="2025-12-02 23:15:59.561988612 +0000 UTC m=+1098.270542895" Dec 02 23:15:59 crc kubenswrapper[4903]: I1202 23:15:59.564505 4903 generic.go:334] "Generic (PLEG): container finished" podID="d9bc1f3f-1d84-4d5f-9e8f-bfec26bcd965" containerID="e290219c57410296cd68e48c28aca458c961ccb2c70ae4ac7b1796c8298f6661" exitCode=0 Dec 02 23:15:59 crc kubenswrapper[4903]: I1202 23:15:59.564568 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-db-create-5jpq8" event={"ID":"d9bc1f3f-1d84-4d5f-9e8f-bfec26bcd965","Type":"ContainerDied","Data":"e290219c57410296cd68e48c28aca458c961ccb2c70ae4ac7b1796c8298f6661"} Dec 02 23:15:59 crc kubenswrapper[4903]: I1202 23:15:59.564593 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-5jpq8" event={"ID":"d9bc1f3f-1d84-4d5f-9e8f-bfec26bcd965","Type":"ContainerStarted","Data":"00280cf515b1ceb20186b1bf0e984e3428c38854aad22e00074078b39f5cbbe7"} Dec 02 23:15:59 crc kubenswrapper[4903]: I1202 23:15:59.565981 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9c87-account-create-update-n69dm" event={"ID":"15f1bc13-509f-4bcb-85bd-3af265b8ef01","Type":"ContainerStarted","Data":"bfd7b651f5f338bcab8b4712e40a72ea822356a4cb1e17d5a849d804c0837109"} Dec 02 23:15:59 crc kubenswrapper[4903]: I1202 23:15:59.566020 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9c87-account-create-update-n69dm" event={"ID":"15f1bc13-509f-4bcb-85bd-3af265b8ef01","Type":"ContainerStarted","Data":"96e6525c1ed0fba200006d39c59134eeffa706ef87f3c4ebaeaea23020bf9839"} Dec 02 23:15:59 crc kubenswrapper[4903]: I1202 23:15:59.587914 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-9c87-account-create-update-n69dm" podStartSLOduration=1.587898284 podStartE2EDuration="1.587898284s" podCreationTimestamp="2025-12-02 23:15:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:15:59.585387353 +0000 UTC m=+1098.293941636" watchObservedRunningTime="2025-12-02 23:15:59.587898284 +0000 UTC m=+1098.296452567" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.439770 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-mhrvb"] Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.440834 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-mhrvb" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.452098 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-mhrvb"] Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.504547 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-sync-8sbng"] Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.505626 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-8sbng" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.511909 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-config-data" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.512448 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-7mmjm" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.515249 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-8sbng"] Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.562245 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-4eb0-account-create-update-rhh8z"] Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.563400 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-4eb0-account-create-update-rhh8z" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.567268 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.581680 4903 generic.go:334] "Generic (PLEG): container finished" podID="15f1bc13-509f-4bcb-85bd-3af265b8ef01" containerID="bfd7b651f5f338bcab8b4712e40a72ea822356a4cb1e17d5a849d804c0837109" exitCode=0 Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.581735 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9c87-account-create-update-n69dm" event={"ID":"15f1bc13-509f-4bcb-85bd-3af265b8ef01","Type":"ContainerDied","Data":"bfd7b651f5f338bcab8b4712e40a72ea822356a4cb1e17d5a849d804c0837109"} Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.583423 4903 generic.go:334] "Generic (PLEG): container finished" podID="769c6602-40ba-4f02-8f65-47ea4be08be4" containerID="dfea6a6a7493885dee015f1130f08f7899b8736900a063101bd2c4227798af70" exitCode=0 Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.583500 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gxh6v" event={"ID":"769c6602-40ba-4f02-8f65-47ea4be08be4","Type":"ContainerDied","Data":"dfea6a6a7493885dee015f1130f08f7899b8736900a063101bd2c4227798af70"} Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.587181 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-4eb0-account-create-update-rhh8z"] Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.603755 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b405c566-af8d-4196-ad1a-5a0dcc450e81-operator-scripts\") pod \"glance-db-create-mhrvb\" (UID: \"b405c566-af8d-4196-ad1a-5a0dcc450e81\") " pod="openstack/glance-db-create-mhrvb" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.603894 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25b38561-c9d6-4223-85cd-c4516718cc5f-config-data\") pod \"watcher-db-sync-8sbng\" (UID: \"25b38561-c9d6-4223-85cd-c4516718cc5f\") " pod="openstack/watcher-db-sync-8sbng" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.603929 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xndt\" (UniqueName: \"kubernetes.io/projected/b405c566-af8d-4196-ad1a-5a0dcc450e81-kube-api-access-9xndt\") pod \"glance-db-create-mhrvb\" (UID: \"b405c566-af8d-4196-ad1a-5a0dcc450e81\") " pod="openstack/glance-db-create-mhrvb" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.603959 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xndgw\" (UniqueName: \"kubernetes.io/projected/25b38561-c9d6-4223-85cd-c4516718cc5f-kube-api-access-xndgw\") pod \"watcher-db-sync-8sbng\" (UID: \"25b38561-c9d6-4223-85cd-c4516718cc5f\") " pod="openstack/watcher-db-sync-8sbng" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.604001 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25b38561-c9d6-4223-85cd-c4516718cc5f-combined-ca-bundle\") pod \"watcher-db-sync-8sbng\" (UID: \"25b38561-c9d6-4223-85cd-c4516718cc5f\") " 
pod="openstack/watcher-db-sync-8sbng" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.604111 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/25b38561-c9d6-4223-85cd-c4516718cc5f-db-sync-config-data\") pod \"watcher-db-sync-8sbng\" (UID: \"25b38561-c9d6-4223-85cd-c4516718cc5f\") " pod="openstack/watcher-db-sync-8sbng" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.687187 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.705777 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/25b38561-c9d6-4223-85cd-c4516718cc5f-db-sync-config-data\") pod \"watcher-db-sync-8sbng\" (UID: \"25b38561-c9d6-4223-85cd-c4516718cc5f\") " pod="openstack/watcher-db-sync-8sbng" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.705840 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b405c566-af8d-4196-ad1a-5a0dcc450e81-operator-scripts\") pod \"glance-db-create-mhrvb\" (UID: \"b405c566-af8d-4196-ad1a-5a0dcc450e81\") " pod="openstack/glance-db-create-mhrvb" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.705901 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9158c65-e3fe-4db4-9329-790edac952f1-operator-scripts\") pod \"glance-4eb0-account-create-update-rhh8z\" (UID: \"b9158c65-e3fe-4db4-9329-790edac952f1\") " pod="openstack/glance-4eb0-account-create-update-rhh8z" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.705988 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25b38561-c9d6-4223-85cd-c4516718cc5f-config-data\") pod \"watcher-db-sync-8sbng\" (UID: \"25b38561-c9d6-4223-85cd-c4516718cc5f\") " pod="openstack/watcher-db-sync-8sbng" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.706031 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xndt\" (UniqueName: \"kubernetes.io/projected/b405c566-af8d-4196-ad1a-5a0dcc450e81-kube-api-access-9xndt\") pod \"glance-db-create-mhrvb\" (UID: \"b405c566-af8d-4196-ad1a-5a0dcc450e81\") " pod="openstack/glance-db-create-mhrvb" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.706053 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dldc\" (UniqueName: \"kubernetes.io/projected/b9158c65-e3fe-4db4-9329-790edac952f1-kube-api-access-8dldc\") pod \"glance-4eb0-account-create-update-rhh8z\" (UID: \"b9158c65-e3fe-4db4-9329-790edac952f1\") " pod="openstack/glance-4eb0-account-create-update-rhh8z" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.706098 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xndgw\" (UniqueName: \"kubernetes.io/projected/25b38561-c9d6-4223-85cd-c4516718cc5f-kube-api-access-xndgw\") pod \"watcher-db-sync-8sbng\" (UID: \"25b38561-c9d6-4223-85cd-c4516718cc5f\") " pod="openstack/watcher-db-sync-8sbng" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.706141 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25b38561-c9d6-4223-85cd-c4516718cc5f-combined-ca-bundle\") pod \"watcher-db-sync-8sbng\" (UID: \"25b38561-c9d6-4223-85cd-c4516718cc5f\") " pod="openstack/watcher-db-sync-8sbng" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.706609 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b405c566-af8d-4196-ad1a-5a0dcc450e81-operator-scripts\") pod \"glance-db-create-mhrvb\" (UID: \"b405c566-af8d-4196-ad1a-5a0dcc450e81\") " pod="openstack/glance-db-create-mhrvb" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.713354 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25b38561-c9d6-4223-85cd-c4516718cc5f-config-data\") pod \"watcher-db-sync-8sbng\" (UID: \"25b38561-c9d6-4223-85cd-c4516718cc5f\") " pod="openstack/watcher-db-sync-8sbng" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.714744 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/25b38561-c9d6-4223-85cd-c4516718cc5f-db-sync-config-data\") pod \"watcher-db-sync-8sbng\" (UID: \"25b38561-c9d6-4223-85cd-c4516718cc5f\") " pod="openstack/watcher-db-sync-8sbng" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.716503 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25b38561-c9d6-4223-85cd-c4516718cc5f-combined-ca-bundle\") pod \"watcher-db-sync-8sbng\" (UID: \"25b38561-c9d6-4223-85cd-c4516718cc5f\") " pod="openstack/watcher-db-sync-8sbng" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.722919 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xndgw\" (UniqueName: \"kubernetes.io/projected/25b38561-c9d6-4223-85cd-c4516718cc5f-kube-api-access-xndgw\") pod \"watcher-db-sync-8sbng\" (UID: \"25b38561-c9d6-4223-85cd-c4516718cc5f\") " pod="openstack/watcher-db-sync-8sbng" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.723090 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xndt\" (UniqueName: \"kubernetes.io/projected/b405c566-af8d-4196-ad1a-5a0dcc450e81-kube-api-access-9xndt\") pod \"glance-db-create-mhrvb\" (UID: \"b405c566-af8d-4196-ad1a-5a0dcc450e81\") " pod="openstack/glance-db-create-mhrvb" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.755255 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-trbs9"] Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.757094 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-trbs9" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.764387 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-mhrvb" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.767732 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-trbs9"] Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.787549 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-4154-account-create-update-6zrtz"] Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.789281 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-4154-account-create-update-6zrtz" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.793419 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.799501 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4154-account-create-update-6zrtz"] Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.808127 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dldc\" (UniqueName: \"kubernetes.io/projected/b9158c65-e3fe-4db4-9329-790edac952f1-kube-api-access-8dldc\") pod \"glance-4eb0-account-create-update-rhh8z\" (UID: \"b9158c65-e3fe-4db4-9329-790edac952f1\") " pod="openstack/glance-4eb0-account-create-update-rhh8z" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.808302 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9158c65-e3fe-4db4-9329-790edac952f1-operator-scripts\") pod \"glance-4eb0-account-create-update-rhh8z\" (UID: \"b9158c65-e3fe-4db4-9329-790edac952f1\") " pod="openstack/glance-4eb0-account-create-update-rhh8z" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.814596 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9158c65-e3fe-4db4-9329-790edac952f1-operator-scripts\") pod \"glance-4eb0-account-create-update-rhh8z\" (UID: \"b9158c65-e3fe-4db4-9329-790edac952f1\") " pod="openstack/glance-4eb0-account-create-update-rhh8z" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.835161 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dldc\" (UniqueName: \"kubernetes.io/projected/b9158c65-e3fe-4db4-9329-790edac952f1-kube-api-access-8dldc\") pod \"glance-4eb0-account-create-update-rhh8z\" (UID: \"b9158c65-e3fe-4db4-9329-790edac952f1\") " pod="openstack/glance-4eb0-account-create-update-rhh8z" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.836378 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-8sbng" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.885210 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-4eb0-account-create-update-rhh8z" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.910097 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ca7a411-286a-4af2-bb00-fda2b3323698-operator-scripts\") pod \"neutron-4154-account-create-update-6zrtz\" (UID: \"3ca7a411-286a-4af2-bb00-fda2b3323698\") " pod="openstack/neutron-4154-account-create-update-6zrtz" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.910171 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56e790b6-6e19-400a-a329-be2fd76c9e8f-operator-scripts\") pod \"neutron-db-create-trbs9\" (UID: \"56e790b6-6e19-400a-a329-be2fd76c9e8f\") " pod="openstack/neutron-db-create-trbs9" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.910210 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgg46\" (UniqueName: \"kubernetes.io/projected/3ca7a411-286a-4af2-bb00-fda2b3323698-kube-api-access-fgg46\") pod \"neutron-4154-account-create-update-6zrtz\" (UID: \"3ca7a411-286a-4af2-bb00-fda2b3323698\") " pod="openstack/neutron-4154-account-create-update-6zrtz" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.910260 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76pjs\" (UniqueName: \"kubernetes.io/projected/56e790b6-6e19-400a-a329-be2fd76c9e8f-kube-api-access-76pjs\") pod \"neutron-db-create-trbs9\" (UID: \"56e790b6-6e19-400a-a329-be2fd76c9e8f\") " pod="openstack/neutron-db-create-trbs9" Dec 02 23:16:00 crc kubenswrapper[4903]: I1202 23:16:00.966401 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-c12e-account-create-update-jtpzh" Dec 02 23:16:01 crc kubenswrapper[4903]: I1202 23:16:01.012609 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ca7a411-286a-4af2-bb00-fda2b3323698-operator-scripts\") pod \"neutron-4154-account-create-update-6zrtz\" (UID: \"3ca7a411-286a-4af2-bb00-fda2b3323698\") " pod="openstack/neutron-4154-account-create-update-6zrtz" Dec 02 23:16:01 crc kubenswrapper[4903]: I1202 23:16:01.012715 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56e790b6-6e19-400a-a329-be2fd76c9e8f-operator-scripts\") pod \"neutron-db-create-trbs9\" (UID: \"56e790b6-6e19-400a-a329-be2fd76c9e8f\") " pod="openstack/neutron-db-create-trbs9" Dec 02 23:16:01 crc kubenswrapper[4903]: I1202 23:16:01.012755 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgg46\" (UniqueName: \"kubernetes.io/projected/3ca7a411-286a-4af2-bb00-fda2b3323698-kube-api-access-fgg46\") pod \"neutron-4154-account-create-update-6zrtz\" (UID: \"3ca7a411-286a-4af2-bb00-fda2b3323698\") " pod="openstack/neutron-4154-account-create-update-6zrtz" Dec 02 23:16:01 crc kubenswrapper[4903]: I1202 23:16:01.012822 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76pjs\" (UniqueName: \"kubernetes.io/projected/56e790b6-6e19-400a-a329-be2fd76c9e8f-kube-api-access-76pjs\") pod \"neutron-db-create-trbs9\" (UID: \"56e790b6-6e19-400a-a329-be2fd76c9e8f\") " pod="openstack/neutron-db-create-trbs9" Dec 02 23:16:01 crc kubenswrapper[4903]: I1202 23:16:01.013324 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ca7a411-286a-4af2-bb00-fda2b3323698-operator-scripts\") pod \"neutron-4154-account-create-update-6zrtz\" (UID: \"3ca7a411-286a-4af2-bb00-fda2b3323698\") " pod="openstack/neutron-4154-account-create-update-6zrtz" Dec 02 23:16:01 crc kubenswrapper[4903]: I1202 23:16:01.013357 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56e790b6-6e19-400a-a329-be2fd76c9e8f-operator-scripts\") pod \"neutron-db-create-trbs9\" (UID: \"56e790b6-6e19-400a-a329-be2fd76c9e8f\") " pod="openstack/neutron-db-create-trbs9" Dec 02 23:16:01 crc kubenswrapper[4903]: I1202 23:16:01.039532 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-5jpq8" Dec 02 23:16:01 crc kubenswrapper[4903]: I1202 23:16:01.042833 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgg46\" (UniqueName: \"kubernetes.io/projected/3ca7a411-286a-4af2-bb00-fda2b3323698-kube-api-access-fgg46\") pod \"neutron-4154-account-create-update-6zrtz\" (UID: \"3ca7a411-286a-4af2-bb00-fda2b3323698\") " pod="openstack/neutron-4154-account-create-update-6zrtz" Dec 02 23:16:01 crc kubenswrapper[4903]: I1202 23:16:01.042857 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76pjs\" (UniqueName: \"kubernetes.io/projected/56e790b6-6e19-400a-a329-be2fd76c9e8f-kube-api-access-76pjs\") pod \"neutron-db-create-trbs9\" (UID: \"56e790b6-6e19-400a-a329-be2fd76c9e8f\") " pod="openstack/neutron-db-create-trbs9" Dec 02 23:16:01 crc kubenswrapper[4903]: I1202 23:16:01.114307 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b051f79-dd17-4446-8316-8de5216d958f-operator-scripts\") pod \"1b051f79-dd17-4446-8316-8de5216d958f\" (UID: \"1b051f79-dd17-4446-8316-8de5216d958f\") " Dec 02 23:16:01 crc kubenswrapper[4903]: I1202 23:16:01.114420 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qd2f4\" (UniqueName: \"kubernetes.io/projected/1b051f79-dd17-4446-8316-8de5216d958f-kube-api-access-qd2f4\") pod \"1b051f79-dd17-4446-8316-8de5216d958f\" (UID: \"1b051f79-dd17-4446-8316-8de5216d958f\") " Dec 02 23:16:01 crc kubenswrapper[4903]: I1202 23:16:01.116047 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b051f79-dd17-4446-8316-8de5216d958f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1b051f79-dd17-4446-8316-8de5216d958f" (UID: "1b051f79-dd17-4446-8316-8de5216d958f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:16:01 crc kubenswrapper[4903]: I1202 23:16:01.119103 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b051f79-dd17-4446-8316-8de5216d958f-kube-api-access-qd2f4" (OuterVolumeSpecName: "kube-api-access-qd2f4") pod "1b051f79-dd17-4446-8316-8de5216d958f" (UID: "1b051f79-dd17-4446-8316-8de5216d958f"). InnerVolumeSpecName "kube-api-access-qd2f4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:16:01 crc kubenswrapper[4903]: I1202 23:16:01.216086 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6j87j\" (UniqueName: \"kubernetes.io/projected/d9bc1f3f-1d84-4d5f-9e8f-bfec26bcd965-kube-api-access-6j87j\") pod \"d9bc1f3f-1d84-4d5f-9e8f-bfec26bcd965\" (UID: \"d9bc1f3f-1d84-4d5f-9e8f-bfec26bcd965\") " Dec 02 23:16:01 crc kubenswrapper[4903]: I1202 23:16:01.216240 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9bc1f3f-1d84-4d5f-9e8f-bfec26bcd965-operator-scripts\") pod \"d9bc1f3f-1d84-4d5f-9e8f-bfec26bcd965\" (UID: \"d9bc1f3f-1d84-4d5f-9e8f-bfec26bcd965\") " Dec 02 23:16:01 crc kubenswrapper[4903]: I1202 23:16:01.216753 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b051f79-dd17-4446-8316-8de5216d958f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:01 crc kubenswrapper[4903]: I1202 23:16:01.216769 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qd2f4\" (UniqueName: \"kubernetes.io/projected/1b051f79-dd17-4446-8316-8de5216d958f-kube-api-access-qd2f4\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:01 crc kubenswrapper[4903]: I1202 23:16:01.217135 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9bc1f3f-1d84-4d5f-9e8f-bfec26bcd965-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d9bc1f3f-1d84-4d5f-9e8f-bfec26bcd965" (UID: "d9bc1f3f-1d84-4d5f-9e8f-bfec26bcd965"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:16:01 crc kubenswrapper[4903]: I1202 23:16:01.220760 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9bc1f3f-1d84-4d5f-9e8f-bfec26bcd965-kube-api-access-6j87j" (OuterVolumeSpecName: "kube-api-access-6j87j") pod "d9bc1f3f-1d84-4d5f-9e8f-bfec26bcd965" (UID: "d9bc1f3f-1d84-4d5f-9e8f-bfec26bcd965"). InnerVolumeSpecName "kube-api-access-6j87j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:16:01 crc kubenswrapper[4903]: I1202 23:16:01.243426 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-trbs9" Dec 02 23:16:01 crc kubenswrapper[4903]: I1202 23:16:01.253617 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-4154-account-create-update-6zrtz" Dec 02 23:16:01 crc kubenswrapper[4903]: I1202 23:16:01.318361 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6j87j\" (UniqueName: \"kubernetes.io/projected/d9bc1f3f-1d84-4d5f-9e8f-bfec26bcd965-kube-api-access-6j87j\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:01 crc kubenswrapper[4903]: I1202 23:16:01.318394 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9bc1f3f-1d84-4d5f-9e8f-bfec26bcd965-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:01 crc kubenswrapper[4903]: I1202 23:16:01.366438 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-mhrvb"] Dec 02 23:16:01 crc kubenswrapper[4903]: I1202 23:16:01.462017 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-4eb0-account-create-update-rhh8z"] Dec 02 23:16:01 crc kubenswrapper[4903]: I1202 23:16:01.487847 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-8sbng"] Dec 02 23:16:01 crc kubenswrapper[4903]: I1202 23:16:01.606621 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c12e-account-create-update-jtpzh" Dec 02 23:16:01 crc kubenswrapper[4903]: I1202 23:16:01.607039 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c12e-account-create-update-jtpzh" event={"ID":"1b051f79-dd17-4446-8316-8de5216d958f","Type":"ContainerDied","Data":"a88658a081a1460374a57076bde93a14a8086624257b5d1c77cad3fa93958f11"} Dec 02 23:16:01 crc kubenswrapper[4903]: I1202 23:16:01.607080 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a88658a081a1460374a57076bde93a14a8086624257b5d1c77cad3fa93958f11" Dec 02 23:16:01 crc kubenswrapper[4903]: I1202 23:16:01.634256 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-5jpq8" Dec 02 23:16:01 crc kubenswrapper[4903]: I1202 23:16:01.647202 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-5jpq8" event={"ID":"d9bc1f3f-1d84-4d5f-9e8f-bfec26bcd965","Type":"ContainerDied","Data":"00280cf515b1ceb20186b1bf0e984e3428c38854aad22e00074078b39f5cbbe7"} Dec 02 23:16:01 crc kubenswrapper[4903]: I1202 23:16:01.647242 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00280cf515b1ceb20186b1bf0e984e3428c38854aad22e00074078b39f5cbbe7" Dec 02 23:16:01 crc kubenswrapper[4903]: I1202 23:16:01.647252 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-8sbng" event={"ID":"25b38561-c9d6-4223-85cd-c4516718cc5f","Type":"ContainerStarted","Data":"3845b29bd6645b03a3c7cdb06bed509bed9bbdebbb0632d4b64acb7248fec083"} Dec 02 23:16:01 crc kubenswrapper[4903]: I1202 23:16:01.655808 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4eb0-account-create-update-rhh8z" event={"ID":"b9158c65-e3fe-4db4-9329-790edac952f1","Type":"ContainerStarted","Data":"ddd047b7e00a24eaada7f57aae5233f407932c2f13fb8670e9ca80bb4a532033"} Dec 02 23:16:01 crc kubenswrapper[4903]: I1202 23:16:01.681842 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-mhrvb" event={"ID":"b405c566-af8d-4196-ad1a-5a0dcc450e81","Type":"ContainerStarted","Data":"74783db417036793d473a5892803827b20a49ed26d92bb6d0ea94fc4339f6102"} Dec 02 23:16:01 crc kubenswrapper[4903]: I1202 23:16:01.821203 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-trbs9"] Dec 02 23:16:01 crc kubenswrapper[4903]: I1202 23:16:01.823161 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-mhrvb" podStartSLOduration=1.823144062 podStartE2EDuration="1.823144062s" podCreationTimestamp="2025-12-02 23:16:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:16:01.810061304 +0000 UTC m=+1100.518615587" watchObservedRunningTime="2025-12-02 23:16:01.823144062 +0000 UTC m=+1100.531698345" Dec 02 23:16:01 crc kubenswrapper[4903]: I1202 23:16:01.911594 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4154-account-create-update-6zrtz"] Dec 02 23:16:02 crc kubenswrapper[4903]: I1202 23:16:02.700622 4903 generic.go:334] "Generic (PLEG): container finished" podID="b9158c65-e3fe-4db4-9329-790edac952f1" containerID="2775ec20d7c40c5afb4676e825260170ba46746f893cda55d798477539d26ddd" exitCode=0 Dec 02 23:16:02 crc kubenswrapper[4903]: I1202 23:16:02.700837 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4eb0-account-create-update-rhh8z" event={"ID":"b9158c65-e3fe-4db4-9329-790edac952f1","Type":"ContainerDied","Data":"2775ec20d7c40c5afb4676e825260170ba46746f893cda55d798477539d26ddd"} Dec 02 23:16:02 crc kubenswrapper[4903]: I1202 23:16:02.705557 4903 generic.go:334] "Generic (PLEG): container finished" podID="b405c566-af8d-4196-ad1a-5a0dcc450e81" containerID="bb0bfd6db04574e89201628c4c0ee4c20689dcd215f6e9727884504600318436" exitCode=0 Dec 02 23:16:02 crc kubenswrapper[4903]: I1202 23:16:02.705599 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-mhrvb" 
event={"ID":"b405c566-af8d-4196-ad1a-5a0dcc450e81","Type":"ContainerDied","Data":"bb0bfd6db04574e89201628c4c0ee4c20689dcd215f6e9727884504600318436"} Dec 02 23:16:03 crc kubenswrapper[4903]: I1202 23:16:03.140888 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d7cc4fbd9-qf9dk" Dec 02 23:16:03 crc kubenswrapper[4903]: I1202 23:16:03.201320 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-554c7689cc-ngxvh"] Dec 02 23:16:03 crc kubenswrapper[4903]: I1202 23:16:03.201578 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-554c7689cc-ngxvh" podUID="c5621cca-33f2-4de9-a39a-aca977548db7" containerName="dnsmasq-dns" containerID="cri-o://28de3ccf8e190207ab0a7ce8feb947bd1a78d0e8e04db6490ca9cac4bb68094c" gracePeriod=10 Dec 02 23:16:03 crc kubenswrapper[4903]: I1202 23:16:03.716331 4903 generic.go:334] "Generic (PLEG): container finished" podID="c5621cca-33f2-4de9-a39a-aca977548db7" containerID="28de3ccf8e190207ab0a7ce8feb947bd1a78d0e8e04db6490ca9cac4bb68094c" exitCode=0 Dec 02 23:16:03 crc kubenswrapper[4903]: I1202 23:16:03.716431 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-554c7689cc-ngxvh" event={"ID":"c5621cca-33f2-4de9-a39a-aca977548db7","Type":"ContainerDied","Data":"28de3ccf8e190207ab0a7ce8feb947bd1a78d0e8e04db6490ca9cac4bb68094c"} Dec 02 23:16:03 crc kubenswrapper[4903]: I1202 23:16:03.762607 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-554c7689cc-ngxvh" podUID="c5621cca-33f2-4de9-a39a-aca977548db7" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.121:5353: connect: connection refused" Dec 02 23:16:04 crc kubenswrapper[4903]: I1202 23:16:04.729919 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9c87-account-create-update-n69dm" event={"ID":"15f1bc13-509f-4bcb-85bd-3af265b8ef01","Type":"ContainerDied","Data":"96e6525c1ed0fba200006d39c59134eeffa706ef87f3c4ebaeaea23020bf9839"} Dec 02 23:16:04 crc kubenswrapper[4903]: I1202 23:16:04.730175 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96e6525c1ed0fba200006d39c59134eeffa706ef87f3c4ebaeaea23020bf9839" Dec 02 23:16:04 crc kubenswrapper[4903]: I1202 23:16:04.737885 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4154-account-create-update-6zrtz" event={"ID":"3ca7a411-286a-4af2-bb00-fda2b3323698","Type":"ContainerStarted","Data":"2b161154cc907db88841d400118f3e8b3f8f8e9a5adf573b9f5ce2c898d1d5a5"} Dec 02 23:16:04 crc kubenswrapper[4903]: I1202 23:16:04.739722 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4eb0-account-create-update-rhh8z" event={"ID":"b9158c65-e3fe-4db4-9329-790edac952f1","Type":"ContainerDied","Data":"ddd047b7e00a24eaada7f57aae5233f407932c2f13fb8670e9ca80bb4a532033"} Dec 02 23:16:04 crc kubenswrapper[4903]: I1202 23:16:04.739753 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddd047b7e00a24eaada7f57aae5233f407932c2f13fb8670e9ca80bb4a532033" Dec 02 23:16:04 crc kubenswrapper[4903]: I1202 23:16:04.742723 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-mhrvb" event={"ID":"b405c566-af8d-4196-ad1a-5a0dcc450e81","Type":"ContainerDied","Data":"74783db417036793d473a5892803827b20a49ed26d92bb6d0ea94fc4339f6102"} Dec 02 23:16:04 crc kubenswrapper[4903]: I1202 23:16:04.742767 4903 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74783db417036793d473a5892803827b20a49ed26d92bb6d0ea94fc4339f6102" Dec 02 23:16:04 crc kubenswrapper[4903]: I1202 23:16:04.744319 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gxh6v" event={"ID":"769c6602-40ba-4f02-8f65-47ea4be08be4","Type":"ContainerDied","Data":"2f2e67066616ffef4a286a854a525bf944c90d85ee0f65e687dac897025b9ffb"} Dec 02 23:16:04 crc kubenswrapper[4903]: I1202 23:16:04.744369 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f2e67066616ffef4a286a854a525bf944c90d85ee0f65e687dac897025b9ffb" Dec 02 23:16:04 crc kubenswrapper[4903]: I1202 23:16:04.745597 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-trbs9" event={"ID":"56e790b6-6e19-400a-a329-be2fd76c9e8f","Type":"ContainerStarted","Data":"6e32803df34827efbeac8cd7ad5958cbcfbaa8c1064b1eb77c30917e05e2ff24"} Dec 02 23:16:04 crc kubenswrapper[4903]: I1202 23:16:04.907698 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9c87-account-create-update-n69dm" Dec 02 23:16:04 crc kubenswrapper[4903]: I1202 23:16:04.936965 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-gxh6v" Dec 02 23:16:04 crc kubenswrapper[4903]: I1202 23:16:04.944113 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-mhrvb" Dec 02 23:16:04 crc kubenswrapper[4903]: I1202 23:16:04.958887 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4eb0-account-create-update-rhh8z" Dec 02 23:16:04 crc kubenswrapper[4903]: I1202 23:16:04.966775 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-554c7689cc-ngxvh" Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.002694 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgp5r\" (UniqueName: \"kubernetes.io/projected/15f1bc13-509f-4bcb-85bd-3af265b8ef01-kube-api-access-bgp5r\") pod \"15f1bc13-509f-4bcb-85bd-3af265b8ef01\" (UID: \"15f1bc13-509f-4bcb-85bd-3af265b8ef01\") " Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.002782 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15f1bc13-509f-4bcb-85bd-3af265b8ef01-operator-scripts\") pod \"15f1bc13-509f-4bcb-85bd-3af265b8ef01\" (UID: \"15f1bc13-509f-4bcb-85bd-3af265b8ef01\") " Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.003782 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15f1bc13-509f-4bcb-85bd-3af265b8ef01-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "15f1bc13-509f-4bcb-85bd-3af265b8ef01" (UID: "15f1bc13-509f-4bcb-85bd-3af265b8ef01"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.008236 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15f1bc13-509f-4bcb-85bd-3af265b8ef01-kube-api-access-bgp5r" (OuterVolumeSpecName: "kube-api-access-bgp5r") pod "15f1bc13-509f-4bcb-85bd-3af265b8ef01" (UID: "15f1bc13-509f-4bcb-85bd-3af265b8ef01"). InnerVolumeSpecName "kube-api-access-bgp5r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.105120 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5621cca-33f2-4de9-a39a-aca977548db7-ovsdbserver-sb\") pod \"c5621cca-33f2-4de9-a39a-aca977548db7\" (UID: \"c5621cca-33f2-4de9-a39a-aca977548db7\") " Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.105198 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgs5q\" (UniqueName: \"kubernetes.io/projected/c5621cca-33f2-4de9-a39a-aca977548db7-kube-api-access-lgs5q\") pod \"c5621cca-33f2-4de9-a39a-aca977548db7\" (UID: \"c5621cca-33f2-4de9-a39a-aca977548db7\") " Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.105240 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dldc\" (UniqueName: \"kubernetes.io/projected/b9158c65-e3fe-4db4-9329-790edac952f1-kube-api-access-8dldc\") pod \"b9158c65-e3fe-4db4-9329-790edac952f1\" (UID: \"b9158c65-e3fe-4db4-9329-790edac952f1\") " Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.105270 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9158c65-e3fe-4db4-9329-790edac952f1-operator-scripts\") pod \"b9158c65-e3fe-4db4-9329-790edac952f1\" (UID: \"b9158c65-e3fe-4db4-9329-790edac952f1\") " Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.105324 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5621cca-33f2-4de9-a39a-aca977548db7-config\") pod \"c5621cca-33f2-4de9-a39a-aca977548db7\" (UID: \"c5621cca-33f2-4de9-a39a-aca977548db7\") " Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.105394 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b405c566-af8d-4196-ad1a-5a0dcc450e81-operator-scripts\") pod \"b405c566-af8d-4196-ad1a-5a0dcc450e81\" (UID: \"b405c566-af8d-4196-ad1a-5a0dcc450e81\") " Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.105416 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qk6ks\" (UniqueName: \"kubernetes.io/projected/769c6602-40ba-4f02-8f65-47ea4be08be4-kube-api-access-qk6ks\") pod \"769c6602-40ba-4f02-8f65-47ea4be08be4\" (UID: \"769c6602-40ba-4f02-8f65-47ea4be08be4\") " Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.105458 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5621cca-33f2-4de9-a39a-aca977548db7-ovsdbserver-nb\") pod \"c5621cca-33f2-4de9-a39a-aca977548db7\" (UID: \"c5621cca-33f2-4de9-a39a-aca977548db7\") " Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.105474 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5621cca-33f2-4de9-a39a-aca977548db7-dns-svc\") pod \"c5621cca-33f2-4de9-a39a-aca977548db7\" (UID: \"c5621cca-33f2-4de9-a39a-aca977548db7\") " Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.105494 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xndt\" (UniqueName: \"kubernetes.io/projected/b405c566-af8d-4196-ad1a-5a0dcc450e81-kube-api-access-9xndt\") pod 
\"b405c566-af8d-4196-ad1a-5a0dcc450e81\" (UID: \"b405c566-af8d-4196-ad1a-5a0dcc450e81\") " Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.105524 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/769c6602-40ba-4f02-8f65-47ea4be08be4-operator-scripts\") pod \"769c6602-40ba-4f02-8f65-47ea4be08be4\" (UID: \"769c6602-40ba-4f02-8f65-47ea4be08be4\") " Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.105850 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgp5r\" (UniqueName: \"kubernetes.io/projected/15f1bc13-509f-4bcb-85bd-3af265b8ef01-kube-api-access-bgp5r\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.105868 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15f1bc13-509f-4bcb-85bd-3af265b8ef01-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.106420 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/769c6602-40ba-4f02-8f65-47ea4be08be4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "769c6602-40ba-4f02-8f65-47ea4be08be4" (UID: "769c6602-40ba-4f02-8f65-47ea4be08be4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.106458 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9158c65-e3fe-4db4-9329-790edac952f1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b9158c65-e3fe-4db4-9329-790edac952f1" (UID: "b9158c65-e3fe-4db4-9329-790edac952f1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.106772 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b405c566-af8d-4196-ad1a-5a0dcc450e81-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b405c566-af8d-4196-ad1a-5a0dcc450e81" (UID: "b405c566-af8d-4196-ad1a-5a0dcc450e81"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.109843 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9158c65-e3fe-4db4-9329-790edac952f1-kube-api-access-8dldc" (OuterVolumeSpecName: "kube-api-access-8dldc") pod "b9158c65-e3fe-4db4-9329-790edac952f1" (UID: "b9158c65-e3fe-4db4-9329-790edac952f1"). InnerVolumeSpecName "kube-api-access-8dldc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.110310 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5621cca-33f2-4de9-a39a-aca977548db7-kube-api-access-lgs5q" (OuterVolumeSpecName: "kube-api-access-lgs5q") pod "c5621cca-33f2-4de9-a39a-aca977548db7" (UID: "c5621cca-33f2-4de9-a39a-aca977548db7"). InnerVolumeSpecName "kube-api-access-lgs5q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.113697 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/769c6602-40ba-4f02-8f65-47ea4be08be4-kube-api-access-qk6ks" (OuterVolumeSpecName: "kube-api-access-qk6ks") pod "769c6602-40ba-4f02-8f65-47ea4be08be4" (UID: "769c6602-40ba-4f02-8f65-47ea4be08be4"). InnerVolumeSpecName "kube-api-access-qk6ks". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.113803 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b405c566-af8d-4196-ad1a-5a0dcc450e81-kube-api-access-9xndt" (OuterVolumeSpecName: "kube-api-access-9xndt") pod "b405c566-af8d-4196-ad1a-5a0dcc450e81" (UID: "b405c566-af8d-4196-ad1a-5a0dcc450e81"). InnerVolumeSpecName "kube-api-access-9xndt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.146997 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5621cca-33f2-4de9-a39a-aca977548db7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c5621cca-33f2-4de9-a39a-aca977548db7" (UID: "c5621cca-33f2-4de9-a39a-aca977548db7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.149631 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5621cca-33f2-4de9-a39a-aca977548db7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c5621cca-33f2-4de9-a39a-aca977548db7" (UID: "c5621cca-33f2-4de9-a39a-aca977548db7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.158632 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5621cca-33f2-4de9-a39a-aca977548db7-config" (OuterVolumeSpecName: "config") pod "c5621cca-33f2-4de9-a39a-aca977548db7" (UID: "c5621cca-33f2-4de9-a39a-aca977548db7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.160090 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5621cca-33f2-4de9-a39a-aca977548db7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c5621cca-33f2-4de9-a39a-aca977548db7" (UID: "c5621cca-33f2-4de9-a39a-aca977548db7"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.207825 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b405c566-af8d-4196-ad1a-5a0dcc450e81-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.207925 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qk6ks\" (UniqueName: \"kubernetes.io/projected/769c6602-40ba-4f02-8f65-47ea4be08be4-kube-api-access-qk6ks\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.207959 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5621cca-33f2-4de9-a39a-aca977548db7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.207986 4903 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5621cca-33f2-4de9-a39a-aca977548db7-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.208009 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xndt\" (UniqueName: \"kubernetes.io/projected/b405c566-af8d-4196-ad1a-5a0dcc450e81-kube-api-access-9xndt\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.208031 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/769c6602-40ba-4f02-8f65-47ea4be08be4-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.208053 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5621cca-33f2-4de9-a39a-aca977548db7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.208075 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgs5q\" (UniqueName: \"kubernetes.io/projected/c5621cca-33f2-4de9-a39a-aca977548db7-kube-api-access-lgs5q\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.208101 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dldc\" (UniqueName: \"kubernetes.io/projected/b9158c65-e3fe-4db4-9329-790edac952f1-kube-api-access-8dldc\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.208125 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9158c65-e3fe-4db4-9329-790edac952f1-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.208144 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5621cca-33f2-4de9-a39a-aca977548db7-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.759253 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-45qgr" event={"ID":"e4ba384c-5066-4a2d-a1d6-dbb7090b32c4","Type":"ContainerStarted","Data":"f728e0e9e11a13ca8bc676a82f9feb3be8e39805d37ae2c6c6c0a51d3004b3bd"} Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.765123 4903 generic.go:334] "Generic (PLEG): container finished" podID="56e790b6-6e19-400a-a329-be2fd76c9e8f" 
containerID="bf1e237745787a5004a76ca5374a049393eb696d258fc3c4bcc72c251cd8d9f6" exitCode=0 Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.765284 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-trbs9" event={"ID":"56e790b6-6e19-400a-a329-be2fd76c9e8f","Type":"ContainerDied","Data":"bf1e237745787a5004a76ca5374a049393eb696d258fc3c4bcc72c251cd8d9f6"} Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.767557 4903 generic.go:334] "Generic (PLEG): container finished" podID="3ca7a411-286a-4af2-bb00-fda2b3323698" containerID="14caea3a8661d1cfc5c08135390951b40fb5db2ea7986fa05939c640753ede9e" exitCode=0 Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.767606 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4154-account-create-update-6zrtz" event={"ID":"3ca7a411-286a-4af2-bb00-fda2b3323698","Type":"ContainerDied","Data":"14caea3a8661d1cfc5c08135390951b40fb5db2ea7986fa05939c640753ede9e"} Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.769704 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9c87-account-create-update-n69dm" Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.770489 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-554c7689cc-ngxvh" Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.770954 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-554c7689cc-ngxvh" event={"ID":"c5621cca-33f2-4de9-a39a-aca977548db7","Type":"ContainerDied","Data":"e9485965e3f291a5d5ae1098dc0a7b4838a8eca96dd519510c954336946dd51c"} Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.770991 4903 scope.go:117] "RemoveContainer" containerID="28de3ccf8e190207ab0a7ce8feb947bd1a78d0e8e04db6490ca9cac4bb68094c" Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.771098 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-gxh6v" Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.771511 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-mhrvb" Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.772092 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-4eb0-account-create-update-rhh8z" Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.811011 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-45qgr" podStartSLOduration=2.3879044990000002 podStartE2EDuration="7.810989337s" podCreationTimestamp="2025-12-02 23:15:58 +0000 UTC" firstStartedPulling="2025-12-02 23:15:59.292480666 +0000 UTC m=+1098.001034959" lastFinishedPulling="2025-12-02 23:16:04.715565514 +0000 UTC m=+1103.424119797" observedRunningTime="2025-12-02 23:16:05.782112212 +0000 UTC m=+1104.490666495" watchObservedRunningTime="2025-12-02 23:16:05.810989337 +0000 UTC m=+1104.519543620" Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.820916 4903 scope.go:117] "RemoveContainer" containerID="879949e7e5f4b385f07d8da4b2a584ea45329f99a3477578649daa29cd41890a" Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.884948 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-554c7689cc-ngxvh"] Dec 02 23:16:05 crc kubenswrapper[4903]: I1202 23:16:05.899947 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-554c7689cc-ngxvh"] Dec 02 23:16:07 crc kubenswrapper[4903]: I1202 23:16:07.234903 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4154-account-create-update-6zrtz" Dec 02 23:16:07 crc kubenswrapper[4903]: I1202 23:16:07.247451 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-trbs9" Dec 02 23:16:07 crc kubenswrapper[4903]: I1202 23:16:07.363266 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76pjs\" (UniqueName: \"kubernetes.io/projected/56e790b6-6e19-400a-a329-be2fd76c9e8f-kube-api-access-76pjs\") pod \"56e790b6-6e19-400a-a329-be2fd76c9e8f\" (UID: \"56e790b6-6e19-400a-a329-be2fd76c9e8f\") " Dec 02 23:16:07 crc kubenswrapper[4903]: I1202 23:16:07.363339 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgg46\" (UniqueName: \"kubernetes.io/projected/3ca7a411-286a-4af2-bb00-fda2b3323698-kube-api-access-fgg46\") pod \"3ca7a411-286a-4af2-bb00-fda2b3323698\" (UID: \"3ca7a411-286a-4af2-bb00-fda2b3323698\") " Dec 02 23:16:07 crc kubenswrapper[4903]: I1202 23:16:07.363505 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ca7a411-286a-4af2-bb00-fda2b3323698-operator-scripts\") pod \"3ca7a411-286a-4af2-bb00-fda2b3323698\" (UID: \"3ca7a411-286a-4af2-bb00-fda2b3323698\") " Dec 02 23:16:07 crc kubenswrapper[4903]: I1202 23:16:07.363545 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56e790b6-6e19-400a-a329-be2fd76c9e8f-operator-scripts\") pod \"56e790b6-6e19-400a-a329-be2fd76c9e8f\" (UID: \"56e790b6-6e19-400a-a329-be2fd76c9e8f\") " Dec 02 23:16:07 crc kubenswrapper[4903]: I1202 23:16:07.364264 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56e790b6-6e19-400a-a329-be2fd76c9e8f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "56e790b6-6e19-400a-a329-be2fd76c9e8f" (UID: "56e790b6-6e19-400a-a329-be2fd76c9e8f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:16:07 crc kubenswrapper[4903]: I1202 23:16:07.364659 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ca7a411-286a-4af2-bb00-fda2b3323698-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3ca7a411-286a-4af2-bb00-fda2b3323698" (UID: "3ca7a411-286a-4af2-bb00-fda2b3323698"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:16:07 crc kubenswrapper[4903]: I1202 23:16:07.372849 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56e790b6-6e19-400a-a329-be2fd76c9e8f-kube-api-access-76pjs" (OuterVolumeSpecName: "kube-api-access-76pjs") pod "56e790b6-6e19-400a-a329-be2fd76c9e8f" (UID: "56e790b6-6e19-400a-a329-be2fd76c9e8f"). InnerVolumeSpecName "kube-api-access-76pjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:16:07 crc kubenswrapper[4903]: I1202 23:16:07.377281 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ca7a411-286a-4af2-bb00-fda2b3323698-kube-api-access-fgg46" (OuterVolumeSpecName: "kube-api-access-fgg46") pod "3ca7a411-286a-4af2-bb00-fda2b3323698" (UID: "3ca7a411-286a-4af2-bb00-fda2b3323698"). InnerVolumeSpecName "kube-api-access-fgg46". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:16:07 crc kubenswrapper[4903]: I1202 23:16:07.465327 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ca7a411-286a-4af2-bb00-fda2b3323698-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:07 crc kubenswrapper[4903]: I1202 23:16:07.465355 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56e790b6-6e19-400a-a329-be2fd76c9e8f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:07 crc kubenswrapper[4903]: I1202 23:16:07.465364 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76pjs\" (UniqueName: \"kubernetes.io/projected/56e790b6-6e19-400a-a329-be2fd76c9e8f-kube-api-access-76pjs\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:07 crc kubenswrapper[4903]: I1202 23:16:07.465373 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgg46\" (UniqueName: \"kubernetes.io/projected/3ca7a411-286a-4af2-bb00-fda2b3323698-kube-api-access-fgg46\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:07 crc kubenswrapper[4903]: I1202 23:16:07.622857 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5621cca-33f2-4de9-a39a-aca977548db7" path="/var/lib/kubelet/pods/c5621cca-33f2-4de9-a39a-aca977548db7/volumes" Dec 02 23:16:07 crc kubenswrapper[4903]: I1202 23:16:07.793395 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-4154-account-create-update-6zrtz" Dec 02 23:16:07 crc kubenswrapper[4903]: I1202 23:16:07.793550 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4154-account-create-update-6zrtz" event={"ID":"3ca7a411-286a-4af2-bb00-fda2b3323698","Type":"ContainerDied","Data":"2b161154cc907db88841d400118f3e8b3f8f8e9a5adf573b9f5ce2c898d1d5a5"} Dec 02 23:16:07 crc kubenswrapper[4903]: I1202 23:16:07.793576 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b161154cc907db88841d400118f3e8b3f8f8e9a5adf573b9f5ce2c898d1d5a5" Dec 02 23:16:07 crc kubenswrapper[4903]: I1202 23:16:07.795770 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-trbs9" event={"ID":"56e790b6-6e19-400a-a329-be2fd76c9e8f","Type":"ContainerDied","Data":"6e32803df34827efbeac8cd7ad5958cbcfbaa8c1064b1eb77c30917e05e2ff24"} Dec 02 23:16:07 crc kubenswrapper[4903]: I1202 23:16:07.795790 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e32803df34827efbeac8cd7ad5958cbcfbaa8c1064b1eb77c30917e05e2ff24" Dec 02 23:16:07 crc kubenswrapper[4903]: I1202 23:16:07.795902 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-trbs9" Dec 02 23:16:10 crc kubenswrapper[4903]: I1202 23:16:10.689255 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 02 23:16:10 crc kubenswrapper[4903]: I1202 23:16:10.701952 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 02 23:16:10 crc kubenswrapper[4903]: I1202 23:16:10.791837 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-flhtr"] Dec 02 23:16:10 crc kubenswrapper[4903]: E1202 23:16:10.792163 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b051f79-dd17-4446-8316-8de5216d958f" containerName="mariadb-account-create-update" Dec 02 23:16:10 crc kubenswrapper[4903]: I1202 23:16:10.792181 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b051f79-dd17-4446-8316-8de5216d958f" containerName="mariadb-account-create-update" Dec 02 23:16:10 crc kubenswrapper[4903]: E1202 23:16:10.792205 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="769c6602-40ba-4f02-8f65-47ea4be08be4" containerName="mariadb-database-create" Dec 02 23:16:10 crc kubenswrapper[4903]: I1202 23:16:10.792212 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="769c6602-40ba-4f02-8f65-47ea4be08be4" containerName="mariadb-database-create" Dec 02 23:16:10 crc kubenswrapper[4903]: E1202 23:16:10.792221 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b405c566-af8d-4196-ad1a-5a0dcc450e81" containerName="mariadb-database-create" Dec 02 23:16:10 crc kubenswrapper[4903]: I1202 23:16:10.792227 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="b405c566-af8d-4196-ad1a-5a0dcc450e81" containerName="mariadb-database-create" Dec 02 23:16:10 crc kubenswrapper[4903]: E1202 23:16:10.792237 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5621cca-33f2-4de9-a39a-aca977548db7" containerName="init" Dec 02 23:16:10 crc kubenswrapper[4903]: I1202 23:16:10.792242 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5621cca-33f2-4de9-a39a-aca977548db7" containerName="init" Dec 02 23:16:10 crc kubenswrapper[4903]: E1202 23:16:10.792257 4903 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15f1bc13-509f-4bcb-85bd-3af265b8ef01" containerName="mariadb-account-create-update" Dec 02 23:16:10 crc kubenswrapper[4903]: I1202 23:16:10.792264 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="15f1bc13-509f-4bcb-85bd-3af265b8ef01" containerName="mariadb-account-create-update" Dec 02 23:16:10 crc kubenswrapper[4903]: E1202 23:16:10.792276 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ca7a411-286a-4af2-bb00-fda2b3323698" containerName="mariadb-account-create-update" Dec 02 23:16:10 crc kubenswrapper[4903]: I1202 23:16:10.792282 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ca7a411-286a-4af2-bb00-fda2b3323698" containerName="mariadb-account-create-update" Dec 02 23:16:10 crc kubenswrapper[4903]: E1202 23:16:10.792291 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9bc1f3f-1d84-4d5f-9e8f-bfec26bcd965" containerName="mariadb-database-create" Dec 02 23:16:10 crc kubenswrapper[4903]: I1202 23:16:10.792297 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9bc1f3f-1d84-4d5f-9e8f-bfec26bcd965" containerName="mariadb-database-create" Dec 02 23:16:10 crc kubenswrapper[4903]: E1202 23:16:10.792305 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56e790b6-6e19-400a-a329-be2fd76c9e8f" containerName="mariadb-database-create" Dec 02 23:16:10 crc kubenswrapper[4903]: I1202 23:16:10.792311 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e790b6-6e19-400a-a329-be2fd76c9e8f" containerName="mariadb-database-create" Dec 02 23:16:10 crc kubenswrapper[4903]: E1202 23:16:10.792321 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5621cca-33f2-4de9-a39a-aca977548db7" containerName="dnsmasq-dns" Dec 02 23:16:10 crc kubenswrapper[4903]: I1202 23:16:10.792327 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5621cca-33f2-4de9-a39a-aca977548db7" containerName="dnsmasq-dns" Dec 02 23:16:10 crc kubenswrapper[4903]: E1202 23:16:10.792340 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9158c65-e3fe-4db4-9329-790edac952f1" containerName="mariadb-account-create-update" Dec 02 23:16:10 crc kubenswrapper[4903]: I1202 23:16:10.792347 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9158c65-e3fe-4db4-9329-790edac952f1" containerName="mariadb-account-create-update" Dec 02 23:16:10 crc kubenswrapper[4903]: I1202 23:16:10.792486 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="b405c566-af8d-4196-ad1a-5a0dcc450e81" containerName="mariadb-database-create" Dec 02 23:16:10 crc kubenswrapper[4903]: I1202 23:16:10.792498 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b051f79-dd17-4446-8316-8de5216d958f" containerName="mariadb-account-create-update" Dec 02 23:16:10 crc kubenswrapper[4903]: I1202 23:16:10.792507 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="769c6602-40ba-4f02-8f65-47ea4be08be4" containerName="mariadb-database-create" Dec 02 23:16:10 crc kubenswrapper[4903]: I1202 23:16:10.792517 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="15f1bc13-509f-4bcb-85bd-3af265b8ef01" containerName="mariadb-account-create-update" Dec 02 23:16:10 crc kubenswrapper[4903]: I1202 23:16:10.792527 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5621cca-33f2-4de9-a39a-aca977548db7" containerName="dnsmasq-dns" Dec 02 23:16:10 crc kubenswrapper[4903]: I1202 23:16:10.792538 4903 
memory_manager.go:354] "RemoveStaleState removing state" podUID="d9bc1f3f-1d84-4d5f-9e8f-bfec26bcd965" containerName="mariadb-database-create" Dec 02 23:16:10 crc kubenswrapper[4903]: I1202 23:16:10.792553 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ca7a411-286a-4af2-bb00-fda2b3323698" containerName="mariadb-account-create-update" Dec 02 23:16:10 crc kubenswrapper[4903]: I1202 23:16:10.792566 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="56e790b6-6e19-400a-a329-be2fd76c9e8f" containerName="mariadb-database-create" Dec 02 23:16:10 crc kubenswrapper[4903]: I1202 23:16:10.792579 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9158c65-e3fe-4db4-9329-790edac952f1" containerName="mariadb-account-create-update" Dec 02 23:16:10 crc kubenswrapper[4903]: I1202 23:16:10.794620 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-flhtr" Dec 02 23:16:10 crc kubenswrapper[4903]: I1202 23:16:10.797081 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-h4s5l" Dec 02 23:16:10 crc kubenswrapper[4903]: I1202 23:16:10.798305 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 02 23:16:10 crc kubenswrapper[4903]: I1202 23:16:10.804794 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-flhtr"] Dec 02 23:16:10 crc kubenswrapper[4903]: I1202 23:16:10.845935 4903 generic.go:334] "Generic (PLEG): container finished" podID="e4ba384c-5066-4a2d-a1d6-dbb7090b32c4" containerID="f728e0e9e11a13ca8bc676a82f9feb3be8e39805d37ae2c6c6c0a51d3004b3bd" exitCode=0 Dec 02 23:16:10 crc kubenswrapper[4903]: I1202 23:16:10.846012 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-45qgr" event={"ID":"e4ba384c-5066-4a2d-a1d6-dbb7090b32c4","Type":"ContainerDied","Data":"f728e0e9e11a13ca8bc676a82f9feb3be8e39805d37ae2c6c6c0a51d3004b3bd"} Dec 02 23:16:10 crc kubenswrapper[4903]: I1202 23:16:10.850277 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 02 23:16:10 crc kubenswrapper[4903]: I1202 23:16:10.942442 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fdf589e-17a2-4b53-b68c-f90e884b0080-combined-ca-bundle\") pod \"glance-db-sync-flhtr\" (UID: \"5fdf589e-17a2-4b53-b68c-f90e884b0080\") " pod="openstack/glance-db-sync-flhtr" Dec 02 23:16:10 crc kubenswrapper[4903]: I1202 23:16:10.942537 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mclrt\" (UniqueName: \"kubernetes.io/projected/5fdf589e-17a2-4b53-b68c-f90e884b0080-kube-api-access-mclrt\") pod \"glance-db-sync-flhtr\" (UID: \"5fdf589e-17a2-4b53-b68c-f90e884b0080\") " pod="openstack/glance-db-sync-flhtr" Dec 02 23:16:10 crc kubenswrapper[4903]: I1202 23:16:10.942735 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fdf589e-17a2-4b53-b68c-f90e884b0080-config-data\") pod \"glance-db-sync-flhtr\" (UID: \"5fdf589e-17a2-4b53-b68c-f90e884b0080\") " pod="openstack/glance-db-sync-flhtr" Dec 02 23:16:10 crc kubenswrapper[4903]: I1202 23:16:10.943122 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5fdf589e-17a2-4b53-b68c-f90e884b0080-db-sync-config-data\") pod \"glance-db-sync-flhtr\" (UID: \"5fdf589e-17a2-4b53-b68c-f90e884b0080\") " pod="openstack/glance-db-sync-flhtr" Dec 02 23:16:11 crc kubenswrapper[4903]: I1202 23:16:11.044491 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fdf589e-17a2-4b53-b68c-f90e884b0080-combined-ca-bundle\") pod \"glance-db-sync-flhtr\" (UID: \"5fdf589e-17a2-4b53-b68c-f90e884b0080\") " pod="openstack/glance-db-sync-flhtr" Dec 02 23:16:11 crc kubenswrapper[4903]: I1202 23:16:11.044547 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mclrt\" (UniqueName: \"kubernetes.io/projected/5fdf589e-17a2-4b53-b68c-f90e884b0080-kube-api-access-mclrt\") pod \"glance-db-sync-flhtr\" (UID: \"5fdf589e-17a2-4b53-b68c-f90e884b0080\") " pod="openstack/glance-db-sync-flhtr" Dec 02 23:16:11 crc kubenswrapper[4903]: I1202 23:16:11.044641 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fdf589e-17a2-4b53-b68c-f90e884b0080-config-data\") pod \"glance-db-sync-flhtr\" (UID: \"5fdf589e-17a2-4b53-b68c-f90e884b0080\") " pod="openstack/glance-db-sync-flhtr" Dec 02 23:16:11 crc kubenswrapper[4903]: I1202 23:16:11.044692 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5fdf589e-17a2-4b53-b68c-f90e884b0080-db-sync-config-data\") pod \"glance-db-sync-flhtr\" (UID: \"5fdf589e-17a2-4b53-b68c-f90e884b0080\") " pod="openstack/glance-db-sync-flhtr" Dec 02 23:16:11 crc kubenswrapper[4903]: I1202 23:16:11.050759 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5fdf589e-17a2-4b53-b68c-f90e884b0080-db-sync-config-data\") pod \"glance-db-sync-flhtr\" (UID: \"5fdf589e-17a2-4b53-b68c-f90e884b0080\") " pod="openstack/glance-db-sync-flhtr" Dec 02 23:16:11 crc kubenswrapper[4903]: I1202 23:16:11.051037 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fdf589e-17a2-4b53-b68c-f90e884b0080-combined-ca-bundle\") pod \"glance-db-sync-flhtr\" (UID: \"5fdf589e-17a2-4b53-b68c-f90e884b0080\") " pod="openstack/glance-db-sync-flhtr" Dec 02 23:16:11 crc kubenswrapper[4903]: I1202 23:16:11.056141 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fdf589e-17a2-4b53-b68c-f90e884b0080-config-data\") pod \"glance-db-sync-flhtr\" (UID: \"5fdf589e-17a2-4b53-b68c-f90e884b0080\") " pod="openstack/glance-db-sync-flhtr" Dec 02 23:16:11 crc kubenswrapper[4903]: I1202 23:16:11.059078 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mclrt\" (UniqueName: \"kubernetes.io/projected/5fdf589e-17a2-4b53-b68c-f90e884b0080-kube-api-access-mclrt\") pod \"glance-db-sync-flhtr\" (UID: \"5fdf589e-17a2-4b53-b68c-f90e884b0080\") " pod="openstack/glance-db-sync-flhtr" Dec 02 23:16:11 crc kubenswrapper[4903]: I1202 23:16:11.120388 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-flhtr" Dec 02 23:16:14 crc kubenswrapper[4903]: I1202 23:16:14.238332 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-45qgr" Dec 02 23:16:14 crc kubenswrapper[4903]: I1202 23:16:14.407034 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ba384c-5066-4a2d-a1d6-dbb7090b32c4-combined-ca-bundle\") pod \"e4ba384c-5066-4a2d-a1d6-dbb7090b32c4\" (UID: \"e4ba384c-5066-4a2d-a1d6-dbb7090b32c4\") " Dec 02 23:16:14 crc kubenswrapper[4903]: I1202 23:16:14.407249 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4ba384c-5066-4a2d-a1d6-dbb7090b32c4-config-data\") pod \"e4ba384c-5066-4a2d-a1d6-dbb7090b32c4\" (UID: \"e4ba384c-5066-4a2d-a1d6-dbb7090b32c4\") " Dec 02 23:16:14 crc kubenswrapper[4903]: I1202 23:16:14.407368 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4h4x\" (UniqueName: \"kubernetes.io/projected/e4ba384c-5066-4a2d-a1d6-dbb7090b32c4-kube-api-access-s4h4x\") pod \"e4ba384c-5066-4a2d-a1d6-dbb7090b32c4\" (UID: \"e4ba384c-5066-4a2d-a1d6-dbb7090b32c4\") " Dec 02 23:16:14 crc kubenswrapper[4903]: I1202 23:16:14.414302 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4ba384c-5066-4a2d-a1d6-dbb7090b32c4-kube-api-access-s4h4x" (OuterVolumeSpecName: "kube-api-access-s4h4x") pod "e4ba384c-5066-4a2d-a1d6-dbb7090b32c4" (UID: "e4ba384c-5066-4a2d-a1d6-dbb7090b32c4"). InnerVolumeSpecName "kube-api-access-s4h4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:16:14 crc kubenswrapper[4903]: I1202 23:16:14.462326 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4ba384c-5066-4a2d-a1d6-dbb7090b32c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4ba384c-5066-4a2d-a1d6-dbb7090b32c4" (UID: "e4ba384c-5066-4a2d-a1d6-dbb7090b32c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:16:14 crc kubenswrapper[4903]: I1202 23:16:14.468808 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4ba384c-5066-4a2d-a1d6-dbb7090b32c4-config-data" (OuterVolumeSpecName: "config-data") pod "e4ba384c-5066-4a2d-a1d6-dbb7090b32c4" (UID: "e4ba384c-5066-4a2d-a1d6-dbb7090b32c4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:16:14 crc kubenswrapper[4903]: I1202 23:16:14.510356 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ba384c-5066-4a2d-a1d6-dbb7090b32c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:14 crc kubenswrapper[4903]: I1202 23:16:14.510412 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4ba384c-5066-4a2d-a1d6-dbb7090b32c4-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:14 crc kubenswrapper[4903]: I1202 23:16:14.510436 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4h4x\" (UniqueName: \"kubernetes.io/projected/e4ba384c-5066-4a2d-a1d6-dbb7090b32c4-kube-api-access-s4h4x\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:14 crc kubenswrapper[4903]: I1202 23:16:14.600614 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-flhtr"] Dec 02 23:16:14 crc kubenswrapper[4903]: W1202 23:16:14.609037 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5fdf589e_17a2_4b53_b68c_f90e884b0080.slice/crio-d82c740942108715eb9de6acb2fc514b8fb7b59445145895894c094acf733e95 WatchSource:0}: Error finding container d82c740942108715eb9de6acb2fc514b8fb7b59445145895894c094acf733e95: Status 404 returned error can't find the container with id d82c740942108715eb9de6acb2fc514b8fb7b59445145895894c094acf733e95 Dec 02 23:16:14 crc kubenswrapper[4903]: I1202 23:16:14.887835 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-45qgr" Dec 02 23:16:14 crc kubenswrapper[4903]: I1202 23:16:14.887824 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-45qgr" event={"ID":"e4ba384c-5066-4a2d-a1d6-dbb7090b32c4","Type":"ContainerDied","Data":"957cfed7a099f18fcefe52f7969899b1bd44e5d14b6a5428f98e343176b88dbc"} Dec 02 23:16:14 crc kubenswrapper[4903]: I1202 23:16:14.888002 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="957cfed7a099f18fcefe52f7969899b1bd44e5d14b6a5428f98e343176b88dbc" Dec 02 23:16:14 crc kubenswrapper[4903]: I1202 23:16:14.889279 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-flhtr" event={"ID":"5fdf589e-17a2-4b53-b68c-f90e884b0080","Type":"ContainerStarted","Data":"d82c740942108715eb9de6acb2fc514b8fb7b59445145895894c094acf733e95"} Dec 02 23:16:14 crc kubenswrapper[4903]: I1202 23:16:14.890464 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-8sbng" event={"ID":"25b38561-c9d6-4223-85cd-c4516718cc5f","Type":"ContainerStarted","Data":"db1bfd3257b5f1e05d3b208cdffd9b7e1e712fd70ba4924a81d8649ed5c89b20"} Dec 02 23:16:14 crc kubenswrapper[4903]: I1202 23:16:14.910192 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-sync-8sbng" podStartSLOduration=2.319285603 podStartE2EDuration="14.910172393s" podCreationTimestamp="2025-12-02 23:16:00 +0000 UTC" firstStartedPulling="2025-12-02 23:16:01.497467207 +0000 UTC m=+1100.206021490" lastFinishedPulling="2025-12-02 23:16:14.088353967 +0000 UTC m=+1112.796908280" observedRunningTime="2025-12-02 23:16:14.903728883 +0000 UTC m=+1113.612283176" watchObservedRunningTime="2025-12-02 23:16:14.910172393 +0000 UTC m=+1113.618726686" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 
23:16:15.491718 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-569d967d67-xpnn8"] Dec 02 23:16:15 crc kubenswrapper[4903]: E1202 23:16:15.492214 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4ba384c-5066-4a2d-a1d6-dbb7090b32c4" containerName="keystone-db-sync" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.492228 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4ba384c-5066-4a2d-a1d6-dbb7090b32c4" containerName="keystone-db-sync" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.492382 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4ba384c-5066-4a2d-a1d6-dbb7090b32c4" containerName="keystone-db-sync" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.493404 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-569d967d67-xpnn8" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.516427 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-569d967d67-xpnn8"] Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.555578 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-t42nb"] Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.557045 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-t42nb" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.558812 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bzv6f" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.560135 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.560308 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.560498 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.560559 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.589425 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-t42nb"] Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.641938 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22c0b774-da33-4e2c-b263-78439094167c-ovsdbserver-nb\") pod \"dnsmasq-dns-569d967d67-xpnn8\" (UID: \"22c0b774-da33-4e2c-b263-78439094167c\") " pod="openstack/dnsmasq-dns-569d967d67-xpnn8" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.641991 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ab27c8a-19cb-461d-8055-71c04f10e553-combined-ca-bundle\") pod \"keystone-bootstrap-t42nb\" (UID: \"3ab27c8a-19cb-461d-8055-71c04f10e553\") " pod="openstack/keystone-bootstrap-t42nb" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.642010 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22c0b774-da33-4e2c-b263-78439094167c-ovsdbserver-sb\") pod \"dnsmasq-dns-569d967d67-xpnn8\" (UID: \"22c0b774-da33-4e2c-b263-78439094167c\") " 
pod="openstack/dnsmasq-dns-569d967d67-xpnn8" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.642048 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c0b774-da33-4e2c-b263-78439094167c-config\") pod \"dnsmasq-dns-569d967d67-xpnn8\" (UID: \"22c0b774-da33-4e2c-b263-78439094167c\") " pod="openstack/dnsmasq-dns-569d967d67-xpnn8" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.642074 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3ab27c8a-19cb-461d-8055-71c04f10e553-fernet-keys\") pod \"keystone-bootstrap-t42nb\" (UID: \"3ab27c8a-19cb-461d-8055-71c04f10e553\") " pod="openstack/keystone-bootstrap-t42nb" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.642102 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ab27c8a-19cb-461d-8055-71c04f10e553-config-data\") pod \"keystone-bootstrap-t42nb\" (UID: \"3ab27c8a-19cb-461d-8055-71c04f10e553\") " pod="openstack/keystone-bootstrap-t42nb" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.642124 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrcds\" (UniqueName: \"kubernetes.io/projected/3ab27c8a-19cb-461d-8055-71c04f10e553-kube-api-access-jrcds\") pod \"keystone-bootstrap-t42nb\" (UID: \"3ab27c8a-19cb-461d-8055-71c04f10e553\") " pod="openstack/keystone-bootstrap-t42nb" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.642140 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3ab27c8a-19cb-461d-8055-71c04f10e553-credential-keys\") pod \"keystone-bootstrap-t42nb\" (UID: \"3ab27c8a-19cb-461d-8055-71c04f10e553\") " pod="openstack/keystone-bootstrap-t42nb" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.642176 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22c0b774-da33-4e2c-b263-78439094167c-dns-swift-storage-0\") pod \"dnsmasq-dns-569d967d67-xpnn8\" (UID: \"22c0b774-da33-4e2c-b263-78439094167c\") " pod="openstack/dnsmasq-dns-569d967d67-xpnn8" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.642227 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22c0b774-da33-4e2c-b263-78439094167c-dns-svc\") pod \"dnsmasq-dns-569d967d67-xpnn8\" (UID: \"22c0b774-da33-4e2c-b263-78439094167c\") " pod="openstack/dnsmasq-dns-569d967d67-xpnn8" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.642253 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ab27c8a-19cb-461d-8055-71c04f10e553-scripts\") pod \"keystone-bootstrap-t42nb\" (UID: \"3ab27c8a-19cb-461d-8055-71c04f10e553\") " pod="openstack/keystone-bootstrap-t42nb" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.642284 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgrq5\" (UniqueName: \"kubernetes.io/projected/22c0b774-da33-4e2c-b263-78439094167c-kube-api-access-cgrq5\") pod \"dnsmasq-dns-569d967d67-xpnn8\" 
(UID: \"22c0b774-da33-4e2c-b263-78439094167c\") " pod="openstack/dnsmasq-dns-569d967d67-xpnn8" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.654484 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6844b8f8f-qb7dk"] Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.656096 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6844b8f8f-qb7dk" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.658315 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.658527 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-pmckg" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.658660 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.658879 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.678639 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6844b8f8f-qb7dk"] Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.744631 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ab27c8a-19cb-461d-8055-71c04f10e553-scripts\") pod \"keystone-bootstrap-t42nb\" (UID: \"3ab27c8a-19cb-461d-8055-71c04f10e553\") " pod="openstack/keystone-bootstrap-t42nb" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.744727 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgrq5\" (UniqueName: \"kubernetes.io/projected/22c0b774-da33-4e2c-b263-78439094167c-kube-api-access-cgrq5\") pod \"dnsmasq-dns-569d967d67-xpnn8\" (UID: \"22c0b774-da33-4e2c-b263-78439094167c\") " pod="openstack/dnsmasq-dns-569d967d67-xpnn8" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.744758 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b67e826-a529-4e61-af56-58f80fdc251c-logs\") pod \"horizon-6844b8f8f-qb7dk\" (UID: \"1b67e826-a529-4e61-af56-58f80fdc251c\") " pod="openstack/horizon-6844b8f8f-qb7dk" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.744789 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22c0b774-da33-4e2c-b263-78439094167c-ovsdbserver-nb\") pod \"dnsmasq-dns-569d967d67-xpnn8\" (UID: \"22c0b774-da33-4e2c-b263-78439094167c\") " pod="openstack/dnsmasq-dns-569d967d67-xpnn8" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.744815 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ab27c8a-19cb-461d-8055-71c04f10e553-combined-ca-bundle\") pod \"keystone-bootstrap-t42nb\" (UID: \"3ab27c8a-19cb-461d-8055-71c04f10e553\") " pod="openstack/keystone-bootstrap-t42nb" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.744831 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1b67e826-a529-4e61-af56-58f80fdc251c-config-data\") pod \"horizon-6844b8f8f-qb7dk\" (UID: \"1b67e826-a529-4e61-af56-58f80fdc251c\") " 
pod="openstack/horizon-6844b8f8f-qb7dk" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.744849 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22c0b774-da33-4e2c-b263-78439094167c-ovsdbserver-sb\") pod \"dnsmasq-dns-569d967d67-xpnn8\" (UID: \"22c0b774-da33-4e2c-b263-78439094167c\") " pod="openstack/dnsmasq-dns-569d967d67-xpnn8" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.744885 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c0b774-da33-4e2c-b263-78439094167c-config\") pod \"dnsmasq-dns-569d967d67-xpnn8\" (UID: \"22c0b774-da33-4e2c-b263-78439094167c\") " pod="openstack/dnsmasq-dns-569d967d67-xpnn8" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.744910 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3ab27c8a-19cb-461d-8055-71c04f10e553-fernet-keys\") pod \"keystone-bootstrap-t42nb\" (UID: \"3ab27c8a-19cb-461d-8055-71c04f10e553\") " pod="openstack/keystone-bootstrap-t42nb" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.744935 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1b67e826-a529-4e61-af56-58f80fdc251c-scripts\") pod \"horizon-6844b8f8f-qb7dk\" (UID: \"1b67e826-a529-4e61-af56-58f80fdc251c\") " pod="openstack/horizon-6844b8f8f-qb7dk" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.744959 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ab27c8a-19cb-461d-8055-71c04f10e553-config-data\") pod \"keystone-bootstrap-t42nb\" (UID: \"3ab27c8a-19cb-461d-8055-71c04f10e553\") " pod="openstack/keystone-bootstrap-t42nb" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.744981 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrcds\" (UniqueName: \"kubernetes.io/projected/3ab27c8a-19cb-461d-8055-71c04f10e553-kube-api-access-jrcds\") pod \"keystone-bootstrap-t42nb\" (UID: \"3ab27c8a-19cb-461d-8055-71c04f10e553\") " pod="openstack/keystone-bootstrap-t42nb" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.745002 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3ab27c8a-19cb-461d-8055-71c04f10e553-credential-keys\") pod \"keystone-bootstrap-t42nb\" (UID: \"3ab27c8a-19cb-461d-8055-71c04f10e553\") " pod="openstack/keystone-bootstrap-t42nb" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.745029 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22c0b774-da33-4e2c-b263-78439094167c-dns-swift-storage-0\") pod \"dnsmasq-dns-569d967d67-xpnn8\" (UID: \"22c0b774-da33-4e2c-b263-78439094167c\") " pod="openstack/dnsmasq-dns-569d967d67-xpnn8" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.745062 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1b67e826-a529-4e61-af56-58f80fdc251c-horizon-secret-key\") pod \"horizon-6844b8f8f-qb7dk\" (UID: \"1b67e826-a529-4e61-af56-58f80fdc251c\") " pod="openstack/horizon-6844b8f8f-qb7dk" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 
23:16:15.745080 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bggqm\" (UniqueName: \"kubernetes.io/projected/1b67e826-a529-4e61-af56-58f80fdc251c-kube-api-access-bggqm\") pod \"horizon-6844b8f8f-qb7dk\" (UID: \"1b67e826-a529-4e61-af56-58f80fdc251c\") " pod="openstack/horizon-6844b8f8f-qb7dk" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.745102 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22c0b774-da33-4e2c-b263-78439094167c-dns-svc\") pod \"dnsmasq-dns-569d967d67-xpnn8\" (UID: \"22c0b774-da33-4e2c-b263-78439094167c\") " pod="openstack/dnsmasq-dns-569d967d67-xpnn8" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.746075 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22c0b774-da33-4e2c-b263-78439094167c-dns-svc\") pod \"dnsmasq-dns-569d967d67-xpnn8\" (UID: \"22c0b774-da33-4e2c-b263-78439094167c\") " pod="openstack/dnsmasq-dns-569d967d67-xpnn8" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.750723 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c0b774-da33-4e2c-b263-78439094167c-config\") pod \"dnsmasq-dns-569d967d67-xpnn8\" (UID: \"22c0b774-da33-4e2c-b263-78439094167c\") " pod="openstack/dnsmasq-dns-569d967d67-xpnn8" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.752367 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22c0b774-da33-4e2c-b263-78439094167c-ovsdbserver-nb\") pod \"dnsmasq-dns-569d967d67-xpnn8\" (UID: \"22c0b774-da33-4e2c-b263-78439094167c\") " pod="openstack/dnsmasq-dns-569d967d67-xpnn8" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.755725 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22c0b774-da33-4e2c-b263-78439094167c-ovsdbserver-sb\") pod \"dnsmasq-dns-569d967d67-xpnn8\" (UID: \"22c0b774-da33-4e2c-b263-78439094167c\") " pod="openstack/dnsmasq-dns-569d967d67-xpnn8" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.755788 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.756792 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22c0b774-da33-4e2c-b263-78439094167c-dns-swift-storage-0\") pod \"dnsmasq-dns-569d967d67-xpnn8\" (UID: \"22c0b774-da33-4e2c-b263-78439094167c\") " pod="openstack/dnsmasq-dns-569d967d67-xpnn8" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.761946 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ab27c8a-19cb-461d-8055-71c04f10e553-scripts\") pod \"keystone-bootstrap-t42nb\" (UID: \"3ab27c8a-19cb-461d-8055-71c04f10e553\") " pod="openstack/keystone-bootstrap-t42nb" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.765783 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.777319 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.777531 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.777618 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ab27c8a-19cb-461d-8055-71c04f10e553-config-data\") pod \"keystone-bootstrap-t42nb\" (UID: \"3ab27c8a-19cb-461d-8055-71c04f10e553\") " pod="openstack/keystone-bootstrap-t42nb" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.782495 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-w9hq8"] Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.784136 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-w9hq8" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.789495 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.789590 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-46qz6" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.789726 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3ab27c8a-19cb-461d-8055-71c04f10e553-credential-keys\") pod \"keystone-bootstrap-t42nb\" (UID: \"3ab27c8a-19cb-461d-8055-71c04f10e553\") " pod="openstack/keystone-bootstrap-t42nb" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.789942 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.795502 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrcds\" (UniqueName: \"kubernetes.io/projected/3ab27c8a-19cb-461d-8055-71c04f10e553-kube-api-access-jrcds\") pod \"keystone-bootstrap-t42nb\" (UID: \"3ab27c8a-19cb-461d-8055-71c04f10e553\") " pod="openstack/keystone-bootstrap-t42nb" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.804562 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ab27c8a-19cb-461d-8055-71c04f10e553-combined-ca-bundle\") pod \"keystone-bootstrap-t42nb\" (UID: \"3ab27c8a-19cb-461d-8055-71c04f10e553\") " pod="openstack/keystone-bootstrap-t42nb" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.810309 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.811481 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3ab27c8a-19cb-461d-8055-71c04f10e553-fernet-keys\") pod \"keystone-bootstrap-t42nb\" (UID: \"3ab27c8a-19cb-461d-8055-71c04f10e553\") " pod="openstack/keystone-bootstrap-t42nb" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.825726 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-nsw82"] Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.826926 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-nsw82" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.846854 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-pdxtg" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.847167 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.847431 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.850993 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1b67e826-a529-4e61-af56-58f80fdc251c-horizon-secret-key\") pod \"horizon-6844b8f8f-qb7dk\" (UID: \"1b67e826-a529-4e61-af56-58f80fdc251c\") " pod="openstack/horizon-6844b8f8f-qb7dk" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.851040 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bggqm\" (UniqueName: \"kubernetes.io/projected/1b67e826-a529-4e61-af56-58f80fdc251c-kube-api-access-bggqm\") pod \"horizon-6844b8f8f-qb7dk\" (UID: \"1b67e826-a529-4e61-af56-58f80fdc251c\") " pod="openstack/horizon-6844b8f8f-qb7dk" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.852182 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b67e826-a529-4e61-af56-58f80fdc251c-logs\") pod \"horizon-6844b8f8f-qb7dk\" (UID: \"1b67e826-a529-4e61-af56-58f80fdc251c\") " pod="openstack/horizon-6844b8f8f-qb7dk" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.852263 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1b67e826-a529-4e61-af56-58f80fdc251c-config-data\") pod \"horizon-6844b8f8f-qb7dk\" (UID: \"1b67e826-a529-4e61-af56-58f80fdc251c\") " pod="openstack/horizon-6844b8f8f-qb7dk" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.852400 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1b67e826-a529-4e61-af56-58f80fdc251c-scripts\") pod \"horizon-6844b8f8f-qb7dk\" (UID: \"1b67e826-a529-4e61-af56-58f80fdc251c\") " pod="openstack/horizon-6844b8f8f-qb7dk" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.853103 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1b67e826-a529-4e61-af56-58f80fdc251c-scripts\") pod \"horizon-6844b8f8f-qb7dk\" (UID: \"1b67e826-a529-4e61-af56-58f80fdc251c\") " pod="openstack/horizon-6844b8f8f-qb7dk" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.853922 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-w9hq8"] Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.858241 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgrq5\" (UniqueName: \"kubernetes.io/projected/22c0b774-da33-4e2c-b263-78439094167c-kube-api-access-cgrq5\") pod \"dnsmasq-dns-569d967d67-xpnn8\" (UID: \"22c0b774-da33-4e2c-b263-78439094167c\") " pod="openstack/dnsmasq-dns-569d967d67-xpnn8" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.858919 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/1b67e826-a529-4e61-af56-58f80fdc251c-logs\") pod \"horizon-6844b8f8f-qb7dk\" (UID: \"1b67e826-a529-4e61-af56-58f80fdc251c\") " pod="openstack/horizon-6844b8f8f-qb7dk" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.868349 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1b67e826-a529-4e61-af56-58f80fdc251c-config-data\") pod \"horizon-6844b8f8f-qb7dk\" (UID: \"1b67e826-a529-4e61-af56-58f80fdc251c\") " pod="openstack/horizon-6844b8f8f-qb7dk" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.877424 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1b67e826-a529-4e61-af56-58f80fdc251c-horizon-secret-key\") pod \"horizon-6844b8f8f-qb7dk\" (UID: \"1b67e826-a529-4e61-af56-58f80fdc251c\") " pod="openstack/horizon-6844b8f8f-qb7dk" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.896660 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bggqm\" (UniqueName: \"kubernetes.io/projected/1b67e826-a529-4e61-af56-58f80fdc251c-kube-api-access-bggqm\") pod \"horizon-6844b8f8f-qb7dk\" (UID: \"1b67e826-a529-4e61-af56-58f80fdc251c\") " pod="openstack/horizon-6844b8f8f-qb7dk" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.898163 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-nsw82"] Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.907245 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-t42nb" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.956738 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecdb8e0b-8b04-4dc5-b532-8e68e8206122-config-data\") pod \"cinder-db-sync-nsw82\" (UID: \"ecdb8e0b-8b04-4dc5-b532-8e68e8206122\") " pod="openstack/cinder-db-sync-nsw82" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.957033 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecdb8e0b-8b04-4dc5-b532-8e68e8206122-combined-ca-bundle\") pod \"cinder-db-sync-nsw82\" (UID: \"ecdb8e0b-8b04-4dc5-b532-8e68e8206122\") " pod="openstack/cinder-db-sync-nsw82" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.957101 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ecdb8e0b-8b04-4dc5-b532-8e68e8206122-etc-machine-id\") pod \"cinder-db-sync-nsw82\" (UID: \"ecdb8e0b-8b04-4dc5-b532-8e68e8206122\") " pod="openstack/cinder-db-sync-nsw82" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.957160 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecdb8e0b-8b04-4dc5-b532-8e68e8206122-scripts\") pod \"cinder-db-sync-nsw82\" (UID: \"ecdb8e0b-8b04-4dc5-b532-8e68e8206122\") " pod="openstack/cinder-db-sync-nsw82" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.957219 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a6cd769-825e-4700-a66b-87291af7f897-run-httpd\") pod \"ceilometer-0\" (UID: \"1a6cd769-825e-4700-a66b-87291af7f897\") " 
pod="openstack/ceilometer-0" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.957408 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a6cd769-825e-4700-a66b-87291af7f897-config-data\") pod \"ceilometer-0\" (UID: \"1a6cd769-825e-4700-a66b-87291af7f897\") " pod="openstack/ceilometer-0" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.957477 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a6cd769-825e-4700-a66b-87291af7f897-log-httpd\") pod \"ceilometer-0\" (UID: \"1a6cd769-825e-4700-a66b-87291af7f897\") " pod="openstack/ceilometer-0" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.957539 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9nmt\" (UniqueName: \"kubernetes.io/projected/1a6cd769-825e-4700-a66b-87291af7f897-kube-api-access-l9nmt\") pod \"ceilometer-0\" (UID: \"1a6cd769-825e-4700-a66b-87291af7f897\") " pod="openstack/ceilometer-0" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.957631 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a6cd769-825e-4700-a66b-87291af7f897-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1a6cd769-825e-4700-a66b-87291af7f897\") " pod="openstack/ceilometer-0" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.957713 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctbjx\" (UniqueName: \"kubernetes.io/projected/ecdb8e0b-8b04-4dc5-b532-8e68e8206122-kube-api-access-ctbjx\") pod \"cinder-db-sync-nsw82\" (UID: \"ecdb8e0b-8b04-4dc5-b532-8e68e8206122\") " pod="openstack/cinder-db-sync-nsw82" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.957805 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a6cd769-825e-4700-a66b-87291af7f897-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1a6cd769-825e-4700-a66b-87291af7f897\") " pod="openstack/ceilometer-0" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.957867 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a6cd769-825e-4700-a66b-87291af7f897-scripts\") pod \"ceilometer-0\" (UID: \"1a6cd769-825e-4700-a66b-87291af7f897\") " pod="openstack/ceilometer-0" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.957937 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f91dd36-7138-43f1-8091-cca7410520cf-config\") pod \"neutron-db-sync-w9hq8\" (UID: \"9f91dd36-7138-43f1-8091-cca7410520cf\") " pod="openstack/neutron-db-sync-w9hq8" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.958003 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ecdb8e0b-8b04-4dc5-b532-8e68e8206122-db-sync-config-data\") pod \"cinder-db-sync-nsw82\" (UID: \"ecdb8e0b-8b04-4dc5-b532-8e68e8206122\") " pod="openstack/cinder-db-sync-nsw82" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.958086 4903 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msxcn\" (UniqueName: \"kubernetes.io/projected/9f91dd36-7138-43f1-8091-cca7410520cf-kube-api-access-msxcn\") pod \"neutron-db-sync-w9hq8\" (UID: \"9f91dd36-7138-43f1-8091-cca7410520cf\") " pod="openstack/neutron-db-sync-w9hq8" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.958156 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f91dd36-7138-43f1-8091-cca7410520cf-combined-ca-bundle\") pod \"neutron-db-sync-w9hq8\" (UID: \"9f91dd36-7138-43f1-8091-cca7410520cf\") " pod="openstack/neutron-db-sync-w9hq8" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.983773 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-c5c474cdf-n98r7"] Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.985212 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-c5c474cdf-n98r7" Dec 02 23:16:15 crc kubenswrapper[4903]: I1202 23:16:15.985268 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6844b8f8f-qb7dk" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.030870 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-c5c474cdf-n98r7"] Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.055920 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-tv6h2"] Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.060869 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-tv6h2" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.064399 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msxcn\" (UniqueName: \"kubernetes.io/projected/9f91dd36-7138-43f1-8091-cca7410520cf-kube-api-access-msxcn\") pod \"neutron-db-sync-w9hq8\" (UID: \"9f91dd36-7138-43f1-8091-cca7410520cf\") " pod="openstack/neutron-db-sync-w9hq8" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.064461 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f91dd36-7138-43f1-8091-cca7410520cf-combined-ca-bundle\") pod \"neutron-db-sync-w9hq8\" (UID: \"9f91dd36-7138-43f1-8091-cca7410520cf\") " pod="openstack/neutron-db-sync-w9hq8" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.064524 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecdb8e0b-8b04-4dc5-b532-8e68e8206122-config-data\") pod \"cinder-db-sync-nsw82\" (UID: \"ecdb8e0b-8b04-4dc5-b532-8e68e8206122\") " pod="openstack/cinder-db-sync-nsw82" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.064546 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecdb8e0b-8b04-4dc5-b532-8e68e8206122-combined-ca-bundle\") pod \"cinder-db-sync-nsw82\" (UID: \"ecdb8e0b-8b04-4dc5-b532-8e68e8206122\") " pod="openstack/cinder-db-sync-nsw82" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.064564 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ecdb8e0b-8b04-4dc5-b532-8e68e8206122-etc-machine-id\") pod \"cinder-db-sync-nsw82\" (UID: 
\"ecdb8e0b-8b04-4dc5-b532-8e68e8206122\") " pod="openstack/cinder-db-sync-nsw82" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.064591 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecdb8e0b-8b04-4dc5-b532-8e68e8206122-scripts\") pod \"cinder-db-sync-nsw82\" (UID: \"ecdb8e0b-8b04-4dc5-b532-8e68e8206122\") " pod="openstack/cinder-db-sync-nsw82" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.064609 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a6cd769-825e-4700-a66b-87291af7f897-run-httpd\") pod \"ceilometer-0\" (UID: \"1a6cd769-825e-4700-a66b-87291af7f897\") " pod="openstack/ceilometer-0" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.064635 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a6cd769-825e-4700-a66b-87291af7f897-config-data\") pod \"ceilometer-0\" (UID: \"1a6cd769-825e-4700-a66b-87291af7f897\") " pod="openstack/ceilometer-0" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.064670 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a6cd769-825e-4700-a66b-87291af7f897-log-httpd\") pod \"ceilometer-0\" (UID: \"1a6cd769-825e-4700-a66b-87291af7f897\") " pod="openstack/ceilometer-0" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.064701 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9nmt\" (UniqueName: \"kubernetes.io/projected/1a6cd769-825e-4700-a66b-87291af7f897-kube-api-access-l9nmt\") pod \"ceilometer-0\" (UID: \"1a6cd769-825e-4700-a66b-87291af7f897\") " pod="openstack/ceilometer-0" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.064856 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a6cd769-825e-4700-a66b-87291af7f897-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1a6cd769-825e-4700-a66b-87291af7f897\") " pod="openstack/ceilometer-0" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.064872 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.064898 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctbjx\" (UniqueName: \"kubernetes.io/projected/ecdb8e0b-8b04-4dc5-b532-8e68e8206122-kube-api-access-ctbjx\") pod \"cinder-db-sync-nsw82\" (UID: \"ecdb8e0b-8b04-4dc5-b532-8e68e8206122\") " pod="openstack/cinder-db-sync-nsw82" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.064975 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a6cd769-825e-4700-a66b-87291af7f897-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1a6cd769-825e-4700-a66b-87291af7f897\") " pod="openstack/ceilometer-0" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.064996 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a6cd769-825e-4700-a66b-87291af7f897-scripts\") pod \"ceilometer-0\" (UID: \"1a6cd769-825e-4700-a66b-87291af7f897\") " pod="openstack/ceilometer-0" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.065024 4903 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f91dd36-7138-43f1-8091-cca7410520cf-config\") pod \"neutron-db-sync-w9hq8\" (UID: \"9f91dd36-7138-43f1-8091-cca7410520cf\") " pod="openstack/neutron-db-sync-w9hq8" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.065056 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ecdb8e0b-8b04-4dc5-b532-8e68e8206122-db-sync-config-data\") pod \"cinder-db-sync-nsw82\" (UID: \"ecdb8e0b-8b04-4dc5-b532-8e68e8206122\") " pod="openstack/cinder-db-sync-nsw82" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.065097 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-f94dq" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.069962 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a6cd769-825e-4700-a66b-87291af7f897-log-httpd\") pod \"ceilometer-0\" (UID: \"1a6cd769-825e-4700-a66b-87291af7f897\") " pod="openstack/ceilometer-0" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.070456 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ecdb8e0b-8b04-4dc5-b532-8e68e8206122-etc-machine-id\") pod \"cinder-db-sync-nsw82\" (UID: \"ecdb8e0b-8b04-4dc5-b532-8e68e8206122\") " pod="openstack/cinder-db-sync-nsw82" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.070834 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a6cd769-825e-4700-a66b-87291af7f897-run-httpd\") pod \"ceilometer-0\" (UID: \"1a6cd769-825e-4700-a66b-87291af7f897\") " pod="openstack/ceilometer-0" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.076335 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a6cd769-825e-4700-a66b-87291af7f897-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1a6cd769-825e-4700-a66b-87291af7f897\") " pod="openstack/ceilometer-0" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.082254 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ecdb8e0b-8b04-4dc5-b532-8e68e8206122-db-sync-config-data\") pod \"cinder-db-sync-nsw82\" (UID: \"ecdb8e0b-8b04-4dc5-b532-8e68e8206122\") " pod="openstack/cinder-db-sync-nsw82" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.090486 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecdb8e0b-8b04-4dc5-b532-8e68e8206122-config-data\") pod \"cinder-db-sync-nsw82\" (UID: \"ecdb8e0b-8b04-4dc5-b532-8e68e8206122\") " pod="openstack/cinder-db-sync-nsw82" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.098601 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f91dd36-7138-43f1-8091-cca7410520cf-combined-ca-bundle\") pod \"neutron-db-sync-w9hq8\" (UID: \"9f91dd36-7138-43f1-8091-cca7410520cf\") " pod="openstack/neutron-db-sync-w9hq8" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.098816 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-tv6h2"] Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 
23:16:16.107066 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9nmt\" (UniqueName: \"kubernetes.io/projected/1a6cd769-825e-4700-a66b-87291af7f897-kube-api-access-l9nmt\") pod \"ceilometer-0\" (UID: \"1a6cd769-825e-4700-a66b-87291af7f897\") " pod="openstack/ceilometer-0" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.108937 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f91dd36-7138-43f1-8091-cca7410520cf-config\") pod \"neutron-db-sync-w9hq8\" (UID: \"9f91dd36-7138-43f1-8091-cca7410520cf\") " pod="openstack/neutron-db-sync-w9hq8" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.109471 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a6cd769-825e-4700-a66b-87291af7f897-scripts\") pod \"ceilometer-0\" (UID: \"1a6cd769-825e-4700-a66b-87291af7f897\") " pod="openstack/ceilometer-0" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.112629 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a6cd769-825e-4700-a66b-87291af7f897-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1a6cd769-825e-4700-a66b-87291af7f897\") " pod="openstack/ceilometer-0" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.115200 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msxcn\" (UniqueName: \"kubernetes.io/projected/9f91dd36-7138-43f1-8091-cca7410520cf-kube-api-access-msxcn\") pod \"neutron-db-sync-w9hq8\" (UID: \"9f91dd36-7138-43f1-8091-cca7410520cf\") " pod="openstack/neutron-db-sync-w9hq8" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.117692 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecdb8e0b-8b04-4dc5-b532-8e68e8206122-combined-ca-bundle\") pod \"cinder-db-sync-nsw82\" (UID: \"ecdb8e0b-8b04-4dc5-b532-8e68e8206122\") " pod="openstack/cinder-db-sync-nsw82" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.121305 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-569d967d67-xpnn8"] Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.122381 4903 util.go:30] "No sandbox for pod can be found. 
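Need to start a new one" pod="openstack/dnsmasq-dns-569d967d67-xpnn8"

The "operationExecutor.VerifyControllerAttachedVolume started" -> "operationExecutor.MountVolume started" -> "MountVolume.SetUp succeeded" progression running through the entries above is the kubelet volume manager reconciling desired state (the volumes declared by pods assigned to this node, keyed by the quoted UniqueName) against actual state (what is already set up on the node). The following is a minimal sketch of that diff-and-act loop, with invented types and truncated pod UIDs; it illustrates the pattern, not kubelet's actual implementation:

```go
package main

import "fmt"

// volumeKey stands in for the UniqueName strings in the log
// (plugin path + pod UID + volume name). All names are illustrative.
type volumeKey struct{ podUID, volume string }

type reconciler struct {
	desired map[volumeKey]bool // volumes declared by pods assigned to the node
	actual  map[volumeKey]bool // volumes currently set up on the node
}

// reconcile mounts whatever is desired but absent, and tears down
// whatever is present but no longer desired, mirroring the
// MountVolume / UnmountVolume pairs in the log.
func (r *reconciler) reconcile() {
	for k := range r.desired {
		if !r.actual[k] {
			fmt.Printf("MountVolume started for volume %q pod %s\n", k.volume, k.podUID)
			r.actual[k] = true // stands in for MountVolume.SetUp
			fmt.Printf("MountVolume.SetUp succeeded for volume %q pod %s\n", k.volume, k.podUID)
		}
	}
	for k := range r.actual {
		if !r.desired[k] {
			fmt.Printf("UnmountVolume.TearDown for volume %q pod %s\n", k.volume, k.podUID)
			delete(r.actual, k)
		}
	}
}

func main() {
	r := &reconciler{
		desired: map[volumeKey]bool{
			{podUID: "ecdb8e0b", volume: "config-data"}: true,
			{podUID: "ecdb8e0b", volume: "scripts"}:     true,
		},
		// a volume left over from the deleted dnsmasq-dns pod
		actual: map[volumeKey]bool{
			{podUID: "22c0b774", volume: "dns-svc"}: true,
		},
	}
	r.reconcile()
}
```

Because each pass is idempotent, the same loop also drives the UnmountVolume / "Volume detached" cleanup that appears further down once the old dnsmasq-dns pod is deleted.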
Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.122419 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecdb8e0b-8b04-4dc5-b532-8e68e8206122-scripts\") pod \"cinder-db-sync-nsw82\" (UID: \"ecdb8e0b-8b04-4dc5-b532-8e68e8206122\") " pod="openstack/cinder-db-sync-nsw82" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.123838 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctbjx\" (UniqueName: \"kubernetes.io/projected/ecdb8e0b-8b04-4dc5-b532-8e68e8206122-kube-api-access-ctbjx\") pod \"cinder-db-sync-nsw82\" (UID: \"ecdb8e0b-8b04-4dc5-b532-8e68e8206122\") " pod="openstack/cinder-db-sync-nsw82" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.132736 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a6cd769-825e-4700-a66b-87291af7f897-config-data\") pod \"ceilometer-0\" (UID: \"1a6cd769-825e-4700-a66b-87291af7f897\") " pod="openstack/ceilometer-0" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.163845 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6db6798dff-kcjng"] Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.165982 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6db6798dff-kcjng" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.170594 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18dbf052-03f5-4a2c-a8a7-86740787c1dc-logs\") pod \"horizon-c5c474cdf-n98r7\" (UID: \"18dbf052-03f5-4a2c-a8a7-86740787c1dc\") " pod="openstack/horizon-c5c474cdf-n98r7" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.170809 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chsbq\" (UniqueName: \"kubernetes.io/projected/18dbf052-03f5-4a2c-a8a7-86740787c1dc-kube-api-access-chsbq\") pod \"horizon-c5c474cdf-n98r7\" (UID: \"18dbf052-03f5-4a2c-a8a7-86740787c1dc\") " pod="openstack/horizon-c5c474cdf-n98r7" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.170913 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98900f75-26e7-46cb-a70e-537fa0486fe8-combined-ca-bundle\") pod \"barbican-db-sync-tv6h2\" (UID: \"98900f75-26e7-46cb-a70e-537fa0486fe8\") " pod="openstack/barbican-db-sync-tv6h2" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.171027 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzbg4\" (UniqueName: \"kubernetes.io/projected/98900f75-26e7-46cb-a70e-537fa0486fe8-kube-api-access-pzbg4\") pod \"barbican-db-sync-tv6h2\" (UID: \"98900f75-26e7-46cb-a70e-537fa0486fe8\") " pod="openstack/barbican-db-sync-tv6h2" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.191195 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/18dbf052-03f5-4a2c-a8a7-86740787c1dc-horizon-secret-key\") pod \"horizon-c5c474cdf-n98r7\" (UID: \"18dbf052-03f5-4a2c-a8a7-86740787c1dc\") " pod="openstack/horizon-c5c474cdf-n98r7" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.191409 4903
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18dbf052-03f5-4a2c-a8a7-86740787c1dc-scripts\") pod \"horizon-c5c474cdf-n98r7\" (UID: \"18dbf052-03f5-4a2c-a8a7-86740787c1dc\") " pod="openstack/horizon-c5c474cdf-n98r7" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.191530 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18dbf052-03f5-4a2c-a8a7-86740787c1dc-config-data\") pod \"horizon-c5c474cdf-n98r7\" (UID: \"18dbf052-03f5-4a2c-a8a7-86740787c1dc\") " pod="openstack/horizon-c5c474cdf-n98r7" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.191677 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/98900f75-26e7-46cb-a70e-537fa0486fe8-db-sync-config-data\") pod \"barbican-db-sync-tv6h2\" (UID: \"98900f75-26e7-46cb-a70e-537fa0486fe8\") " pod="openstack/barbican-db-sync-tv6h2" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.237873 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-bbljr"] Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.238978 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-bbljr" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.253871 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6db6798dff-kcjng"] Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.282846 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.284052 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-9m6gd" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.287822 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.289762 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-bbljr"] Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.299308 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/18dbf052-03f5-4a2c-a8a7-86740787c1dc-horizon-secret-key\") pod \"horizon-c5c474cdf-n98r7\" (UID: \"18dbf052-03f5-4a2c-a8a7-86740787c1dc\") " pod="openstack/horizon-c5c474cdf-n98r7" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.299364 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba-config\") pod \"dnsmasq-dns-6db6798dff-kcjng\" (UID: \"2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba\") " pod="openstack/dnsmasq-dns-6db6798dff-kcjng" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.299381 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba-ovsdbserver-nb\") pod \"dnsmasq-dns-6db6798dff-kcjng\" (UID: \"2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba\") " pod="openstack/dnsmasq-dns-6db6798dff-kcjng" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.299409 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18dbf052-03f5-4a2c-a8a7-86740787c1dc-scripts\") pod \"horizon-c5c474cdf-n98r7\" (UID: \"18dbf052-03f5-4a2c-a8a7-86740787c1dc\") " pod="openstack/horizon-c5c474cdf-n98r7" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.299438 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18dbf052-03f5-4a2c-a8a7-86740787c1dc-config-data\") pod \"horizon-c5c474cdf-n98r7\" (UID: \"18dbf052-03f5-4a2c-a8a7-86740787c1dc\") " pod="openstack/horizon-c5c474cdf-n98r7" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.299469 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/98900f75-26e7-46cb-a70e-537fa0486fe8-db-sync-config-data\") pod \"barbican-db-sync-tv6h2\" (UID: \"98900f75-26e7-46cb-a70e-537fa0486fe8\") " pod="openstack/barbican-db-sync-tv6h2" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.299507 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba-ovsdbserver-sb\") pod \"dnsmasq-dns-6db6798dff-kcjng\" (UID: \"2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba\") " pod="openstack/dnsmasq-dns-6db6798dff-kcjng" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.299577 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l5hf\" (UniqueName: \"kubernetes.io/projected/2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba-kube-api-access-4l5hf\") pod \"dnsmasq-dns-6db6798dff-kcjng\" (UID: \"2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba\") " pod="openstack/dnsmasq-dns-6db6798dff-kcjng" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.299620 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba-dns-svc\") pod \"dnsmasq-dns-6db6798dff-kcjng\" (UID: \"2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba\") " pod="openstack/dnsmasq-dns-6db6798dff-kcjng" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.299645 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18dbf052-03f5-4a2c-a8a7-86740787c1dc-logs\") pod \"horizon-c5c474cdf-n98r7\" (UID: \"18dbf052-03f5-4a2c-a8a7-86740787c1dc\") " pod="openstack/horizon-c5c474cdf-n98r7" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.299700 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chsbq\" (UniqueName: \"kubernetes.io/projected/18dbf052-03f5-4a2c-a8a7-86740787c1dc-kube-api-access-chsbq\") pod \"horizon-c5c474cdf-n98r7\" (UID: \"18dbf052-03f5-4a2c-a8a7-86740787c1dc\") " pod="openstack/horizon-c5c474cdf-n98r7" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.299726 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98900f75-26e7-46cb-a70e-537fa0486fe8-combined-ca-bundle\") pod \"barbican-db-sync-tv6h2\" (UID: \"98900f75-26e7-46cb-a70e-537fa0486fe8\") " pod="openstack/barbican-db-sync-tv6h2" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.299753 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzbg4\" (UniqueName: \"kubernetes.io/projected/98900f75-26e7-46cb-a70e-537fa0486fe8-kube-api-access-pzbg4\") pod \"barbican-db-sync-tv6h2\" (UID: \"98900f75-26e7-46cb-a70e-537fa0486fe8\") " pod="openstack/barbican-db-sync-tv6h2" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.299790 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba-dns-swift-storage-0\") pod \"dnsmasq-dns-6db6798dff-kcjng\" (UID: \"2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba\") " pod="openstack/dnsmasq-dns-6db6798dff-kcjng" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.301607 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18dbf052-03f5-4a2c-a8a7-86740787c1dc-scripts\") pod \"horizon-c5c474cdf-n98r7\" (UID: \"18dbf052-03f5-4a2c-a8a7-86740787c1dc\") " pod="openstack/horizon-c5c474cdf-n98r7" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.302765 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18dbf052-03f5-4a2c-a8a7-86740787c1dc-logs\") pod \"horizon-c5c474cdf-n98r7\" (UID: \"18dbf052-03f5-4a2c-a8a7-86740787c1dc\") " pod="openstack/horizon-c5c474cdf-n98r7" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.302915 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18dbf052-03f5-4a2c-a8a7-86740787c1dc-config-data\") pod \"horizon-c5c474cdf-n98r7\" (UID: \"18dbf052-03f5-4a2c-a8a7-86740787c1dc\") " pod="openstack/horizon-c5c474cdf-n98r7" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.304908 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.308567 4903 util.go:30] "No sandbox for pod can be found. 
Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.309708 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98900f75-26e7-46cb-a70e-537fa0486fe8-combined-ca-bundle\") pod \"barbican-db-sync-tv6h2\" (UID: \"98900f75-26e7-46cb-a70e-537fa0486fe8\") " pod="openstack/barbican-db-sync-tv6h2" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.316933 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/18dbf052-03f5-4a2c-a8a7-86740787c1dc-horizon-secret-key\") pod \"horizon-c5c474cdf-n98r7\" (UID: \"18dbf052-03f5-4a2c-a8a7-86740787c1dc\") " pod="openstack/horizon-c5c474cdf-n98r7" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.325208 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/98900f75-26e7-46cb-a70e-537fa0486fe8-db-sync-config-data\") pod \"barbican-db-sync-tv6h2\" (UID: \"98900f75-26e7-46cb-a70e-537fa0486fe8\") " pod="openstack/barbican-db-sync-tv6h2" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.330010 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chsbq\" (UniqueName: \"kubernetes.io/projected/18dbf052-03f5-4a2c-a8a7-86740787c1dc-kube-api-access-chsbq\") pod \"horizon-c5c474cdf-n98r7\" (UID: \"18dbf052-03f5-4a2c-a8a7-86740787c1dc\") " pod="openstack/horizon-c5c474cdf-n98r7" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.330936 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-nsw82" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.338571 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzbg4\" (UniqueName: \"kubernetes.io/projected/98900f75-26e7-46cb-a70e-537fa0486fe8-kube-api-access-pzbg4\") pod \"barbican-db-sync-tv6h2\" (UID: \"98900f75-26e7-46cb-a70e-537fa0486fe8\") " pod="openstack/barbican-db-sync-tv6h2" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.397136 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-c5c474cdf-n98r7" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.399772 4903 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/barbican-db-sync-tv6h2" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.400908 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92c0a321-b591-49c3-a9b4-bc6b8bf30820-logs\") pod \"placement-db-sync-bbljr\" (UID: \"92c0a321-b591-49c3-a9b4-bc6b8bf30820\") " pod="openstack/placement-db-sync-bbljr" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.400968 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l5hf\" (UniqueName: \"kubernetes.io/projected/2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba-kube-api-access-4l5hf\") pod \"dnsmasq-dns-6db6798dff-kcjng\" (UID: \"2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba\") " pod="openstack/dnsmasq-dns-6db6798dff-kcjng" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.400999 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba-dns-svc\") pod \"dnsmasq-dns-6db6798dff-kcjng\" (UID: \"2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba\") " pod="openstack/dnsmasq-dns-6db6798dff-kcjng" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.401039 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92c0a321-b591-49c3-a9b4-bc6b8bf30820-scripts\") pod \"placement-db-sync-bbljr\" (UID: \"92c0a321-b591-49c3-a9b4-bc6b8bf30820\") " pod="openstack/placement-db-sync-bbljr" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.401062 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92c0a321-b591-49c3-a9b4-bc6b8bf30820-combined-ca-bundle\") pod \"placement-db-sync-bbljr\" (UID: \"92c0a321-b591-49c3-a9b4-bc6b8bf30820\") " pod="openstack/placement-db-sync-bbljr" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.401126 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjj55\" (UniqueName: \"kubernetes.io/projected/92c0a321-b591-49c3-a9b4-bc6b8bf30820-kube-api-access-cjj55\") pod \"placement-db-sync-bbljr\" (UID: \"92c0a321-b591-49c3-a9b4-bc6b8bf30820\") " pod="openstack/placement-db-sync-bbljr" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.401174 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba-dns-swift-storage-0\") pod \"dnsmasq-dns-6db6798dff-kcjng\" (UID: \"2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba\") " pod="openstack/dnsmasq-dns-6db6798dff-kcjng" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.401238 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba-config\") pod \"dnsmasq-dns-6db6798dff-kcjng\" (UID: \"2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba\") " pod="openstack/dnsmasq-dns-6db6798dff-kcjng" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.401254 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba-ovsdbserver-nb\") pod \"dnsmasq-dns-6db6798dff-kcjng\" (UID: \"2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba\") " 
pod="openstack/dnsmasq-dns-6db6798dff-kcjng" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.401356 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba-ovsdbserver-sb\") pod \"dnsmasq-dns-6db6798dff-kcjng\" (UID: \"2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba\") " pod="openstack/dnsmasq-dns-6db6798dff-kcjng" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.401391 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92c0a321-b591-49c3-a9b4-bc6b8bf30820-config-data\") pod \"placement-db-sync-bbljr\" (UID: \"92c0a321-b591-49c3-a9b4-bc6b8bf30820\") " pod="openstack/placement-db-sync-bbljr" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.403112 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba-dns-svc\") pod \"dnsmasq-dns-6db6798dff-kcjng\" (UID: \"2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba\") " pod="openstack/dnsmasq-dns-6db6798dff-kcjng" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.403889 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba-ovsdbserver-sb\") pod \"dnsmasq-dns-6db6798dff-kcjng\" (UID: \"2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba\") " pod="openstack/dnsmasq-dns-6db6798dff-kcjng" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.405553 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba-config\") pod \"dnsmasq-dns-6db6798dff-kcjng\" (UID: \"2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba\") " pod="openstack/dnsmasq-dns-6db6798dff-kcjng" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.406025 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba-dns-swift-storage-0\") pod \"dnsmasq-dns-6db6798dff-kcjng\" (UID: \"2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba\") " pod="openstack/dnsmasq-dns-6db6798dff-kcjng" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.410210 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba-ovsdbserver-nb\") pod \"dnsmasq-dns-6db6798dff-kcjng\" (UID: \"2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba\") " pod="openstack/dnsmasq-dns-6db6798dff-kcjng" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.425447 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l5hf\" (UniqueName: \"kubernetes.io/projected/2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba-kube-api-access-4l5hf\") pod \"dnsmasq-dns-6db6798dff-kcjng\" (UID: \"2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba\") " pod="openstack/dnsmasq-dns-6db6798dff-kcjng" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.503246 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92c0a321-b591-49c3-a9b4-bc6b8bf30820-config-data\") pod \"placement-db-sync-bbljr\" (UID: \"92c0a321-b591-49c3-a9b4-bc6b8bf30820\") " pod="openstack/placement-db-sync-bbljr" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.503297 4903 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92c0a321-b591-49c3-a9b4-bc6b8bf30820-logs\") pod \"placement-db-sync-bbljr\" (UID: \"92c0a321-b591-49c3-a9b4-bc6b8bf30820\") " pod="openstack/placement-db-sync-bbljr" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.503340 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92c0a321-b591-49c3-a9b4-bc6b8bf30820-scripts\") pod \"placement-db-sync-bbljr\" (UID: \"92c0a321-b591-49c3-a9b4-bc6b8bf30820\") " pod="openstack/placement-db-sync-bbljr" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.503825 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92c0a321-b591-49c3-a9b4-bc6b8bf30820-combined-ca-bundle\") pod \"placement-db-sync-bbljr\" (UID: \"92c0a321-b591-49c3-a9b4-bc6b8bf30820\") " pod="openstack/placement-db-sync-bbljr" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.503876 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjj55\" (UniqueName: \"kubernetes.io/projected/92c0a321-b591-49c3-a9b4-bc6b8bf30820-kube-api-access-cjj55\") pod \"placement-db-sync-bbljr\" (UID: \"92c0a321-b591-49c3-a9b4-bc6b8bf30820\") " pod="openstack/placement-db-sync-bbljr" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.506735 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92c0a321-b591-49c3-a9b4-bc6b8bf30820-logs\") pod \"placement-db-sync-bbljr\" (UID: \"92c0a321-b591-49c3-a9b4-bc6b8bf30820\") " pod="openstack/placement-db-sync-bbljr" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.509154 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92c0a321-b591-49c3-a9b4-bc6b8bf30820-config-data\") pod \"placement-db-sync-bbljr\" (UID: \"92c0a321-b591-49c3-a9b4-bc6b8bf30820\") " pod="openstack/placement-db-sync-bbljr" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.509726 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92c0a321-b591-49c3-a9b4-bc6b8bf30820-scripts\") pod \"placement-db-sync-bbljr\" (UID: \"92c0a321-b591-49c3-a9b4-bc6b8bf30820\") " pod="openstack/placement-db-sync-bbljr" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.513301 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92c0a321-b591-49c3-a9b4-bc6b8bf30820-combined-ca-bundle\") pod \"placement-db-sync-bbljr\" (UID: \"92c0a321-b591-49c3-a9b4-bc6b8bf30820\") " pod="openstack/placement-db-sync-bbljr" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.528320 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjj55\" (UniqueName: \"kubernetes.io/projected/92c0a321-b591-49c3-a9b4-bc6b8bf30820-kube-api-access-cjj55\") pod \"placement-db-sync-bbljr\" (UID: \"92c0a321-b591-49c3-a9b4-bc6b8bf30820\") " pod="openstack/placement-db-sync-bbljr" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.670966 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-569d967d67-xpnn8"] Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.713761 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/keystone-bootstrap-t42nb"] Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.714258 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6db6798dff-kcjng" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.728364 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-bbljr" Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.755174 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6844b8f8f-qb7dk"] Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.994794 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-569d967d67-xpnn8" event={"ID":"22c0b774-da33-4e2c-b263-78439094167c","Type":"ContainerStarted","Data":"cd465cdbd0418d01c74edd454b9c84061ae09dcef1dcf63b16b0c9b72fb5598a"} Dec 02 23:16:16 crc kubenswrapper[4903]: I1202 23:16:16.996516 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6844b8f8f-qb7dk" event={"ID":"1b67e826-a529-4e61-af56-58f80fdc251c","Type":"ContainerStarted","Data":"786d7714453437e5026249abd6262e19b9507f861b797465d68e5a38f91f7024"} Dec 02 23:16:17 crc kubenswrapper[4903]: I1202 23:16:16.999053 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:16:17 crc kubenswrapper[4903]: I1202 23:16:17.000495 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-t42nb" event={"ID":"3ab27c8a-19cb-461d-8055-71c04f10e553","Type":"ContainerStarted","Data":"89810c7f6f4530772e21850699dfa285bb6ca47563fe755b568baf1cec7cff97"} Dec 02 23:16:17 crc kubenswrapper[4903]: I1202 23:16:17.139675 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-w9hq8"] Dec 02 23:16:17 crc kubenswrapper[4903]: W1202 23:16:17.140855 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f91dd36_7138_43f1_8091_cca7410520cf.slice/crio-70f0d29946f6c4bdc2122a20e5eadb3cf62e01b3534c78f5e16511310f73d4d5 WatchSource:0}: Error finding container 70f0d29946f6c4bdc2122a20e5eadb3cf62e01b3534c78f5e16511310f73d4d5: Status 404 returned error can't find the container with id 70f0d29946f6c4bdc2122a20e5eadb3cf62e01b3534c78f5e16511310f73d4d5 Dec 02 23:16:17 crc kubenswrapper[4903]: I1202 23:16:17.160011 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-nsw82"] Dec 02 23:16:17 crc kubenswrapper[4903]: I1202 23:16:17.175485 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-c5c474cdf-n98r7"] Dec 02 23:16:17 crc kubenswrapper[4903]: I1202 23:16:17.185627 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-tv6h2"] Dec 02 23:16:17 crc kubenswrapper[4903]: W1202 23:16:17.204935 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podecdb8e0b_8b04_4dc5_b532_8e68e8206122.slice/crio-a31c12eb0abbe5bc90138c9c3ce6ecd8e3dbcc71f6f7c7ec46d35df8c0ce1c3f WatchSource:0}: Error finding container a31c12eb0abbe5bc90138c9c3ce6ecd8e3dbcc71f6f7c7ec46d35df8c0ce1c3f: Status 404 returned error can't find the container with id a31c12eb0abbe5bc90138c9c3ce6ecd8e3dbcc71f6f7c7ec46d35df8c0ce1c3f Dec 02 23:16:17 crc kubenswrapper[4903]: W1202 23:16:17.212989 4903 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98900f75_26e7_46cb_a70e_537fa0486fe8.slice/crio-513fd57ec16f6d43b48da136d487108de5995d5d00fed3e9babd1888a553ad59 WatchSource:0}: Error finding container 513fd57ec16f6d43b48da136d487108de5995d5d00fed3e9babd1888a553ad59: Status 404 returned error can't find the container with id 513fd57ec16f6d43b48da136d487108de5995d5d00fed3e9babd1888a553ad59 Dec 02 23:16:17 crc kubenswrapper[4903]: I1202 23:16:17.436291 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6db6798dff-kcjng"] Dec 02 23:16:17 crc kubenswrapper[4903]: I1202 23:16:17.446161 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-bbljr"] Dec 02 23:16:17 crc kubenswrapper[4903]: I1202 23:16:17.914767 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6844b8f8f-qb7dk"] Dec 02 23:16:17 crc kubenswrapper[4903]: I1202 23:16:17.945717 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:16:17 crc kubenswrapper[4903]: I1202 23:16:17.962825 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-55dff86b95-mnlkl"] Dec 02 23:16:17 crc kubenswrapper[4903]: I1202 23:16:17.964405 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55dff86b95-mnlkl" Dec 02 23:16:17 crc kubenswrapper[4903]: I1202 23:16:17.977755 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-55dff86b95-mnlkl"] Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.013639 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a6cd769-825e-4700-a66b-87291af7f897","Type":"ContainerStarted","Data":"af95a279857dfebc01610e9c9a62a286492fe617a703f7e026a40e70bee45b01"} Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.019312 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tv6h2" event={"ID":"98900f75-26e7-46cb-a70e-537fa0486fe8","Type":"ContainerStarted","Data":"513fd57ec16f6d43b48da136d487108de5995d5d00fed3e9babd1888a553ad59"} Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.021832 4903 generic.go:334] "Generic (PLEG): container finished" podID="22c0b774-da33-4e2c-b263-78439094167c" containerID="cda2c99dd5391edc36218fdcaea212e68d0e9d66405e093da36ecfafc90246da" exitCode=0 Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.021961 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-569d967d67-xpnn8" event={"ID":"22c0b774-da33-4e2c-b263-78439094167c","Type":"ContainerDied","Data":"cda2c99dd5391edc36218fdcaea212e68d0e9d66405e093da36ecfafc90246da"} Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.072643 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-t42nb" event={"ID":"3ab27c8a-19cb-461d-8055-71c04f10e553","Type":"ContainerStarted","Data":"1e1fe13234c90a5746f503e83c23d95610a319b063ef9da865233ab6b18b3b04"} Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.081458 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c5c474cdf-n98r7" event={"ID":"18dbf052-03f5-4a2c-a8a7-86740787c1dc","Type":"ContainerStarted","Data":"d35771fc60f5cbd68b6e15f21b8427c2dee9c78810c8e67bbee142c6e56f8e3b"} Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.087345 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-bbljr" 
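event={"ID":"92c0a321-b591-49c3-a9b4-bc6b8bf30820","Type":"ContainerStarted","Data":"86cd0e4d6b3695596de0101f2e57bf7d57771941aa457939ca0834bb52223f8a"}

The "SyncLoop (PLEG)" entries come from kubelet's pod lifecycle event generator, which relists container state from the runtime (CRI-O here) and converts each observed transition into a ContainerStarted or ContainerDied event for the sync loop; the "Generic (PLEG): container finished ... exitCode=0" followed by ContainerDied above is one such transition, a container of the old dnsmasq-dns pod exiting cleanly. A toy relist diff under assumed types (the real PLEG also tracks sandboxes, removals, and unknown states):

```go
package main

import "fmt"

type state string

const (
	running state = "running"
	exited  state = "exited"
)

// podLifecycleEvent mirrors the shape of the PLEG events in the log:
// a pod, an event type, and the container ID as data.
type podLifecycleEvent struct {
	pod, eventType, data string
}

// relist compares the previous and current container states for one pod
// and emits ContainerStarted/ContainerDied events, the way PLEG feeds
// kubelet's sync loop. Simplified to two states for illustration.
func relist(pod string, old, cur map[string]state) []podLifecycleEvent {
	var events []podLifecycleEvent
	for id, s := range cur {
		switch {
		case s == running && old[id] != running:
			events = append(events, podLifecycleEvent{pod, "ContainerStarted", id})
		case s == exited && old[id] == running:
			events = append(events, podLifecycleEvent{pod, "ContainerDied", id})
		}
	}
	return events
}

func main() {
	// Truncated container IDs standing in for the hashes in the log.
	old := map[string]state{"d9661086": running}
	cur := map[string]state{"d9661086": exited, "8fd0af82": running}
	for _, e := range relist("openstack/dnsmasq-dns-6db6798dff-kcjng", old, cur) {
		fmt.Printf("SyncLoop (PLEG): event for pod %s: %s %s\n", e.pod, e.eventType, e.data)
	}
}
```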
event={"ID":"92c0a321-b591-49c3-a9b4-bc6b8bf30820","Type":"ContainerStarted","Data":"86cd0e4d6b3695596de0101f2e57bf7d57771941aa457939ca0834bb52223f8a"} Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.094457 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/caa48302-9f85-4c4c-b61a-e2b0847d425a-scripts\") pod \"horizon-55dff86b95-mnlkl\" (UID: \"caa48302-9f85-4c4c-b61a-e2b0847d425a\") " pod="openstack/horizon-55dff86b95-mnlkl" Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.094732 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/caa48302-9f85-4c4c-b61a-e2b0847d425a-horizon-secret-key\") pod \"horizon-55dff86b95-mnlkl\" (UID: \"caa48302-9f85-4c4c-b61a-e2b0847d425a\") " pod="openstack/horizon-55dff86b95-mnlkl" Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.094812 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/caa48302-9f85-4c4c-b61a-e2b0847d425a-logs\") pod \"horizon-55dff86b95-mnlkl\" (UID: \"caa48302-9f85-4c4c-b61a-e2b0847d425a\") " pod="openstack/horizon-55dff86b95-mnlkl" Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.095008 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/caa48302-9f85-4c4c-b61a-e2b0847d425a-config-data\") pod \"horizon-55dff86b95-mnlkl\" (UID: \"caa48302-9f85-4c4c-b61a-e2b0847d425a\") " pod="openstack/horizon-55dff86b95-mnlkl" Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.095141 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prw5t\" (UniqueName: \"kubernetes.io/projected/caa48302-9f85-4c4c-b61a-e2b0847d425a-kube-api-access-prw5t\") pod \"horizon-55dff86b95-mnlkl\" (UID: \"caa48302-9f85-4c4c-b61a-e2b0847d425a\") " pod="openstack/horizon-55dff86b95-mnlkl" Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.097642 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nsw82" event={"ID":"ecdb8e0b-8b04-4dc5-b532-8e68e8206122","Type":"ContainerStarted","Data":"a31c12eb0abbe5bc90138c9c3ce6ecd8e3dbcc71f6f7c7ec46d35df8c0ce1c3f"} Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.104521 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-t42nb" podStartSLOduration=3.104503143 podStartE2EDuration="3.104503143s" podCreationTimestamp="2025-12-02 23:16:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:16:18.096755882 +0000 UTC m=+1116.805310165" watchObservedRunningTime="2025-12-02 23:16:18.104503143 +0000 UTC m=+1116.813057426" Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.114190 4903 generic.go:334] "Generic (PLEG): container finished" podID="2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba" containerID="d966108640f063665f16e98af21a0ac94b1fa731a7be38a133582d35df604846" exitCode=0 Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.114352 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6db6798dff-kcjng" 
event={"ID":"2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba","Type":"ContainerDied","Data":"d966108640f063665f16e98af21a0ac94b1fa731a7be38a133582d35df604846"} Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.114383 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6db6798dff-kcjng" event={"ID":"2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba","Type":"ContainerStarted","Data":"21e7d076b1c427c9541c42a90a9bb983a7fd5fe98c6f86a4ca14f257a105b87b"} Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.163092 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-w9hq8" event={"ID":"9f91dd36-7138-43f1-8091-cca7410520cf","Type":"ContainerStarted","Data":"b1e2d008ad53752077e45a6af27982f4a516161fb2dd9d442aa386d308ca3f1d"} Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.163591 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-w9hq8" event={"ID":"9f91dd36-7138-43f1-8091-cca7410520cf","Type":"ContainerStarted","Data":"70f0d29946f6c4bdc2122a20e5eadb3cf62e01b3534c78f5e16511310f73d4d5"} Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.200428 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/caa48302-9f85-4c4c-b61a-e2b0847d425a-scripts\") pod \"horizon-55dff86b95-mnlkl\" (UID: \"caa48302-9f85-4c4c-b61a-e2b0847d425a\") " pod="openstack/horizon-55dff86b95-mnlkl" Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.200501 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/caa48302-9f85-4c4c-b61a-e2b0847d425a-horizon-secret-key\") pod \"horizon-55dff86b95-mnlkl\" (UID: \"caa48302-9f85-4c4c-b61a-e2b0847d425a\") " pod="openstack/horizon-55dff86b95-mnlkl" Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.200520 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/caa48302-9f85-4c4c-b61a-e2b0847d425a-logs\") pod \"horizon-55dff86b95-mnlkl\" (UID: \"caa48302-9f85-4c4c-b61a-e2b0847d425a\") " pod="openstack/horizon-55dff86b95-mnlkl" Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.200549 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/caa48302-9f85-4c4c-b61a-e2b0847d425a-config-data\") pod \"horizon-55dff86b95-mnlkl\" (UID: \"caa48302-9f85-4c4c-b61a-e2b0847d425a\") " pod="openstack/horizon-55dff86b95-mnlkl" Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.200575 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prw5t\" (UniqueName: \"kubernetes.io/projected/caa48302-9f85-4c4c-b61a-e2b0847d425a-kube-api-access-prw5t\") pod \"horizon-55dff86b95-mnlkl\" (UID: \"caa48302-9f85-4c4c-b61a-e2b0847d425a\") " pod="openstack/horizon-55dff86b95-mnlkl" Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.201123 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/caa48302-9f85-4c4c-b61a-e2b0847d425a-scripts\") pod \"horizon-55dff86b95-mnlkl\" (UID: \"caa48302-9f85-4c4c-b61a-e2b0847d425a\") " pod="openstack/horizon-55dff86b95-mnlkl" Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.202025 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/caa48302-9f85-4c4c-b61a-e2b0847d425a-logs\") 
pod \"horizon-55dff86b95-mnlkl\" (UID: \"caa48302-9f85-4c4c-b61a-e2b0847d425a\") " pod="openstack/horizon-55dff86b95-mnlkl" Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.203191 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/caa48302-9f85-4c4c-b61a-e2b0847d425a-config-data\") pod \"horizon-55dff86b95-mnlkl\" (UID: \"caa48302-9f85-4c4c-b61a-e2b0847d425a\") " pod="openstack/horizon-55dff86b95-mnlkl" Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.224293 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/caa48302-9f85-4c4c-b61a-e2b0847d425a-horizon-secret-key\") pod \"horizon-55dff86b95-mnlkl\" (UID: \"caa48302-9f85-4c4c-b61a-e2b0847d425a\") " pod="openstack/horizon-55dff86b95-mnlkl" Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.227289 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prw5t\" (UniqueName: \"kubernetes.io/projected/caa48302-9f85-4c4c-b61a-e2b0847d425a-kube-api-access-prw5t\") pod \"horizon-55dff86b95-mnlkl\" (UID: \"caa48302-9f85-4c4c-b61a-e2b0847d425a\") " pod="openstack/horizon-55dff86b95-mnlkl" Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.298926 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55dff86b95-mnlkl" Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.423349 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-569d967d67-xpnn8" Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.448545 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-w9hq8" podStartSLOduration=3.448526868 podStartE2EDuration="3.448526868s" podCreationTimestamp="2025-12-02 23:16:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:16:18.229866944 +0000 UTC m=+1116.938421227" watchObservedRunningTime="2025-12-02 23:16:18.448526868 +0000 UTC m=+1117.157081151" Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.606753 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgrq5\" (UniqueName: \"kubernetes.io/projected/22c0b774-da33-4e2c-b263-78439094167c-kube-api-access-cgrq5\") pod \"22c0b774-da33-4e2c-b263-78439094167c\" (UID: \"22c0b774-da33-4e2c-b263-78439094167c\") " Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.607022 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22c0b774-da33-4e2c-b263-78439094167c-ovsdbserver-sb\") pod \"22c0b774-da33-4e2c-b263-78439094167c\" (UID: \"22c0b774-da33-4e2c-b263-78439094167c\") " Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.607054 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22c0b774-da33-4e2c-b263-78439094167c-dns-swift-storage-0\") pod \"22c0b774-da33-4e2c-b263-78439094167c\" (UID: \"22c0b774-da33-4e2c-b263-78439094167c\") " Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.607082 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22c0b774-da33-4e2c-b263-78439094167c-ovsdbserver-nb\") pod 
\"22c0b774-da33-4e2c-b263-78439094167c\" (UID: \"22c0b774-da33-4e2c-b263-78439094167c\") " Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.607347 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22c0b774-da33-4e2c-b263-78439094167c-dns-svc\") pod \"22c0b774-da33-4e2c-b263-78439094167c\" (UID: \"22c0b774-da33-4e2c-b263-78439094167c\") " Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.607387 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c0b774-da33-4e2c-b263-78439094167c-config\") pod \"22c0b774-da33-4e2c-b263-78439094167c\" (UID: \"22c0b774-da33-4e2c-b263-78439094167c\") " Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.621353 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c0b774-da33-4e2c-b263-78439094167c-kube-api-access-cgrq5" (OuterVolumeSpecName: "kube-api-access-cgrq5") pod "22c0b774-da33-4e2c-b263-78439094167c" (UID: "22c0b774-da33-4e2c-b263-78439094167c"). InnerVolumeSpecName "kube-api-access-cgrq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.641560 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c0b774-da33-4e2c-b263-78439094167c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "22c0b774-da33-4e2c-b263-78439094167c" (UID: "22c0b774-da33-4e2c-b263-78439094167c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.644869 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c0b774-da33-4e2c-b263-78439094167c-config" (OuterVolumeSpecName: "config") pod "22c0b774-da33-4e2c-b263-78439094167c" (UID: "22c0b774-da33-4e2c-b263-78439094167c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.650684 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c0b774-da33-4e2c-b263-78439094167c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "22c0b774-da33-4e2c-b263-78439094167c" (UID: "22c0b774-da33-4e2c-b263-78439094167c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.654569 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c0b774-da33-4e2c-b263-78439094167c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "22c0b774-da33-4e2c-b263-78439094167c" (UID: "22c0b774-da33-4e2c-b263-78439094167c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.655121 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c0b774-da33-4e2c-b263-78439094167c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "22c0b774-da33-4e2c-b263-78439094167c" (UID: "22c0b774-da33-4e2c-b263-78439094167c"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.712252 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgrq5\" (UniqueName: \"kubernetes.io/projected/22c0b774-da33-4e2c-b263-78439094167c-kube-api-access-cgrq5\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.712298 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22c0b774-da33-4e2c-b263-78439094167c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.712309 4903 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22c0b774-da33-4e2c-b263-78439094167c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.712319 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22c0b774-da33-4e2c-b263-78439094167c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.712331 4903 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22c0b774-da33-4e2c-b263-78439094167c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.712450 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c0b774-da33-4e2c-b263-78439094167c-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:18 crc kubenswrapper[4903]: W1202 23:16:18.982278 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcaa48302_9f85_4c4c_b61a_e2b0847d425a.slice/crio-2dbbc01f39e12b4cc23da4d6bf6248807a887cfc1fa0377fcf9022cd7f66f71c WatchSource:0}: Error finding container 2dbbc01f39e12b4cc23da4d6bf6248807a887cfc1fa0377fcf9022cd7f66f71c: Status 404 returned error can't find the container with id 2dbbc01f39e12b4cc23da4d6bf6248807a887cfc1fa0377fcf9022cd7f66f71c Dec 02 23:16:18 crc kubenswrapper[4903]: I1202 23:16:18.983879 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-55dff86b95-mnlkl"] Dec 02 23:16:19 crc kubenswrapper[4903]: I1202 23:16:19.172899 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55dff86b95-mnlkl" event={"ID":"caa48302-9f85-4c4c-b61a-e2b0847d425a","Type":"ContainerStarted","Data":"2dbbc01f39e12b4cc23da4d6bf6248807a887cfc1fa0377fcf9022cd7f66f71c"} Dec 02 23:16:19 crc kubenswrapper[4903]: I1202 23:16:19.176845 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6db6798dff-kcjng" event={"ID":"2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba","Type":"ContainerStarted","Data":"8fd0af82da0677e3ab04f4fb685da1c72b385d4fa53ae4cbbc9ea5a6bd6331e9"} Dec 02 23:16:19 crc kubenswrapper[4903]: I1202 23:16:19.178485 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6db6798dff-kcjng" Dec 02 23:16:19 crc kubenswrapper[4903]: I1202 23:16:19.182994 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-569d967d67-xpnn8" Dec 02 23:16:19 crc kubenswrapper[4903]: I1202 23:16:19.183587 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-569d967d67-xpnn8" event={"ID":"22c0b774-da33-4e2c-b263-78439094167c","Type":"ContainerDied","Data":"cd465cdbd0418d01c74edd454b9c84061ae09dcef1dcf63b16b0c9b72fb5598a"} Dec 02 23:16:19 crc kubenswrapper[4903]: I1202 23:16:19.183694 4903 scope.go:117] "RemoveContainer" containerID="cda2c99dd5391edc36218fdcaea212e68d0e9d66405e093da36ecfafc90246da" Dec 02 23:16:19 crc kubenswrapper[4903]: I1202 23:16:19.204034 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6db6798dff-kcjng" podStartSLOduration=4.204007551 podStartE2EDuration="4.204007551s" podCreationTimestamp="2025-12-02 23:16:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:16:19.197496599 +0000 UTC m=+1117.906050892" watchObservedRunningTime="2025-12-02 23:16:19.204007551 +0000 UTC m=+1117.912561834" Dec 02 23:16:19 crc kubenswrapper[4903]: I1202 23:16:19.282731 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-569d967d67-xpnn8"] Dec 02 23:16:19 crc kubenswrapper[4903]: I1202 23:16:19.291977 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-569d967d67-xpnn8"] Dec 02 23:16:19 crc kubenswrapper[4903]: I1202 23:16:19.627786 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c0b774-da33-4e2c-b263-78439094167c" path="/var/lib/kubelet/pods/22c0b774-da33-4e2c-b263-78439094167c/volumes" Dec 02 23:16:20 crc kubenswrapper[4903]: I1202 23:16:20.196753 4903 generic.go:334] "Generic (PLEG): container finished" podID="25b38561-c9d6-4223-85cd-c4516718cc5f" containerID="db1bfd3257b5f1e05d3b208cdffd9b7e1e712fd70ba4924a81d8649ed5c89b20" exitCode=0 Dec 02 23:16:20 crc kubenswrapper[4903]: I1202 23:16:20.196845 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-8sbng" event={"ID":"25b38561-c9d6-4223-85cd-c4516718cc5f","Type":"ContainerDied","Data":"db1bfd3257b5f1e05d3b208cdffd9b7e1e712fd70ba4924a81d8649ed5c89b20"} Dec 02 23:16:22 crc kubenswrapper[4903]: I1202 23:16:22.222500 4903 generic.go:334] "Generic (PLEG): container finished" podID="3ab27c8a-19cb-461d-8055-71c04f10e553" containerID="1e1fe13234c90a5746f503e83c23d95610a319b063ef9da865233ab6b18b3b04" exitCode=0 Dec 02 23:16:22 crc kubenswrapper[4903]: I1202 23:16:22.223090 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-t42nb" event={"ID":"3ab27c8a-19cb-461d-8055-71c04f10e553","Type":"ContainerDied","Data":"1e1fe13234c90a5746f503e83c23d95610a319b063ef9da865233ab6b18b3b04"} Dec 02 23:16:23 crc kubenswrapper[4903]: I1202 23:16:23.069401 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:16:23 crc kubenswrapper[4903]: I1202 23:16:23.069464 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.323234 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-c5c474cdf-n98r7"] Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.356138 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5fd47c645b-9wf6m"] Dec 02 23:16:24 crc kubenswrapper[4903]: E1202 23:16:24.356629 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22c0b774-da33-4e2c-b263-78439094167c" containerName="init" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.356666 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="22c0b774-da33-4e2c-b263-78439094167c" containerName="init" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.356884 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="22c0b774-da33-4e2c-b263-78439094167c" containerName="init" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.357902 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5fd47c645b-9wf6m" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.360515 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.367550 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5fd47c645b-9wf6m"] Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.412644 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-55dff86b95-mnlkl"] Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.433995 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7857f5d94d-4lclz"] Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.435587 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7857f5d94d-4lclz" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.464020 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7857f5d94d-4lclz"] Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.547630 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c5d26e7e-b21c-4e31-984f-768ef66e0772-config-data\") pod \"horizon-7857f5d94d-4lclz\" (UID: \"c5d26e7e-b21c-4e31-984f-768ef66e0772\") " pod="openstack/horizon-7857f5d94d-4lclz" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.548333 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b5f4367-359b-4633-80f9-0af5ac406aa4-combined-ca-bundle\") pod \"horizon-5fd47c645b-9wf6m\" (UID: \"5b5f4367-359b-4633-80f9-0af5ac406aa4\") " pod="openstack/horizon-5fd47c645b-9wf6m" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.548365 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5d26e7e-b21c-4e31-984f-768ef66e0772-horizon-tls-certs\") pod \"horizon-7857f5d94d-4lclz\" (UID: \"c5d26e7e-b21c-4e31-984f-768ef66e0772\") " pod="openstack/horizon-7857f5d94d-4lclz" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.548433 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5b5f4367-359b-4633-80f9-0af5ac406aa4-scripts\") pod \"horizon-5fd47c645b-9wf6m\" (UID: \"5b5f4367-359b-4633-80f9-0af5ac406aa4\") " pod="openstack/horizon-5fd47c645b-9wf6m" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.548455 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ww58\" (UniqueName: \"kubernetes.io/projected/c5d26e7e-b21c-4e31-984f-768ef66e0772-kube-api-access-6ww58\") pod \"horizon-7857f5d94d-4lclz\" (UID: \"c5d26e7e-b21c-4e31-984f-768ef66e0772\") " pod="openstack/horizon-7857f5d94d-4lclz" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.548486 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lhl6\" (UniqueName: \"kubernetes.io/projected/5b5f4367-359b-4633-80f9-0af5ac406aa4-kube-api-access-4lhl6\") pod \"horizon-5fd47c645b-9wf6m\" (UID: \"5b5f4367-359b-4633-80f9-0af5ac406aa4\") " pod="openstack/horizon-5fd47c645b-9wf6m" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.548621 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b5f4367-359b-4633-80f9-0af5ac406aa4-horizon-tls-certs\") pod \"horizon-5fd47c645b-9wf6m\" (UID: \"5b5f4367-359b-4633-80f9-0af5ac406aa4\") " pod="openstack/horizon-5fd47c645b-9wf6m" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.548713 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b5f4367-359b-4633-80f9-0af5ac406aa4-logs\") pod \"horizon-5fd47c645b-9wf6m\" (UID: \"5b5f4367-359b-4633-80f9-0af5ac406aa4\") " pod="openstack/horizon-5fd47c645b-9wf6m" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.548813 4903 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5d26e7e-b21c-4e31-984f-768ef66e0772-combined-ca-bundle\") pod \"horizon-7857f5d94d-4lclz\" (UID: \"c5d26e7e-b21c-4e31-984f-768ef66e0772\") " pod="openstack/horizon-7857f5d94d-4lclz" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.548841 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5b5f4367-359b-4633-80f9-0af5ac406aa4-horizon-secret-key\") pod \"horizon-5fd47c645b-9wf6m\" (UID: \"5b5f4367-359b-4633-80f9-0af5ac406aa4\") " pod="openstack/horizon-5fd47c645b-9wf6m" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.548985 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5b5f4367-359b-4633-80f9-0af5ac406aa4-config-data\") pod \"horizon-5fd47c645b-9wf6m\" (UID: \"5b5f4367-359b-4633-80f9-0af5ac406aa4\") " pod="openstack/horizon-5fd47c645b-9wf6m" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.549026 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c5d26e7e-b21c-4e31-984f-768ef66e0772-horizon-secret-key\") pod \"horizon-7857f5d94d-4lclz\" (UID: \"c5d26e7e-b21c-4e31-984f-768ef66e0772\") " pod="openstack/horizon-7857f5d94d-4lclz" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.549098 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5d26e7e-b21c-4e31-984f-768ef66e0772-logs\") pod \"horizon-7857f5d94d-4lclz\" (UID: \"c5d26e7e-b21c-4e31-984f-768ef66e0772\") " pod="openstack/horizon-7857f5d94d-4lclz" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.549124 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5d26e7e-b21c-4e31-984f-768ef66e0772-scripts\") pod \"horizon-7857f5d94d-4lclz\" (UID: \"c5d26e7e-b21c-4e31-984f-768ef66e0772\") " pod="openstack/horizon-7857f5d94d-4lclz" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.650358 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5b5f4367-359b-4633-80f9-0af5ac406aa4-scripts\") pod \"horizon-5fd47c645b-9wf6m\" (UID: \"5b5f4367-359b-4633-80f9-0af5ac406aa4\") " pod="openstack/horizon-5fd47c645b-9wf6m" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.650412 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ww58\" (UniqueName: \"kubernetes.io/projected/c5d26e7e-b21c-4e31-984f-768ef66e0772-kube-api-access-6ww58\") pod \"horizon-7857f5d94d-4lclz\" (UID: \"c5d26e7e-b21c-4e31-984f-768ef66e0772\") " pod="openstack/horizon-7857f5d94d-4lclz" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.650459 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lhl6\" (UniqueName: \"kubernetes.io/projected/5b5f4367-359b-4633-80f9-0af5ac406aa4-kube-api-access-4lhl6\") pod \"horizon-5fd47c645b-9wf6m\" (UID: \"5b5f4367-359b-4633-80f9-0af5ac406aa4\") " pod="openstack/horizon-5fd47c645b-9wf6m" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.650485 4903 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b5f4367-359b-4633-80f9-0af5ac406aa4-horizon-tls-certs\") pod \"horizon-5fd47c645b-9wf6m\" (UID: \"5b5f4367-359b-4633-80f9-0af5ac406aa4\") " pod="openstack/horizon-5fd47c645b-9wf6m" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.650502 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b5f4367-359b-4633-80f9-0af5ac406aa4-logs\") pod \"horizon-5fd47c645b-9wf6m\" (UID: \"5b5f4367-359b-4633-80f9-0af5ac406aa4\") " pod="openstack/horizon-5fd47c645b-9wf6m" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.650532 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5d26e7e-b21c-4e31-984f-768ef66e0772-combined-ca-bundle\") pod \"horizon-7857f5d94d-4lclz\" (UID: \"c5d26e7e-b21c-4e31-984f-768ef66e0772\") " pod="openstack/horizon-7857f5d94d-4lclz" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.650547 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5b5f4367-359b-4633-80f9-0af5ac406aa4-horizon-secret-key\") pod \"horizon-5fd47c645b-9wf6m\" (UID: \"5b5f4367-359b-4633-80f9-0af5ac406aa4\") " pod="openstack/horizon-5fd47c645b-9wf6m" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.650591 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5b5f4367-359b-4633-80f9-0af5ac406aa4-config-data\") pod \"horizon-5fd47c645b-9wf6m\" (UID: \"5b5f4367-359b-4633-80f9-0af5ac406aa4\") " pod="openstack/horizon-5fd47c645b-9wf6m" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.650606 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c5d26e7e-b21c-4e31-984f-768ef66e0772-horizon-secret-key\") pod \"horizon-7857f5d94d-4lclz\" (UID: \"c5d26e7e-b21c-4e31-984f-768ef66e0772\") " pod="openstack/horizon-7857f5d94d-4lclz" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.650633 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5d26e7e-b21c-4e31-984f-768ef66e0772-scripts\") pod \"horizon-7857f5d94d-4lclz\" (UID: \"c5d26e7e-b21c-4e31-984f-768ef66e0772\") " pod="openstack/horizon-7857f5d94d-4lclz" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.650660 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5d26e7e-b21c-4e31-984f-768ef66e0772-logs\") pod \"horizon-7857f5d94d-4lclz\" (UID: \"c5d26e7e-b21c-4e31-984f-768ef66e0772\") " pod="openstack/horizon-7857f5d94d-4lclz" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.650700 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c5d26e7e-b21c-4e31-984f-768ef66e0772-config-data\") pod \"horizon-7857f5d94d-4lclz\" (UID: \"c5d26e7e-b21c-4e31-984f-768ef66e0772\") " pod="openstack/horizon-7857f5d94d-4lclz" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.650739 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b5f4367-359b-4633-80f9-0af5ac406aa4-combined-ca-bundle\") pod 
\"horizon-5fd47c645b-9wf6m\" (UID: \"5b5f4367-359b-4633-80f9-0af5ac406aa4\") " pod="openstack/horizon-5fd47c645b-9wf6m" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.650762 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5d26e7e-b21c-4e31-984f-768ef66e0772-horizon-tls-certs\") pod \"horizon-7857f5d94d-4lclz\" (UID: \"c5d26e7e-b21c-4e31-984f-768ef66e0772\") " pod="openstack/horizon-7857f5d94d-4lclz" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.657633 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5d26e7e-b21c-4e31-984f-768ef66e0772-horizon-tls-certs\") pod \"horizon-7857f5d94d-4lclz\" (UID: \"c5d26e7e-b21c-4e31-984f-768ef66e0772\") " pod="openstack/horizon-7857f5d94d-4lclz" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.658035 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b5f4367-359b-4633-80f9-0af5ac406aa4-logs\") pod \"horizon-5fd47c645b-9wf6m\" (UID: \"5b5f4367-359b-4633-80f9-0af5ac406aa4\") " pod="openstack/horizon-5fd47c645b-9wf6m" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.658556 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5b5f4367-359b-4633-80f9-0af5ac406aa4-scripts\") pod \"horizon-5fd47c645b-9wf6m\" (UID: \"5b5f4367-359b-4633-80f9-0af5ac406aa4\") " pod="openstack/horizon-5fd47c645b-9wf6m" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.659172 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5b5f4367-359b-4633-80f9-0af5ac406aa4-config-data\") pod \"horizon-5fd47c645b-9wf6m\" (UID: \"5b5f4367-359b-4633-80f9-0af5ac406aa4\") " pod="openstack/horizon-5fd47c645b-9wf6m" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.660051 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5d26e7e-b21c-4e31-984f-768ef66e0772-scripts\") pod \"horizon-7857f5d94d-4lclz\" (UID: \"c5d26e7e-b21c-4e31-984f-768ef66e0772\") " pod="openstack/horizon-7857f5d94d-4lclz" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.660274 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5d26e7e-b21c-4e31-984f-768ef66e0772-logs\") pod \"horizon-7857f5d94d-4lclz\" (UID: \"c5d26e7e-b21c-4e31-984f-768ef66e0772\") " pod="openstack/horizon-7857f5d94d-4lclz" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.668060 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c5d26e7e-b21c-4e31-984f-768ef66e0772-config-data\") pod \"horizon-7857f5d94d-4lclz\" (UID: \"c5d26e7e-b21c-4e31-984f-768ef66e0772\") " pod="openstack/horizon-7857f5d94d-4lclz" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.668607 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5b5f4367-359b-4633-80f9-0af5ac406aa4-horizon-secret-key\") pod \"horizon-5fd47c645b-9wf6m\" (UID: \"5b5f4367-359b-4633-80f9-0af5ac406aa4\") " pod="openstack/horizon-5fd47c645b-9wf6m" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.674228 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c5d26e7e-b21c-4e31-984f-768ef66e0772-horizon-secret-key\") pod \"horizon-7857f5d94d-4lclz\" (UID: \"c5d26e7e-b21c-4e31-984f-768ef66e0772\") " pod="openstack/horizon-7857f5d94d-4lclz" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.674517 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5d26e7e-b21c-4e31-984f-768ef66e0772-combined-ca-bundle\") pod \"horizon-7857f5d94d-4lclz\" (UID: \"c5d26e7e-b21c-4e31-984f-768ef66e0772\") " pod="openstack/horizon-7857f5d94d-4lclz" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.678196 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b5f4367-359b-4633-80f9-0af5ac406aa4-combined-ca-bundle\") pod \"horizon-5fd47c645b-9wf6m\" (UID: \"5b5f4367-359b-4633-80f9-0af5ac406aa4\") " pod="openstack/horizon-5fd47c645b-9wf6m" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.691610 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lhl6\" (UniqueName: \"kubernetes.io/projected/5b5f4367-359b-4633-80f9-0af5ac406aa4-kube-api-access-4lhl6\") pod \"horizon-5fd47c645b-9wf6m\" (UID: \"5b5f4367-359b-4633-80f9-0af5ac406aa4\") " pod="openstack/horizon-5fd47c645b-9wf6m" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.695255 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ww58\" (UniqueName: \"kubernetes.io/projected/c5d26e7e-b21c-4e31-984f-768ef66e0772-kube-api-access-6ww58\") pod \"horizon-7857f5d94d-4lclz\" (UID: \"c5d26e7e-b21c-4e31-984f-768ef66e0772\") " pod="openstack/horizon-7857f5d94d-4lclz" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.696348 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b5f4367-359b-4633-80f9-0af5ac406aa4-horizon-tls-certs\") pod \"horizon-5fd47c645b-9wf6m\" (UID: \"5b5f4367-359b-4633-80f9-0af5ac406aa4\") " pod="openstack/horizon-5fd47c645b-9wf6m" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.750541 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7857f5d94d-4lclz" Dec 02 23:16:24 crc kubenswrapper[4903]: I1202 23:16:24.985262 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5fd47c645b-9wf6m" Dec 02 23:16:26 crc kubenswrapper[4903]: I1202 23:16:26.716337 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6db6798dff-kcjng" Dec 02 23:16:26 crc kubenswrapper[4903]: I1202 23:16:26.784857 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d7cc4fbd9-qf9dk"] Dec 02 23:16:26 crc kubenswrapper[4903]: I1202 23:16:26.785138 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d7cc4fbd9-qf9dk" podUID="8f7e4993-0dd2-413f-bd55-631ee71946c4" containerName="dnsmasq-dns" containerID="cri-o://5415f53c54ee851d0c151c81707ee43d45c23b497ae231fcc5eb477aa9a61b99" gracePeriod=10 Dec 02 23:16:27 crc kubenswrapper[4903]: I1202 23:16:27.287383 4903 generic.go:334] "Generic (PLEG): container finished" podID="8f7e4993-0dd2-413f-bd55-631ee71946c4" containerID="5415f53c54ee851d0c151c81707ee43d45c23b497ae231fcc5eb477aa9a61b99" exitCode=0 Dec 02 23:16:27 crc kubenswrapper[4903]: I1202 23:16:27.287421 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d7cc4fbd9-qf9dk" event={"ID":"8f7e4993-0dd2-413f-bd55-631ee71946c4","Type":"ContainerDied","Data":"5415f53c54ee851d0c151c81707ee43d45c23b497ae231fcc5eb477aa9a61b99"} Dec 02 23:16:28 crc kubenswrapper[4903]: I1202 23:16:28.139443 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7d7cc4fbd9-qf9dk" podUID="8f7e4993-0dd2-413f-bd55-631ee71946c4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: connect: connection refused" Dec 02 23:16:33 crc kubenswrapper[4903]: I1202 23:16:33.139862 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7d7cc4fbd9-qf9dk" podUID="8f7e4993-0dd2-413f-bd55-631ee71946c4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: connect: connection refused" Dec 02 23:16:36 crc kubenswrapper[4903]: E1202 23:16:36.735119 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.2:5001/podified-master-centos10/openstack-glance-api:watcher_latest" Dec 02 23:16:36 crc kubenswrapper[4903]: E1202 23:16:36.735462 4903 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.2:5001/podified-master-centos10/openstack-glance-api:watcher_latest" Dec 02 23:16:36 crc kubenswrapper[4903]: E1202 23:16:36.735603 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:38.102.83.2:5001/podified-master-centos10/openstack-glance-api:watcher_latest,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mclrt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-flhtr_openstack(5fdf589e-17a2-4b53-b68c-f90e884b0080): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 23:16:36 crc kubenswrapper[4903]: E1202 23:16:36.738431 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-flhtr" podUID="5fdf589e-17a2-4b53-b68c-f90e884b0080" Dec 02 23:16:37 crc kubenswrapper[4903]: E1202 23:16:37.155427 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.2:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest" Dec 02 23:16:37 crc kubenswrapper[4903]: E1202 23:16:37.155475 4903 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.2:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest" Dec 02 23:16:37 crc kubenswrapper[4903]: E1202 23:16:37.155576 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:38.102.83.2:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n97h597h566h56bhbh85h654h6bh547hcfh57fh697h5c4hd6h5dfh56dhf4h67bh8ch576h56bh578h5c6hd6h95h5cfh587h544hdbh5ffh5b5h65cq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l9nmt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(1a6cd769-825e-4700-a66b-87291af7f897): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 23:16:37 crc kubenswrapper[4903]: E1202 23:16:37.170975 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.2:5001/podified-master-centos10/openstack-horizon:watcher_latest" Dec 02 23:16:37 crc kubenswrapper[4903]: E1202 23:16:37.171028 4903 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.2:5001/podified-master-centos10/openstack-horizon:watcher_latest" Dec 02 23:16:37 crc kubenswrapper[4903]: E1202 23:16:37.171182 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:38.102.83.2:5001/podified-master-centos10/openstack-horizon:watcher_latest,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n577h67fh5f6h86h57dh687h5d8h67bh657h565h5bh5ffh594h665h57dhf4h5fh55fh78h5f4h88h5c9h8dh54h54fh65fh5d7h67bh5ddh687h5f4h556q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bggqm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6844b8f8f-qb7dk_openstack(1b67e826-a529-4e61-af56-58f80fdc251c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 23:16:37 crc kubenswrapper[4903]: E1202 23:16:37.174016 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.2:5001/podified-master-centos10/openstack-horizon:watcher_latest\\\"\"]" pod="openstack/horizon-6844b8f8f-qb7dk" podUID="1b67e826-a529-4e61-af56-58f80fdc251c" Dec 02 23:16:37 crc kubenswrapper[4903]: I1202 23:16:37.245120 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-8sbng" Dec 02 23:16:37 crc kubenswrapper[4903]: I1202 23:16:37.388700 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-8sbng" Dec 02 23:16:37 crc kubenswrapper[4903]: I1202 23:16:37.388612 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-8sbng" event={"ID":"25b38561-c9d6-4223-85cd-c4516718cc5f","Type":"ContainerDied","Data":"3845b29bd6645b03a3c7cdb06bed509bed9bbdebbb0632d4b64acb7248fec083"} Dec 02 23:16:37 crc kubenswrapper[4903]: I1202 23:16:37.388764 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3845b29bd6645b03a3c7cdb06bed509bed9bbdebbb0632d4b64acb7248fec083" Dec 02 23:16:37 crc kubenswrapper[4903]: E1202 23:16:37.391594 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.2:5001/podified-master-centos10/openstack-glance-api:watcher_latest\\\"\"" pod="openstack/glance-db-sync-flhtr" podUID="5fdf589e-17a2-4b53-b68c-f90e884b0080" Dec 02 23:16:37 crc kubenswrapper[4903]: I1202 23:16:37.425840 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xndgw\" (UniqueName: \"kubernetes.io/projected/25b38561-c9d6-4223-85cd-c4516718cc5f-kube-api-access-xndgw\") pod \"25b38561-c9d6-4223-85cd-c4516718cc5f\" (UID: \"25b38561-c9d6-4223-85cd-c4516718cc5f\") " Dec 02 23:16:37 crc kubenswrapper[4903]: I1202 23:16:37.426024 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/25b38561-c9d6-4223-85cd-c4516718cc5f-db-sync-config-data\") pod \"25b38561-c9d6-4223-85cd-c4516718cc5f\" (UID: \"25b38561-c9d6-4223-85cd-c4516718cc5f\") " Dec 02 23:16:37 crc kubenswrapper[4903]: I1202 23:16:37.426050 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25b38561-c9d6-4223-85cd-c4516718cc5f-config-data\") pod \"25b38561-c9d6-4223-85cd-c4516718cc5f\" (UID: \"25b38561-c9d6-4223-85cd-c4516718cc5f\") " Dec 02 23:16:37 crc kubenswrapper[4903]: I1202 23:16:37.426129 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25b38561-c9d6-4223-85cd-c4516718cc5f-combined-ca-bundle\") pod \"25b38561-c9d6-4223-85cd-c4516718cc5f\" (UID: \"25b38561-c9d6-4223-85cd-c4516718cc5f\") " Dec 02 23:16:37 crc kubenswrapper[4903]: I1202 23:16:37.445687 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25b38561-c9d6-4223-85cd-c4516718cc5f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "25b38561-c9d6-4223-85cd-c4516718cc5f" (UID: "25b38561-c9d6-4223-85cd-c4516718cc5f"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:16:37 crc kubenswrapper[4903]: I1202 23:16:37.451224 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25b38561-c9d6-4223-85cd-c4516718cc5f-kube-api-access-xndgw" (OuterVolumeSpecName: "kube-api-access-xndgw") pod "25b38561-c9d6-4223-85cd-c4516718cc5f" (UID: "25b38561-c9d6-4223-85cd-c4516718cc5f"). InnerVolumeSpecName "kube-api-access-xndgw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:16:37 crc kubenswrapper[4903]: I1202 23:16:37.461826 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25b38561-c9d6-4223-85cd-c4516718cc5f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25b38561-c9d6-4223-85cd-c4516718cc5f" (UID: "25b38561-c9d6-4223-85cd-c4516718cc5f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:16:37 crc kubenswrapper[4903]: I1202 23:16:37.482300 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25b38561-c9d6-4223-85cd-c4516718cc5f-config-data" (OuterVolumeSpecName: "config-data") pod "25b38561-c9d6-4223-85cd-c4516718cc5f" (UID: "25b38561-c9d6-4223-85cd-c4516718cc5f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:16:37 crc kubenswrapper[4903]: I1202 23:16:37.528568 4903 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/25b38561-c9d6-4223-85cd-c4516718cc5f-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:37 crc kubenswrapper[4903]: I1202 23:16:37.528613 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25b38561-c9d6-4223-85cd-c4516718cc5f-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:37 crc kubenswrapper[4903]: I1202 23:16:37.528629 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25b38561-c9d6-4223-85cd-c4516718cc5f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:37 crc kubenswrapper[4903]: I1202 23:16:37.528643 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xndgw\" (UniqueName: \"kubernetes.io/projected/25b38561-c9d6-4223-85cd-c4516718cc5f-kube-api-access-xndgw\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.139217 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7d7cc4fbd9-qf9dk" podUID="8f7e4993-0dd2-413f-bd55-631ee71946c4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: connect: connection refused" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.139861 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d7cc4fbd9-qf9dk" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.591617 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Dec 02 23:16:38 crc kubenswrapper[4903]: E1202 23:16:38.592018 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25b38561-c9d6-4223-85cd-c4516718cc5f" containerName="watcher-db-sync" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.592031 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="25b38561-c9d6-4223-85cd-c4516718cc5f" containerName="watcher-db-sync" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.592236 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="25b38561-c9d6-4223-85cd-c4516718cc5f" containerName="watcher-db-sync" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.593235 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.595423 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-7mmjm" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.595821 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.619596 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.640686 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.641821 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.648249 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.659965 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.724612 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.726804 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.728988 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.737623 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.749708 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a0c7396b-f60c-437a-ad2d-f6c19b7c4570-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"a0c7396b-f60c-437a-ad2d-f6c19b7c4570\") " pod="openstack/watcher-api-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.749755 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0c7396b-f60c-437a-ad2d-f6c19b7c4570-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"a0c7396b-f60c-437a-ad2d-f6c19b7c4570\") " pod="openstack/watcher-api-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.749794 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjlst\" (UniqueName: \"kubernetes.io/projected/a0c7396b-f60c-437a-ad2d-f6c19b7c4570-kube-api-access-tjlst\") pod \"watcher-api-0\" (UID: \"a0c7396b-f60c-437a-ad2d-f6c19b7c4570\") " pod="openstack/watcher-api-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.749813 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0c7396b-f60c-437a-ad2d-f6c19b7c4570-logs\") pod \"watcher-api-0\" (UID: \"a0c7396b-f60c-437a-ad2d-f6c19b7c4570\") " pod="openstack/watcher-api-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.749828 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6de4a117-0c91-47f4-a80d-278debb3ea60-config-data\") pod \"watcher-decision-engine-0\" (UID: \"6de4a117-0c91-47f4-a80d-278debb3ea60\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.749887 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0c7396b-f60c-437a-ad2d-f6c19b7c4570-config-data\") pod \"watcher-api-0\" (UID: \"a0c7396b-f60c-437a-ad2d-f6c19b7c4570\") " pod="openstack/watcher-api-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.749935 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px9sh\" (UniqueName: \"kubernetes.io/projected/6de4a117-0c91-47f4-a80d-278debb3ea60-kube-api-access-px9sh\") pod \"watcher-decision-engine-0\" (UID: \"6de4a117-0c91-47f4-a80d-278debb3ea60\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.749955 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6de4a117-0c91-47f4-a80d-278debb3ea60-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"6de4a117-0c91-47f4-a80d-278debb3ea60\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.749979 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6de4a117-0c91-47f4-a80d-278debb3ea60-logs\") pod \"watcher-decision-engine-0\" (UID: \"6de4a117-0c91-47f4-a80d-278debb3ea60\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.749996 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6de4a117-0c91-47f4-a80d-278debb3ea60-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"6de4a117-0c91-47f4-a80d-278debb3ea60\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.851992 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6de4a117-0c91-47f4-a80d-278debb3ea60-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"6de4a117-0c91-47f4-a80d-278debb3ea60\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.852808 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6de4a117-0c91-47f4-a80d-278debb3ea60-logs\") pod \"watcher-decision-engine-0\" (UID: \"6de4a117-0c91-47f4-a80d-278debb3ea60\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.852839 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6de4a117-0c91-47f4-a80d-278debb3ea60-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"6de4a117-0c91-47f4-a80d-278debb3ea60\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.852884 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: 
\"kubernetes.io/secret/a0c7396b-f60c-437a-ad2d-f6c19b7c4570-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"a0c7396b-f60c-437a-ad2d-f6c19b7c4570\") " pod="openstack/watcher-api-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.852913 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0c7396b-f60c-437a-ad2d-f6c19b7c4570-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"a0c7396b-f60c-437a-ad2d-f6c19b7c4570\") " pod="openstack/watcher-api-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.852953 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b000454-0ec3-4f51-ba7a-767530eaf03c-config-data\") pod \"watcher-applier-0\" (UID: \"6b000454-0ec3-4f51-ba7a-767530eaf03c\") " pod="openstack/watcher-applier-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.852989 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjlst\" (UniqueName: \"kubernetes.io/projected/a0c7396b-f60c-437a-ad2d-f6c19b7c4570-kube-api-access-tjlst\") pod \"watcher-api-0\" (UID: \"a0c7396b-f60c-437a-ad2d-f6c19b7c4570\") " pod="openstack/watcher-api-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.853007 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0c7396b-f60c-437a-ad2d-f6c19b7c4570-logs\") pod \"watcher-api-0\" (UID: \"a0c7396b-f60c-437a-ad2d-f6c19b7c4570\") " pod="openstack/watcher-api-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.853041 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6de4a117-0c91-47f4-a80d-278debb3ea60-config-data\") pod \"watcher-decision-engine-0\" (UID: \"6de4a117-0c91-47f4-a80d-278debb3ea60\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.853058 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9d4k\" (UniqueName: \"kubernetes.io/projected/6b000454-0ec3-4f51-ba7a-767530eaf03c-kube-api-access-n9d4k\") pod \"watcher-applier-0\" (UID: \"6b000454-0ec3-4f51-ba7a-767530eaf03c\") " pod="openstack/watcher-applier-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.853130 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b000454-0ec3-4f51-ba7a-767530eaf03c-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"6b000454-0ec3-4f51-ba7a-767530eaf03c\") " pod="openstack/watcher-applier-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.853202 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0c7396b-f60c-437a-ad2d-f6c19b7c4570-config-data\") pod \"watcher-api-0\" (UID: \"a0c7396b-f60c-437a-ad2d-f6c19b7c4570\") " pod="openstack/watcher-api-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.853243 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b000454-0ec3-4f51-ba7a-767530eaf03c-logs\") pod \"watcher-applier-0\" (UID: \"6b000454-0ec3-4f51-ba7a-767530eaf03c\") " pod="openstack/watcher-applier-0" Dec 02 23:16:38 crc 
kubenswrapper[4903]: I1202 23:16:38.853302 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px9sh\" (UniqueName: \"kubernetes.io/projected/6de4a117-0c91-47f4-a80d-278debb3ea60-kube-api-access-px9sh\") pod \"watcher-decision-engine-0\" (UID: \"6de4a117-0c91-47f4-a80d-278debb3ea60\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.853430 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6de4a117-0c91-47f4-a80d-278debb3ea60-logs\") pod \"watcher-decision-engine-0\" (UID: \"6de4a117-0c91-47f4-a80d-278debb3ea60\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.855096 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0c7396b-f60c-437a-ad2d-f6c19b7c4570-logs\") pod \"watcher-api-0\" (UID: \"a0c7396b-f60c-437a-ad2d-f6c19b7c4570\") " pod="openstack/watcher-api-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.857169 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6de4a117-0c91-47f4-a80d-278debb3ea60-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"6de4a117-0c91-47f4-a80d-278debb3ea60\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.858174 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6de4a117-0c91-47f4-a80d-278debb3ea60-config-data\") pod \"watcher-decision-engine-0\" (UID: \"6de4a117-0c91-47f4-a80d-278debb3ea60\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.863877 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6de4a117-0c91-47f4-a80d-278debb3ea60-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"6de4a117-0c91-47f4-a80d-278debb3ea60\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.869259 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px9sh\" (UniqueName: \"kubernetes.io/projected/6de4a117-0c91-47f4-a80d-278debb3ea60-kube-api-access-px9sh\") pod \"watcher-decision-engine-0\" (UID: \"6de4a117-0c91-47f4-a80d-278debb3ea60\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.876192 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a0c7396b-f60c-437a-ad2d-f6c19b7c4570-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"a0c7396b-f60c-437a-ad2d-f6c19b7c4570\") " pod="openstack/watcher-api-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.876300 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0c7396b-f60c-437a-ad2d-f6c19b7c4570-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"a0c7396b-f60c-437a-ad2d-f6c19b7c4570\") " pod="openstack/watcher-api-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.879674 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0c7396b-f60c-437a-ad2d-f6c19b7c4570-config-data\") pod 
\"watcher-api-0\" (UID: \"a0c7396b-f60c-437a-ad2d-f6c19b7c4570\") " pod="openstack/watcher-api-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.881703 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjlst\" (UniqueName: \"kubernetes.io/projected/a0c7396b-f60c-437a-ad2d-f6c19b7c4570-kube-api-access-tjlst\") pod \"watcher-api-0\" (UID: \"a0c7396b-f60c-437a-ad2d-f6c19b7c4570\") " pod="openstack/watcher-api-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.914352 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.954875 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b000454-0ec3-4f51-ba7a-767530eaf03c-config-data\") pod \"watcher-applier-0\" (UID: \"6b000454-0ec3-4f51-ba7a-767530eaf03c\") " pod="openstack/watcher-applier-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.955233 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9d4k\" (UniqueName: \"kubernetes.io/projected/6b000454-0ec3-4f51-ba7a-767530eaf03c-kube-api-access-n9d4k\") pod \"watcher-applier-0\" (UID: \"6b000454-0ec3-4f51-ba7a-767530eaf03c\") " pod="openstack/watcher-applier-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.955286 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b000454-0ec3-4f51-ba7a-767530eaf03c-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"6b000454-0ec3-4f51-ba7a-767530eaf03c\") " pod="openstack/watcher-applier-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.955364 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b000454-0ec3-4f51-ba7a-767530eaf03c-logs\") pod \"watcher-applier-0\" (UID: \"6b000454-0ec3-4f51-ba7a-767530eaf03c\") " pod="openstack/watcher-applier-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.955871 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b000454-0ec3-4f51-ba7a-767530eaf03c-logs\") pod \"watcher-applier-0\" (UID: \"6b000454-0ec3-4f51-ba7a-767530eaf03c\") " pod="openstack/watcher-applier-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.959323 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b000454-0ec3-4f51-ba7a-767530eaf03c-config-data\") pod \"watcher-applier-0\" (UID: \"6b000454-0ec3-4f51-ba7a-767530eaf03c\") " pod="openstack/watcher-applier-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.960680 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b000454-0ec3-4f51-ba7a-767530eaf03c-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"6b000454-0ec3-4f51-ba7a-767530eaf03c\") " pod="openstack/watcher-applier-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.965893 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 02 23:16:38 crc kubenswrapper[4903]: I1202 23:16:38.976286 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9d4k\" (UniqueName: \"kubernetes.io/projected/6b000454-0ec3-4f51-ba7a-767530eaf03c-kube-api-access-n9d4k\") pod \"watcher-applier-0\" (UID: \"6b000454-0ec3-4f51-ba7a-767530eaf03c\") " pod="openstack/watcher-applier-0" Dec 02 23:16:39 crc kubenswrapper[4903]: I1202 23:16:39.085476 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Dec 02 23:16:39 crc kubenswrapper[4903]: E1202 23:16:39.303763 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.2:5001/podified-master-centos10/openstack-placement-api:watcher_latest" Dec 02 23:16:39 crc kubenswrapper[4903]: E1202 23:16:39.303819 4903 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.2:5001/podified-master-centos10/openstack-placement-api:watcher_latest" Dec 02 23:16:39 crc kubenswrapper[4903]: E1202 23:16:39.303926 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:38.102.83.2:5001/podified-master-centos10/openstack-placement-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cjj55,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-bbljr_openstack(92c0a321-b591-49c3-a9b4-bc6b8bf30820): ErrImagePull: rpc error: code = Canceled desc = copying config: 
context canceled" logger="UnhandledError" Dec 02 23:16:39 crc kubenswrapper[4903]: E1202 23:16:39.305097 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-bbljr" podUID="92c0a321-b591-49c3-a9b4-bc6b8bf30820" Dec 02 23:16:39 crc kubenswrapper[4903]: E1202 23:16:39.325383 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.2:5001/podified-master-centos10/openstack-horizon:watcher_latest" Dec 02 23:16:39 crc kubenswrapper[4903]: E1202 23:16:39.325434 4903 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.2:5001/podified-master-centos10/openstack-horizon:watcher_latest" Dec 02 23:16:39 crc kubenswrapper[4903]: E1202 23:16:39.325545 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:38.102.83.2:5001/podified-master-centos10/openstack-horizon:watcher_latest,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5dch64bh79hb8h87h89h7h545h54h54ch5b6h5b5h646h5c9h5b7h54dhcbh647h6fh656h7fh5c6h654h688h55ch5cch549hfdh55hcchd8h5d8q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-prw5t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-55dff86b95-mnlkl_openstack(caa48302-9f85-4c4c-b61a-e2b0847d425a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 23:16:39 crc kubenswrapper[4903]: E1202 23:16:39.332409 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image 
\\\"38.102.83.2:5001/podified-master-centos10/openstack-horizon:watcher_latest\\\"\"]" pod="openstack/horizon-55dff86b95-mnlkl" podUID="caa48302-9f85-4c4c-b61a-e2b0847d425a" Dec 02 23:16:39 crc kubenswrapper[4903]: E1202 23:16:39.416536 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.2:5001/podified-master-centos10/openstack-placement-api:watcher_latest\\\"\"" pod="openstack/placement-db-sync-bbljr" podUID="92c0a321-b591-49c3-a9b4-bc6b8bf30820" Dec 02 23:16:41 crc kubenswrapper[4903]: I1202 23:16:41.210178 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-t42nb" Dec 02 23:16:41 crc kubenswrapper[4903]: I1202 23:16:41.303744 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ab27c8a-19cb-461d-8055-71c04f10e553-combined-ca-bundle\") pod \"3ab27c8a-19cb-461d-8055-71c04f10e553\" (UID: \"3ab27c8a-19cb-461d-8055-71c04f10e553\") " Dec 02 23:16:41 crc kubenswrapper[4903]: I1202 23:16:41.303841 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrcds\" (UniqueName: \"kubernetes.io/projected/3ab27c8a-19cb-461d-8055-71c04f10e553-kube-api-access-jrcds\") pod \"3ab27c8a-19cb-461d-8055-71c04f10e553\" (UID: \"3ab27c8a-19cb-461d-8055-71c04f10e553\") " Dec 02 23:16:41 crc kubenswrapper[4903]: I1202 23:16:41.303873 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3ab27c8a-19cb-461d-8055-71c04f10e553-fernet-keys\") pod \"3ab27c8a-19cb-461d-8055-71c04f10e553\" (UID: \"3ab27c8a-19cb-461d-8055-71c04f10e553\") " Dec 02 23:16:41 crc kubenswrapper[4903]: I1202 23:16:41.304837 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ab27c8a-19cb-461d-8055-71c04f10e553-config-data\") pod \"3ab27c8a-19cb-461d-8055-71c04f10e553\" (UID: \"3ab27c8a-19cb-461d-8055-71c04f10e553\") " Dec 02 23:16:41 crc kubenswrapper[4903]: I1202 23:16:41.304888 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ab27c8a-19cb-461d-8055-71c04f10e553-scripts\") pod \"3ab27c8a-19cb-461d-8055-71c04f10e553\" (UID: \"3ab27c8a-19cb-461d-8055-71c04f10e553\") " Dec 02 23:16:41 crc kubenswrapper[4903]: I1202 23:16:41.304974 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3ab27c8a-19cb-461d-8055-71c04f10e553-credential-keys\") pod \"3ab27c8a-19cb-461d-8055-71c04f10e553\" (UID: \"3ab27c8a-19cb-461d-8055-71c04f10e553\") " Dec 02 23:16:41 crc kubenswrapper[4903]: I1202 23:16:41.312335 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab27c8a-19cb-461d-8055-71c04f10e553-kube-api-access-jrcds" (OuterVolumeSpecName: "kube-api-access-jrcds") pod "3ab27c8a-19cb-461d-8055-71c04f10e553" (UID: "3ab27c8a-19cb-461d-8055-71c04f10e553"). InnerVolumeSpecName "kube-api-access-jrcds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:16:41 crc kubenswrapper[4903]: I1202 23:16:41.313358 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab27c8a-19cb-461d-8055-71c04f10e553-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "3ab27c8a-19cb-461d-8055-71c04f10e553" (UID: "3ab27c8a-19cb-461d-8055-71c04f10e553"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:16:41 crc kubenswrapper[4903]: I1202 23:16:41.316235 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab27c8a-19cb-461d-8055-71c04f10e553-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3ab27c8a-19cb-461d-8055-71c04f10e553" (UID: "3ab27c8a-19cb-461d-8055-71c04f10e553"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:16:41 crc kubenswrapper[4903]: I1202 23:16:41.359366 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab27c8a-19cb-461d-8055-71c04f10e553-config-data" (OuterVolumeSpecName: "config-data") pod "3ab27c8a-19cb-461d-8055-71c04f10e553" (UID: "3ab27c8a-19cb-461d-8055-71c04f10e553"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:16:41 crc kubenswrapper[4903]: I1202 23:16:41.359461 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab27c8a-19cb-461d-8055-71c04f10e553-scripts" (OuterVolumeSpecName: "scripts") pod "3ab27c8a-19cb-461d-8055-71c04f10e553" (UID: "3ab27c8a-19cb-461d-8055-71c04f10e553"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:16:41 crc kubenswrapper[4903]: I1202 23:16:41.359413 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab27c8a-19cb-461d-8055-71c04f10e553-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ab27c8a-19cb-461d-8055-71c04f10e553" (UID: "3ab27c8a-19cb-461d-8055-71c04f10e553"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:16:41 crc kubenswrapper[4903]: I1202 23:16:41.408788 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ab27c8a-19cb-461d-8055-71c04f10e553-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:41 crc kubenswrapper[4903]: I1202 23:16:41.408841 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrcds\" (UniqueName: \"kubernetes.io/projected/3ab27c8a-19cb-461d-8055-71c04f10e553-kube-api-access-jrcds\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:41 crc kubenswrapper[4903]: I1202 23:16:41.408861 4903 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3ab27c8a-19cb-461d-8055-71c04f10e553-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:41 crc kubenswrapper[4903]: I1202 23:16:41.408900 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ab27c8a-19cb-461d-8055-71c04f10e553-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:41 crc kubenswrapper[4903]: I1202 23:16:41.408918 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ab27c8a-19cb-461d-8055-71c04f10e553-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:41 crc kubenswrapper[4903]: I1202 23:16:41.408933 4903 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3ab27c8a-19cb-461d-8055-71c04f10e553-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:41 crc kubenswrapper[4903]: I1202 23:16:41.435715 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-t42nb" Dec 02 23:16:41 crc kubenswrapper[4903]: I1202 23:16:41.435646 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-t42nb" event={"ID":"3ab27c8a-19cb-461d-8055-71c04f10e553","Type":"ContainerDied","Data":"89810c7f6f4530772e21850699dfa285bb6ca47563fe755b568baf1cec7cff97"} Dec 02 23:16:41 crc kubenswrapper[4903]: I1202 23:16:41.435847 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89810c7f6f4530772e21850699dfa285bb6ca47563fe755b568baf1cec7cff97" Dec 02 23:16:42 crc kubenswrapper[4903]: I1202 23:16:42.339101 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-t42nb"] Dec 02 23:16:42 crc kubenswrapper[4903]: I1202 23:16:42.350486 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-t42nb"] Dec 02 23:16:42 crc kubenswrapper[4903]: I1202 23:16:42.393382 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-bhns8"] Dec 02 23:16:42 crc kubenswrapper[4903]: E1202 23:16:42.393926 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ab27c8a-19cb-461d-8055-71c04f10e553" containerName="keystone-bootstrap" Dec 02 23:16:42 crc kubenswrapper[4903]: I1202 23:16:42.393949 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ab27c8a-19cb-461d-8055-71c04f10e553" containerName="keystone-bootstrap" Dec 02 23:16:42 crc kubenswrapper[4903]: I1202 23:16:42.394181 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ab27c8a-19cb-461d-8055-71c04f10e553" containerName="keystone-bootstrap" Dec 02 23:16:42 crc kubenswrapper[4903]: I1202 23:16:42.395547 4903 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/keystone-bootstrap-bhns8" Dec 02 23:16:42 crc kubenswrapper[4903]: I1202 23:16:42.399350 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 02 23:16:42 crc kubenswrapper[4903]: I1202 23:16:42.399368 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 02 23:16:42 crc kubenswrapper[4903]: I1202 23:16:42.399376 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bzv6f" Dec 02 23:16:42 crc kubenswrapper[4903]: I1202 23:16:42.399424 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 02 23:16:42 crc kubenswrapper[4903]: I1202 23:16:42.402188 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 02 23:16:42 crc kubenswrapper[4903]: I1202 23:16:42.420034 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bhns8"] Dec 02 23:16:42 crc kubenswrapper[4903]: I1202 23:16:42.534381 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976c6f69-733c-4046-85e0-d10c9d902a22-combined-ca-bundle\") pod \"keystone-bootstrap-bhns8\" (UID: \"976c6f69-733c-4046-85e0-d10c9d902a22\") " pod="openstack/keystone-bootstrap-bhns8" Dec 02 23:16:42 crc kubenswrapper[4903]: I1202 23:16:42.534522 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/976c6f69-733c-4046-85e0-d10c9d902a22-scripts\") pod \"keystone-bootstrap-bhns8\" (UID: \"976c6f69-733c-4046-85e0-d10c9d902a22\") " pod="openstack/keystone-bootstrap-bhns8" Dec 02 23:16:42 crc kubenswrapper[4903]: I1202 23:16:42.534563 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/976c6f69-733c-4046-85e0-d10c9d902a22-config-data\") pod \"keystone-bootstrap-bhns8\" (UID: \"976c6f69-733c-4046-85e0-d10c9d902a22\") " pod="openstack/keystone-bootstrap-bhns8" Dec 02 23:16:42 crc kubenswrapper[4903]: I1202 23:16:42.534800 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/976c6f69-733c-4046-85e0-d10c9d902a22-fernet-keys\") pod \"keystone-bootstrap-bhns8\" (UID: \"976c6f69-733c-4046-85e0-d10c9d902a22\") " pod="openstack/keystone-bootstrap-bhns8" Dec 02 23:16:42 crc kubenswrapper[4903]: I1202 23:16:42.534881 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/976c6f69-733c-4046-85e0-d10c9d902a22-credential-keys\") pod \"keystone-bootstrap-bhns8\" (UID: \"976c6f69-733c-4046-85e0-d10c9d902a22\") " pod="openstack/keystone-bootstrap-bhns8" Dec 02 23:16:42 crc kubenswrapper[4903]: I1202 23:16:42.535008 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrbn7\" (UniqueName: \"kubernetes.io/projected/976c6f69-733c-4046-85e0-d10c9d902a22-kube-api-access-xrbn7\") pod \"keystone-bootstrap-bhns8\" (UID: \"976c6f69-733c-4046-85e0-d10c9d902a22\") " pod="openstack/keystone-bootstrap-bhns8" Dec 02 23:16:42 crc kubenswrapper[4903]: I1202 23:16:42.637244 4903 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976c6f69-733c-4046-85e0-d10c9d902a22-combined-ca-bundle\") pod \"keystone-bootstrap-bhns8\" (UID: \"976c6f69-733c-4046-85e0-d10c9d902a22\") " pod="openstack/keystone-bootstrap-bhns8" Dec 02 23:16:42 crc kubenswrapper[4903]: I1202 23:16:42.637391 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/976c6f69-733c-4046-85e0-d10c9d902a22-scripts\") pod \"keystone-bootstrap-bhns8\" (UID: \"976c6f69-733c-4046-85e0-d10c9d902a22\") " pod="openstack/keystone-bootstrap-bhns8" Dec 02 23:16:42 crc kubenswrapper[4903]: I1202 23:16:42.637441 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/976c6f69-733c-4046-85e0-d10c9d902a22-config-data\") pod \"keystone-bootstrap-bhns8\" (UID: \"976c6f69-733c-4046-85e0-d10c9d902a22\") " pod="openstack/keystone-bootstrap-bhns8" Dec 02 23:16:42 crc kubenswrapper[4903]: I1202 23:16:42.637519 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/976c6f69-733c-4046-85e0-d10c9d902a22-fernet-keys\") pod \"keystone-bootstrap-bhns8\" (UID: \"976c6f69-733c-4046-85e0-d10c9d902a22\") " pod="openstack/keystone-bootstrap-bhns8" Dec 02 23:16:42 crc kubenswrapper[4903]: I1202 23:16:42.637559 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/976c6f69-733c-4046-85e0-d10c9d902a22-credential-keys\") pod \"keystone-bootstrap-bhns8\" (UID: \"976c6f69-733c-4046-85e0-d10c9d902a22\") " pod="openstack/keystone-bootstrap-bhns8" Dec 02 23:16:42 crc kubenswrapper[4903]: I1202 23:16:42.637597 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrbn7\" (UniqueName: \"kubernetes.io/projected/976c6f69-733c-4046-85e0-d10c9d902a22-kube-api-access-xrbn7\") pod \"keystone-bootstrap-bhns8\" (UID: \"976c6f69-733c-4046-85e0-d10c9d902a22\") " pod="openstack/keystone-bootstrap-bhns8" Dec 02 23:16:42 crc kubenswrapper[4903]: I1202 23:16:42.645203 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976c6f69-733c-4046-85e0-d10c9d902a22-combined-ca-bundle\") pod \"keystone-bootstrap-bhns8\" (UID: \"976c6f69-733c-4046-85e0-d10c9d902a22\") " pod="openstack/keystone-bootstrap-bhns8" Dec 02 23:16:42 crc kubenswrapper[4903]: I1202 23:16:42.645240 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/976c6f69-733c-4046-85e0-d10c9d902a22-scripts\") pod \"keystone-bootstrap-bhns8\" (UID: \"976c6f69-733c-4046-85e0-d10c9d902a22\") " pod="openstack/keystone-bootstrap-bhns8" Dec 02 23:16:42 crc kubenswrapper[4903]: I1202 23:16:42.645204 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/976c6f69-733c-4046-85e0-d10c9d902a22-fernet-keys\") pod \"keystone-bootstrap-bhns8\" (UID: \"976c6f69-733c-4046-85e0-d10c9d902a22\") " pod="openstack/keystone-bootstrap-bhns8" Dec 02 23:16:42 crc kubenswrapper[4903]: I1202 23:16:42.645389 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/976c6f69-733c-4046-85e0-d10c9d902a22-config-data\") pod \"keystone-bootstrap-bhns8\" (UID: 
\"976c6f69-733c-4046-85e0-d10c9d902a22\") " pod="openstack/keystone-bootstrap-bhns8" Dec 02 23:16:42 crc kubenswrapper[4903]: I1202 23:16:42.645739 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/976c6f69-733c-4046-85e0-d10c9d902a22-credential-keys\") pod \"keystone-bootstrap-bhns8\" (UID: \"976c6f69-733c-4046-85e0-d10c9d902a22\") " pod="openstack/keystone-bootstrap-bhns8" Dec 02 23:16:42 crc kubenswrapper[4903]: I1202 23:16:42.653762 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrbn7\" (UniqueName: \"kubernetes.io/projected/976c6f69-733c-4046-85e0-d10c9d902a22-kube-api-access-xrbn7\") pod \"keystone-bootstrap-bhns8\" (UID: \"976c6f69-733c-4046-85e0-d10c9d902a22\") " pod="openstack/keystone-bootstrap-bhns8" Dec 02 23:16:42 crc kubenswrapper[4903]: I1202 23:16:42.718799 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bhns8" Dec 02 23:16:43 crc kubenswrapper[4903]: I1202 23:16:43.621544 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab27c8a-19cb-461d-8055-71c04f10e553" path="/var/lib/kubelet/pods/3ab27c8a-19cb-461d-8055-71c04f10e553/volumes" Dec 02 23:16:44 crc kubenswrapper[4903]: I1202 23:16:44.466348 4903 generic.go:334] "Generic (PLEG): container finished" podID="9f91dd36-7138-43f1-8091-cca7410520cf" containerID="b1e2d008ad53752077e45a6af27982f4a516161fb2dd9d442aa386d308ca3f1d" exitCode=0 Dec 02 23:16:44 crc kubenswrapper[4903]: I1202 23:16:44.466455 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-w9hq8" event={"ID":"9f91dd36-7138-43f1-8091-cca7410520cf","Type":"ContainerDied","Data":"b1e2d008ad53752077e45a6af27982f4a516161fb2dd9d442aa386d308ca3f1d"} Dec 02 23:16:48 crc kubenswrapper[4903]: I1202 23:16:48.139639 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7d7cc4fbd9-qf9dk" podUID="8f7e4993-0dd2-413f-bd55-631ee71946c4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: i/o timeout" Dec 02 23:16:53 crc kubenswrapper[4903]: I1202 23:16:53.070286 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:16:53 crc kubenswrapper[4903]: I1202 23:16:53.070716 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:16:53 crc kubenswrapper[4903]: I1202 23:16:53.070772 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" Dec 02 23:16:53 crc kubenswrapper[4903]: I1202 23:16:53.071674 4903 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3aacdc4cb2887eb67cb12a1770fb101e2569b94da3ca1d528e9eafde11a8e5a7"} pod="openshift-machine-config-operator/machine-config-daemon-snl4q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 23:16:53 crc 
kubenswrapper[4903]: I1202 23:16:53.071742 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" containerID="cri-o://3aacdc4cb2887eb67cb12a1770fb101e2569b94da3ca1d528e9eafde11a8e5a7" gracePeriod=600 Dec 02 23:16:53 crc kubenswrapper[4903]: I1202 23:16:53.140477 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7d7cc4fbd9-qf9dk" podUID="8f7e4993-0dd2-413f-bd55-631ee71946c4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: i/o timeout" Dec 02 23:16:54 crc kubenswrapper[4903]: I1202 23:16:54.567780 4903 generic.go:334] "Generic (PLEG): container finished" podID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerID="3aacdc4cb2887eb67cb12a1770fb101e2569b94da3ca1d528e9eafde11a8e5a7" exitCode=0 Dec 02 23:16:54 crc kubenswrapper[4903]: I1202 23:16:54.567815 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerDied","Data":"3aacdc4cb2887eb67cb12a1770fb101e2569b94da3ca1d528e9eafde11a8e5a7"} Dec 02 23:16:54 crc kubenswrapper[4903]: I1202 23:16:54.568217 4903 scope.go:117] "RemoveContainer" containerID="b388d9306dbbf4e8d5e39ed68ca73568f5d6116247546f21f87b8605b912cb3c" Dec 02 23:16:55 crc kubenswrapper[4903]: I1202 23:16:55.745326 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6844b8f8f-qb7dk" Dec 02 23:16:55 crc kubenswrapper[4903]: I1202 23:16:55.752433 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55dff86b95-mnlkl" Dec 02 23:16:55 crc kubenswrapper[4903]: I1202 23:16:55.800441 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/caa48302-9f85-4c4c-b61a-e2b0847d425a-scripts\") pod \"caa48302-9f85-4c4c-b61a-e2b0847d425a\" (UID: \"caa48302-9f85-4c4c-b61a-e2b0847d425a\") " Dec 02 23:16:55 crc kubenswrapper[4903]: I1202 23:16:55.800504 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1b67e826-a529-4e61-af56-58f80fdc251c-scripts\") pod \"1b67e826-a529-4e61-af56-58f80fdc251c\" (UID: \"1b67e826-a529-4e61-af56-58f80fdc251c\") " Dec 02 23:16:55 crc kubenswrapper[4903]: I1202 23:16:55.800549 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/caa48302-9f85-4c4c-b61a-e2b0847d425a-horizon-secret-key\") pod \"caa48302-9f85-4c4c-b61a-e2b0847d425a\" (UID: \"caa48302-9f85-4c4c-b61a-e2b0847d425a\") " Dec 02 23:16:55 crc kubenswrapper[4903]: I1202 23:16:55.800616 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1b67e826-a529-4e61-af56-58f80fdc251c-config-data\") pod \"1b67e826-a529-4e61-af56-58f80fdc251c\" (UID: \"1b67e826-a529-4e61-af56-58f80fdc251c\") " Dec 02 23:16:55 crc kubenswrapper[4903]: I1202 23:16:55.800682 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1b67e826-a529-4e61-af56-58f80fdc251c-horizon-secret-key\") pod \"1b67e826-a529-4e61-af56-58f80fdc251c\" (UID: 
\"1b67e826-a529-4e61-af56-58f80fdc251c\") " Dec 02 23:16:55 crc kubenswrapper[4903]: I1202 23:16:55.800698 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prw5t\" (UniqueName: \"kubernetes.io/projected/caa48302-9f85-4c4c-b61a-e2b0847d425a-kube-api-access-prw5t\") pod \"caa48302-9f85-4c4c-b61a-e2b0847d425a\" (UID: \"caa48302-9f85-4c4c-b61a-e2b0847d425a\") " Dec 02 23:16:55 crc kubenswrapper[4903]: I1202 23:16:55.800830 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/caa48302-9f85-4c4c-b61a-e2b0847d425a-logs\") pod \"caa48302-9f85-4c4c-b61a-e2b0847d425a\" (UID: \"caa48302-9f85-4c4c-b61a-e2b0847d425a\") " Dec 02 23:16:55 crc kubenswrapper[4903]: I1202 23:16:55.800881 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/caa48302-9f85-4c4c-b61a-e2b0847d425a-config-data\") pod \"caa48302-9f85-4c4c-b61a-e2b0847d425a\" (UID: \"caa48302-9f85-4c4c-b61a-e2b0847d425a\") " Dec 02 23:16:55 crc kubenswrapper[4903]: I1202 23:16:55.800902 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b67e826-a529-4e61-af56-58f80fdc251c-logs\") pod \"1b67e826-a529-4e61-af56-58f80fdc251c\" (UID: \"1b67e826-a529-4e61-af56-58f80fdc251c\") " Dec 02 23:16:55 crc kubenswrapper[4903]: I1202 23:16:55.800957 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bggqm\" (UniqueName: \"kubernetes.io/projected/1b67e826-a529-4e61-af56-58f80fdc251c-kube-api-access-bggqm\") pod \"1b67e826-a529-4e61-af56-58f80fdc251c\" (UID: \"1b67e826-a529-4e61-af56-58f80fdc251c\") " Dec 02 23:16:55 crc kubenswrapper[4903]: I1202 23:16:55.801300 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caa48302-9f85-4c4c-b61a-e2b0847d425a-scripts" (OuterVolumeSpecName: "scripts") pod "caa48302-9f85-4c4c-b61a-e2b0847d425a" (UID: "caa48302-9f85-4c4c-b61a-e2b0847d425a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:16:55 crc kubenswrapper[4903]: I1202 23:16:55.801334 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b67e826-a529-4e61-af56-58f80fdc251c-scripts" (OuterVolumeSpecName: "scripts") pod "1b67e826-a529-4e61-af56-58f80fdc251c" (UID: "1b67e826-a529-4e61-af56-58f80fdc251c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:16:55 crc kubenswrapper[4903]: I1202 23:16:55.801494 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caa48302-9f85-4c4c-b61a-e2b0847d425a-config-data" (OuterVolumeSpecName: "config-data") pod "caa48302-9f85-4c4c-b61a-e2b0847d425a" (UID: "caa48302-9f85-4c4c-b61a-e2b0847d425a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:16:55 crc kubenswrapper[4903]: I1202 23:16:55.801998 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b67e826-a529-4e61-af56-58f80fdc251c-config-data" (OuterVolumeSpecName: "config-data") pod "1b67e826-a529-4e61-af56-58f80fdc251c" (UID: "1b67e826-a529-4e61-af56-58f80fdc251c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:16:55 crc kubenswrapper[4903]: I1202 23:16:55.802091 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/caa48302-9f85-4c4c-b61a-e2b0847d425a-logs" (OuterVolumeSpecName: "logs") pod "caa48302-9f85-4c4c-b61a-e2b0847d425a" (UID: "caa48302-9f85-4c4c-b61a-e2b0847d425a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:16:55 crc kubenswrapper[4903]: I1202 23:16:55.802489 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b67e826-a529-4e61-af56-58f80fdc251c-logs" (OuterVolumeSpecName: "logs") pod "1b67e826-a529-4e61-af56-58f80fdc251c" (UID: "1b67e826-a529-4e61-af56-58f80fdc251c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:16:55 crc kubenswrapper[4903]: I1202 23:16:55.806798 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b67e826-a529-4e61-af56-58f80fdc251c-kube-api-access-bggqm" (OuterVolumeSpecName: "kube-api-access-bggqm") pod "1b67e826-a529-4e61-af56-58f80fdc251c" (UID: "1b67e826-a529-4e61-af56-58f80fdc251c"). InnerVolumeSpecName "kube-api-access-bggqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:16:55 crc kubenswrapper[4903]: I1202 23:16:55.806831 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caa48302-9f85-4c4c-b61a-e2b0847d425a-kube-api-access-prw5t" (OuterVolumeSpecName: "kube-api-access-prw5t") pod "caa48302-9f85-4c4c-b61a-e2b0847d425a" (UID: "caa48302-9f85-4c4c-b61a-e2b0847d425a"). InnerVolumeSpecName "kube-api-access-prw5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:16:55 crc kubenswrapper[4903]: I1202 23:16:55.818289 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caa48302-9f85-4c4c-b61a-e2b0847d425a-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "caa48302-9f85-4c4c-b61a-e2b0847d425a" (UID: "caa48302-9f85-4c4c-b61a-e2b0847d425a"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:16:55 crc kubenswrapper[4903]: I1202 23:16:55.818596 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b67e826-a529-4e61-af56-58f80fdc251c-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "1b67e826-a529-4e61-af56-58f80fdc251c" (UID: "1b67e826-a529-4e61-af56-58f80fdc251c"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:16:55 crc kubenswrapper[4903]: I1202 23:16:55.902706 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bggqm\" (UniqueName: \"kubernetes.io/projected/1b67e826-a529-4e61-af56-58f80fdc251c-kube-api-access-bggqm\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:55 crc kubenswrapper[4903]: I1202 23:16:55.902742 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/caa48302-9f85-4c4c-b61a-e2b0847d425a-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:55 crc kubenswrapper[4903]: I1202 23:16:55.902751 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1b67e826-a529-4e61-af56-58f80fdc251c-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:55 crc kubenswrapper[4903]: I1202 23:16:55.902760 4903 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/caa48302-9f85-4c4c-b61a-e2b0847d425a-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:55 crc kubenswrapper[4903]: I1202 23:16:55.902768 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1b67e826-a529-4e61-af56-58f80fdc251c-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:55 crc kubenswrapper[4903]: I1202 23:16:55.902776 4903 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1b67e826-a529-4e61-af56-58f80fdc251c-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:55 crc kubenswrapper[4903]: I1202 23:16:55.902784 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prw5t\" (UniqueName: \"kubernetes.io/projected/caa48302-9f85-4c4c-b61a-e2b0847d425a-kube-api-access-prw5t\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:55 crc kubenswrapper[4903]: I1202 23:16:55.902791 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/caa48302-9f85-4c4c-b61a-e2b0847d425a-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:55 crc kubenswrapper[4903]: I1202 23:16:55.902801 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/caa48302-9f85-4c4c-b61a-e2b0847d425a-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:55 crc kubenswrapper[4903]: I1202 23:16:55.902810 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b67e826-a529-4e61-af56-58f80fdc251c-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:56 crc kubenswrapper[4903]: E1202 23:16:56.262906 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.2:5001/podified-master-centos10/openstack-barbican-api:watcher_latest" Dec 02 23:16:56 crc kubenswrapper[4903]: E1202 23:16:56.262973 4903 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.2:5001/podified-master-centos10/openstack-barbican-api:watcher_latest" Dec 02 23:16:56 crc kubenswrapper[4903]: E1202 23:16:56.263123 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:38.102.83.2:5001/podified-master-centos10/openstack-barbican-api:watcher_latest,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pzbg4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-tv6h2_openstack(98900f75-26e7-46cb-a70e-537fa0486fe8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 23:16:56 crc kubenswrapper[4903]: E1202 23:16:56.264294 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-tv6h2" podUID="98900f75-26e7-46cb-a70e-537fa0486fe8" Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.345974 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d7cc4fbd9-qf9dk" Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.352275 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-w9hq8" Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.411564 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f91dd36-7138-43f1-8091-cca7410520cf-combined-ca-bundle\") pod \"9f91dd36-7138-43f1-8091-cca7410520cf\" (UID: \"9f91dd36-7138-43f1-8091-cca7410520cf\") " Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.411635 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msxcn\" (UniqueName: \"kubernetes.io/projected/9f91dd36-7138-43f1-8091-cca7410520cf-kube-api-access-msxcn\") pod \"9f91dd36-7138-43f1-8091-cca7410520cf\" (UID: \"9f91dd36-7138-43f1-8091-cca7410520cf\") " Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.411682 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f7e4993-0dd2-413f-bd55-631ee71946c4-dns-svc\") pod \"8f7e4993-0dd2-413f-bd55-631ee71946c4\" (UID: \"8f7e4993-0dd2-413f-bd55-631ee71946c4\") " Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.411718 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8f7e4993-0dd2-413f-bd55-631ee71946c4-ovsdbserver-nb\") pod \"8f7e4993-0dd2-413f-bd55-631ee71946c4\" (UID: \"8f7e4993-0dd2-413f-bd55-631ee71946c4\") " Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.411866 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwndx\" (UniqueName: \"kubernetes.io/projected/8f7e4993-0dd2-413f-bd55-631ee71946c4-kube-api-access-mwndx\") pod \"8f7e4993-0dd2-413f-bd55-631ee71946c4\" (UID: \"8f7e4993-0dd2-413f-bd55-631ee71946c4\") " Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.411899 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8f7e4993-0dd2-413f-bd55-631ee71946c4-ovsdbserver-sb\") pod \"8f7e4993-0dd2-413f-bd55-631ee71946c4\" (UID: \"8f7e4993-0dd2-413f-bd55-631ee71946c4\") " Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.412020 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f7e4993-0dd2-413f-bd55-631ee71946c4-config\") pod \"8f7e4993-0dd2-413f-bd55-631ee71946c4\" (UID: \"8f7e4993-0dd2-413f-bd55-631ee71946c4\") " Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.412058 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8f7e4993-0dd2-413f-bd55-631ee71946c4-dns-swift-storage-0\") pod \"8f7e4993-0dd2-413f-bd55-631ee71946c4\" (UID: \"8f7e4993-0dd2-413f-bd55-631ee71946c4\") " Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.412085 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f91dd36-7138-43f1-8091-cca7410520cf-config\") pod \"9f91dd36-7138-43f1-8091-cca7410520cf\" (UID: \"9f91dd36-7138-43f1-8091-cca7410520cf\") " Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.422757 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f7e4993-0dd2-413f-bd55-631ee71946c4-kube-api-access-mwndx" (OuterVolumeSpecName: "kube-api-access-mwndx") pod 
"8f7e4993-0dd2-413f-bd55-631ee71946c4" (UID: "8f7e4993-0dd2-413f-bd55-631ee71946c4"). InnerVolumeSpecName "kube-api-access-mwndx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.423075 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f91dd36-7138-43f1-8091-cca7410520cf-kube-api-access-msxcn" (OuterVolumeSpecName: "kube-api-access-msxcn") pod "9f91dd36-7138-43f1-8091-cca7410520cf" (UID: "9f91dd36-7138-43f1-8091-cca7410520cf"). InnerVolumeSpecName "kube-api-access-msxcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.462788 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f91dd36-7138-43f1-8091-cca7410520cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f91dd36-7138-43f1-8091-cca7410520cf" (UID: "9f91dd36-7138-43f1-8091-cca7410520cf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.467136 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f7e4993-0dd2-413f-bd55-631ee71946c4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8f7e4993-0dd2-413f-bd55-631ee71946c4" (UID: "8f7e4993-0dd2-413f-bd55-631ee71946c4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.477862 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f91dd36-7138-43f1-8091-cca7410520cf-config" (OuterVolumeSpecName: "config") pod "9f91dd36-7138-43f1-8091-cca7410520cf" (UID: "9f91dd36-7138-43f1-8091-cca7410520cf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.478261 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f7e4993-0dd2-413f-bd55-631ee71946c4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8f7e4993-0dd2-413f-bd55-631ee71946c4" (UID: "8f7e4993-0dd2-413f-bd55-631ee71946c4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.484450 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f7e4993-0dd2-413f-bd55-631ee71946c4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8f7e4993-0dd2-413f-bd55-631ee71946c4" (UID: "8f7e4993-0dd2-413f-bd55-631ee71946c4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.495212 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f7e4993-0dd2-413f-bd55-631ee71946c4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8f7e4993-0dd2-413f-bd55-631ee71946c4" (UID: "8f7e4993-0dd2-413f-bd55-631ee71946c4"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.497304 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f7e4993-0dd2-413f-bd55-631ee71946c4-config" (OuterVolumeSpecName: "config") pod "8f7e4993-0dd2-413f-bd55-631ee71946c4" (UID: "8f7e4993-0dd2-413f-bd55-631ee71946c4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.516501 4903 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f7e4993-0dd2-413f-bd55-631ee71946c4-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.516543 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8f7e4993-0dd2-413f-bd55-631ee71946c4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.516561 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwndx\" (UniqueName: \"kubernetes.io/projected/8f7e4993-0dd2-413f-bd55-631ee71946c4-kube-api-access-mwndx\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.516572 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8f7e4993-0dd2-413f-bd55-631ee71946c4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.516583 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f7e4993-0dd2-413f-bd55-631ee71946c4-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.516593 4903 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8f7e4993-0dd2-413f-bd55-631ee71946c4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.516605 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f91dd36-7138-43f1-8091-cca7410520cf-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.516616 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f91dd36-7138-43f1-8091-cca7410520cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.516627 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msxcn\" (UniqueName: \"kubernetes.io/projected/9f91dd36-7138-43f1-8091-cca7410520cf-kube-api-access-msxcn\") on node \"crc\" DevicePath \"\"" Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.588248 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55dff86b95-mnlkl" event={"ID":"caa48302-9f85-4c4c-b61a-e2b0847d425a","Type":"ContainerDied","Data":"2dbbc01f39e12b4cc23da4d6bf6248807a887cfc1fa0377fcf9022cd7f66f71c"} Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.588256 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55dff86b95-mnlkl" Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.598218 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6844b8f8f-qb7dk" Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.598220 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6844b8f8f-qb7dk" event={"ID":"1b67e826-a529-4e61-af56-58f80fdc251c","Type":"ContainerDied","Data":"786d7714453437e5026249abd6262e19b9507f861b797465d68e5a38f91f7024"} Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.602361 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-w9hq8" event={"ID":"9f91dd36-7138-43f1-8091-cca7410520cf","Type":"ContainerDied","Data":"70f0d29946f6c4bdc2122a20e5eadb3cf62e01b3534c78f5e16511310f73d4d5"} Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.602396 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70f0d29946f6c4bdc2122a20e5eadb3cf62e01b3534c78f5e16511310f73d4d5" Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.602471 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-w9hq8" Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.606984 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d7cc4fbd9-qf9dk" Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.607022 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d7cc4fbd9-qf9dk" event={"ID":"8f7e4993-0dd2-413f-bd55-631ee71946c4","Type":"ContainerDied","Data":"c799b5ccd4ac2748e403f7dc781e36656dce5f6d86361e434d5d1c58910683d3"} Dec 02 23:16:56 crc kubenswrapper[4903]: E1202 23:16:56.608211 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.2:5001/podified-master-centos10/openstack-barbican-api:watcher_latest\\\"\"" pod="openstack/barbican-db-sync-tv6h2" podUID="98900f75-26e7-46cb-a70e-537fa0486fe8" Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.695982 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6844b8f8f-qb7dk"] Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.706711 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6844b8f8f-qb7dk"] Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.743515 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-55dff86b95-mnlkl"] Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.758741 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-55dff86b95-mnlkl"] Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.764962 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d7cc4fbd9-qf9dk"] Dec 02 23:16:56 crc kubenswrapper[4903]: I1202 23:16:56.771062 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d7cc4fbd9-qf9dk"] Dec 02 23:16:57 crc kubenswrapper[4903]: E1202 23:16:57.615098 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.2:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Dec 02 23:16:57 crc kubenswrapper[4903]: E1202 23:16:57.615366 4903 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.2:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Dec 02 23:16:57 crc kubenswrapper[4903]: E1202 
23:16:57.615492 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:38.102.83.2:5001/podified-master-centos10/openstack-cinder-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ctbjx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-nsw82_openstack(ecdb8e0b-8b04-4dc5-b532-8e68e8206122): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 23:16:57 crc kubenswrapper[4903]: E1202 23:16:57.617725 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-nsw82" podUID="ecdb8e0b-8b04-4dc5-b532-8e68e8206122" Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.630093 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b67e826-a529-4e61-af56-58f80fdc251c" path="/var/lib/kubelet/pods/1b67e826-a529-4e61-af56-58f80fdc251c/volumes" Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.630620 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f7e4993-0dd2-413f-bd55-631ee71946c4" path="/var/lib/kubelet/pods/8f7e4993-0dd2-413f-bd55-631ee71946c4/volumes" Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 
23:16:57.631624 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caa48302-9f85-4c4c-b61a-e2b0847d425a" path="/var/lib/kubelet/pods/caa48302-9f85-4c4c-b61a-e2b0847d425a/volumes" Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.701344 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76ff5f5497-x5nsj"] Dec 02 23:16:57 crc kubenswrapper[4903]: E1202 23:16:57.701736 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f91dd36-7138-43f1-8091-cca7410520cf" containerName="neutron-db-sync" Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.701748 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f91dd36-7138-43f1-8091-cca7410520cf" containerName="neutron-db-sync" Dec 02 23:16:57 crc kubenswrapper[4903]: E1202 23:16:57.701773 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f7e4993-0dd2-413f-bd55-631ee71946c4" containerName="dnsmasq-dns" Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.701779 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f7e4993-0dd2-413f-bd55-631ee71946c4" containerName="dnsmasq-dns" Dec 02 23:16:57 crc kubenswrapper[4903]: E1202 23:16:57.701792 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f7e4993-0dd2-413f-bd55-631ee71946c4" containerName="init" Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.701798 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f7e4993-0dd2-413f-bd55-631ee71946c4" containerName="init" Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.701945 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f91dd36-7138-43f1-8091-cca7410520cf" containerName="neutron-db-sync" Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.701959 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f7e4993-0dd2-413f-bd55-631ee71946c4" containerName="dnsmasq-dns" Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.702919 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76ff5f5497-x5nsj" Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.715509 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76ff5f5497-x5nsj"] Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.752143 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8af477b6-59d4-4909-9cb6-b9e61f75bd96-ovsdbserver-nb\") pod \"dnsmasq-dns-76ff5f5497-x5nsj\" (UID: \"8af477b6-59d4-4909-9cb6-b9e61f75bd96\") " pod="openstack/dnsmasq-dns-76ff5f5497-x5nsj" Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.752274 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8af477b6-59d4-4909-9cb6-b9e61f75bd96-config\") pod \"dnsmasq-dns-76ff5f5497-x5nsj\" (UID: \"8af477b6-59d4-4909-9cb6-b9e61f75bd96\") " pod="openstack/dnsmasq-dns-76ff5f5497-x5nsj" Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.752311 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8af477b6-59d4-4909-9cb6-b9e61f75bd96-dns-svc\") pod \"dnsmasq-dns-76ff5f5497-x5nsj\" (UID: \"8af477b6-59d4-4909-9cb6-b9e61f75bd96\") " pod="openstack/dnsmasq-dns-76ff5f5497-x5nsj" Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.752343 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c97v5\" (UniqueName: \"kubernetes.io/projected/8af477b6-59d4-4909-9cb6-b9e61f75bd96-kube-api-access-c97v5\") pod \"dnsmasq-dns-76ff5f5497-x5nsj\" (UID: \"8af477b6-59d4-4909-9cb6-b9e61f75bd96\") " pod="openstack/dnsmasq-dns-76ff5f5497-x5nsj" Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.752377 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8af477b6-59d4-4909-9cb6-b9e61f75bd96-ovsdbserver-sb\") pod \"dnsmasq-dns-76ff5f5497-x5nsj\" (UID: \"8af477b6-59d4-4909-9cb6-b9e61f75bd96\") " pod="openstack/dnsmasq-dns-76ff5f5497-x5nsj" Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.752413 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8af477b6-59d4-4909-9cb6-b9e61f75bd96-dns-swift-storage-0\") pod \"dnsmasq-dns-76ff5f5497-x5nsj\" (UID: \"8af477b6-59d4-4909-9cb6-b9e61f75bd96\") " pod="openstack/dnsmasq-dns-76ff5f5497-x5nsj" Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.854713 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8af477b6-59d4-4909-9cb6-b9e61f75bd96-config\") pod \"dnsmasq-dns-76ff5f5497-x5nsj\" (UID: \"8af477b6-59d4-4909-9cb6-b9e61f75bd96\") " pod="openstack/dnsmasq-dns-76ff5f5497-x5nsj" Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.854763 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8af477b6-59d4-4909-9cb6-b9e61f75bd96-dns-svc\") pod \"dnsmasq-dns-76ff5f5497-x5nsj\" (UID: \"8af477b6-59d4-4909-9cb6-b9e61f75bd96\") " pod="openstack/dnsmasq-dns-76ff5f5497-x5nsj" Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.854785 4903 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-c97v5\" (UniqueName: \"kubernetes.io/projected/8af477b6-59d4-4909-9cb6-b9e61f75bd96-kube-api-access-c97v5\") pod \"dnsmasq-dns-76ff5f5497-x5nsj\" (UID: \"8af477b6-59d4-4909-9cb6-b9e61f75bd96\") " pod="openstack/dnsmasq-dns-76ff5f5497-x5nsj" Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.854829 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8af477b6-59d4-4909-9cb6-b9e61f75bd96-ovsdbserver-sb\") pod \"dnsmasq-dns-76ff5f5497-x5nsj\" (UID: \"8af477b6-59d4-4909-9cb6-b9e61f75bd96\") " pod="openstack/dnsmasq-dns-76ff5f5497-x5nsj" Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.854856 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8af477b6-59d4-4909-9cb6-b9e61f75bd96-dns-swift-storage-0\") pod \"dnsmasq-dns-76ff5f5497-x5nsj\" (UID: \"8af477b6-59d4-4909-9cb6-b9e61f75bd96\") " pod="openstack/dnsmasq-dns-76ff5f5497-x5nsj" Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.854904 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8af477b6-59d4-4909-9cb6-b9e61f75bd96-ovsdbserver-nb\") pod \"dnsmasq-dns-76ff5f5497-x5nsj\" (UID: \"8af477b6-59d4-4909-9cb6-b9e61f75bd96\") " pod="openstack/dnsmasq-dns-76ff5f5497-x5nsj" Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.855774 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8af477b6-59d4-4909-9cb6-b9e61f75bd96-ovsdbserver-nb\") pod \"dnsmasq-dns-76ff5f5497-x5nsj\" (UID: \"8af477b6-59d4-4909-9cb6-b9e61f75bd96\") " pod="openstack/dnsmasq-dns-76ff5f5497-x5nsj" Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.855802 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8af477b6-59d4-4909-9cb6-b9e61f75bd96-dns-svc\") pod \"dnsmasq-dns-76ff5f5497-x5nsj\" (UID: \"8af477b6-59d4-4909-9cb6-b9e61f75bd96\") " pod="openstack/dnsmasq-dns-76ff5f5497-x5nsj" Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.856378 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8af477b6-59d4-4909-9cb6-b9e61f75bd96-ovsdbserver-sb\") pod \"dnsmasq-dns-76ff5f5497-x5nsj\" (UID: \"8af477b6-59d4-4909-9cb6-b9e61f75bd96\") " pod="openstack/dnsmasq-dns-76ff5f5497-x5nsj" Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.856795 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8af477b6-59d4-4909-9cb6-b9e61f75bd96-config\") pod \"dnsmasq-dns-76ff5f5497-x5nsj\" (UID: \"8af477b6-59d4-4909-9cb6-b9e61f75bd96\") " pod="openstack/dnsmasq-dns-76ff5f5497-x5nsj" Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.856896 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8af477b6-59d4-4909-9cb6-b9e61f75bd96-dns-swift-storage-0\") pod \"dnsmasq-dns-76ff5f5497-x5nsj\" (UID: \"8af477b6-59d4-4909-9cb6-b9e61f75bd96\") " pod="openstack/dnsmasq-dns-76ff5f5497-x5nsj" Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.888825 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c97v5\" (UniqueName: 
\"kubernetes.io/projected/8af477b6-59d4-4909-9cb6-b9e61f75bd96-kube-api-access-c97v5\") pod \"dnsmasq-dns-76ff5f5497-x5nsj\" (UID: \"8af477b6-59d4-4909-9cb6-b9e61f75bd96\") " pod="openstack/dnsmasq-dns-76ff5f5497-x5nsj" Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.900861 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-77d59fbfc6-p8t8m"] Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.902504 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-77d59fbfc6-p8t8m" Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.904959 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.905130 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.905259 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.905336 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-46qz6" Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.912491 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-77d59fbfc6-p8t8m"] Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.956412 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8-config\") pod \"neutron-77d59fbfc6-p8t8m\" (UID: \"b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8\") " pod="openstack/neutron-77d59fbfc6-p8t8m" Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.956454 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkfwt\" (UniqueName: \"kubernetes.io/projected/b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8-kube-api-access-xkfwt\") pod \"neutron-77d59fbfc6-p8t8m\" (UID: \"b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8\") " pod="openstack/neutron-77d59fbfc6-p8t8m" Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.956496 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8-ovndb-tls-certs\") pod \"neutron-77d59fbfc6-p8t8m\" (UID: \"b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8\") " pod="openstack/neutron-77d59fbfc6-p8t8m" Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.956590 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8-httpd-config\") pod \"neutron-77d59fbfc6-p8t8m\" (UID: \"b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8\") " pod="openstack/neutron-77d59fbfc6-p8t8m" Dec 02 23:16:57 crc kubenswrapper[4903]: I1202 23:16:57.956607 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8-combined-ca-bundle\") pod \"neutron-77d59fbfc6-p8t8m\" (UID: \"b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8\") " pod="openstack/neutron-77d59fbfc6-p8t8m" Dec 02 23:16:58 crc kubenswrapper[4903]: I1202 23:16:58.058349 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8-config\") pod \"neutron-77d59fbfc6-p8t8m\" (UID: \"b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8\") " pod="openstack/neutron-77d59fbfc6-p8t8m" Dec 02 23:16:58 crc kubenswrapper[4903]: I1202 23:16:58.058401 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkfwt\" (UniqueName: \"kubernetes.io/projected/b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8-kube-api-access-xkfwt\") pod \"neutron-77d59fbfc6-p8t8m\" (UID: \"b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8\") " pod="openstack/neutron-77d59fbfc6-p8t8m" Dec 02 23:16:58 crc kubenswrapper[4903]: I1202 23:16:58.058437 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8-ovndb-tls-certs\") pod \"neutron-77d59fbfc6-p8t8m\" (UID: \"b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8\") " pod="openstack/neutron-77d59fbfc6-p8t8m" Dec 02 23:16:58 crc kubenswrapper[4903]: I1202 23:16:58.058574 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8-httpd-config\") pod \"neutron-77d59fbfc6-p8t8m\" (UID: \"b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8\") " pod="openstack/neutron-77d59fbfc6-p8t8m" Dec 02 23:16:58 crc kubenswrapper[4903]: I1202 23:16:58.058597 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8-combined-ca-bundle\") pod \"neutron-77d59fbfc6-p8t8m\" (UID: \"b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8\") " pod="openstack/neutron-77d59fbfc6-p8t8m" Dec 02 23:16:58 crc kubenswrapper[4903]: I1202 23:16:58.062489 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8-combined-ca-bundle\") pod \"neutron-77d59fbfc6-p8t8m\" (UID: \"b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8\") " pod="openstack/neutron-77d59fbfc6-p8t8m" Dec 02 23:16:58 crc kubenswrapper[4903]: I1202 23:16:58.062731 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8-config\") pod \"neutron-77d59fbfc6-p8t8m\" (UID: \"b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8\") " pod="openstack/neutron-77d59fbfc6-p8t8m" Dec 02 23:16:58 crc kubenswrapper[4903]: I1202 23:16:58.063179 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8-httpd-config\") pod \"neutron-77d59fbfc6-p8t8m\" (UID: \"b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8\") " pod="openstack/neutron-77d59fbfc6-p8t8m" Dec 02 23:16:58 crc kubenswrapper[4903]: I1202 23:16:58.073449 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8-ovndb-tls-certs\") pod \"neutron-77d59fbfc6-p8t8m\" (UID: \"b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8\") " pod="openstack/neutron-77d59fbfc6-p8t8m" Dec 02 23:16:58 crc kubenswrapper[4903]: I1202 23:16:58.073983 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkfwt\" (UniqueName: \"kubernetes.io/projected/b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8-kube-api-access-xkfwt\") pod \"neutron-77d59fbfc6-p8t8m\" (UID: \"b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8\") " 
pod="openstack/neutron-77d59fbfc6-p8t8m" Dec 02 23:16:58 crc kubenswrapper[4903]: I1202 23:16:58.118132 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76ff5f5497-x5nsj" Dec 02 23:16:58 crc kubenswrapper[4903]: I1202 23:16:58.141192 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7d7cc4fbd9-qf9dk" podUID="8f7e4993-0dd2-413f-bd55-631ee71946c4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: i/o timeout" Dec 02 23:16:58 crc kubenswrapper[4903]: E1202 23:16:58.169084 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.2:5001/podified-master-centos10/openstack-ceilometer-notification:watcher_latest" Dec 02 23:16:58 crc kubenswrapper[4903]: E1202 23:16:58.169140 4903 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.2:5001/podified-master-centos10/openstack-ceilometer-notification:watcher_latest" Dec 02 23:16:58 crc kubenswrapper[4903]: E1202 23:16:58.169271 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-notification-agent,Image:38.102.83.2:5001/podified-master-centos10/openstack-ceilometer-notification:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n97h597h566h56bhbh85h654h6bh547hcfh57fh697h5c4hd6h5dfh56dhf4h67bh8ch576h56bh578h5c6hd6h95h5cfh587h544hdbh5ffh5b5h65cq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-notification-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l9nmt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/notificationhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(1a6cd769-825e-4700-a66b-87291af7f897): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 23:16:58 crc kubenswrapper[4903]: I1202 23:16:58.239279 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-77d59fbfc6-p8t8m" Dec 02 23:16:58 crc kubenswrapper[4903]: I1202 23:16:58.320603 4903 scope.go:117] "RemoveContainer" containerID="5415f53c54ee851d0c151c81707ee43d45c23b497ae231fcc5eb477aa9a61b99" Dec 02 23:16:58 crc kubenswrapper[4903]: I1202 23:16:58.461694 4903 scope.go:117] "RemoveContainer" containerID="bb45ac70d3a4787c1f5ead0fa22243cc6dde0eb12bd8b6194ca868905447a0bd" Dec 02 23:16:58 crc kubenswrapper[4903]: E1202 23:16:58.730354 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.2:5001/podified-master-centos10/openstack-cinder-api:watcher_latest\\\"\"" pod="openstack/cinder-db-sync-nsw82" podUID="ecdb8e0b-8b04-4dc5-b532-8e68e8206122" Dec 02 23:16:58 crc kubenswrapper[4903]: I1202 23:16:58.751337 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5fd47c645b-9wf6m"] Dec 02 23:16:58 crc kubenswrapper[4903]: I1202 23:16:58.832424 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 02 23:16:58 crc kubenswrapper[4903]: W1202 23:16:58.860814 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6de4a117_0c91_47f4_a80d_278debb3ea60.slice/crio-9642229d3deab857dde02054ecb841a3af38cf34195214e12a248261a22ea20b WatchSource:0}: Error finding container 9642229d3deab857dde02054ecb841a3af38cf34195214e12a248261a22ea20b: Status 404 returned error can't find the container with id 9642229d3deab857dde02054ecb841a3af38cf34195214e12a248261a22ea20b Dec 02 23:16:58 crc kubenswrapper[4903]: I1202 23:16:58.954711 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Dec 02 23:16:58 crc kubenswrapper[4903]: W1202 23:16:58.978055 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b000454_0ec3_4f51_ba7a_767530eaf03c.slice/crio-529dfaf07acc0db4111aa843b03949450df986f5cc3fed341f9273c1d7ecee4f WatchSource:0}: Error finding container 529dfaf07acc0db4111aa843b03949450df986f5cc3fed341f9273c1d7ecee4f: Status 404 returned error can't find the container with id 529dfaf07acc0db4111aa843b03949450df986f5cc3fed341f9273c1d7ecee4f Dec 02 23:16:58 crc kubenswrapper[4903]: I1202 23:16:58.984698 
4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7857f5d94d-4lclz"] Dec 02 23:16:59 crc kubenswrapper[4903]: I1202 23:16:59.027357 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Dec 02 23:16:59 crc kubenswrapper[4903]: W1202 23:16:59.061061 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0c7396b_f60c_437a_ad2d_f6c19b7c4570.slice/crio-680c09ac16d87abb0e84624fa6345d54da2d24734ea59c4bd4504fbf7be2e2dc WatchSource:0}: Error finding container 680c09ac16d87abb0e84624fa6345d54da2d24734ea59c4bd4504fbf7be2e2dc: Status 404 returned error can't find the container with id 680c09ac16d87abb0e84624fa6345d54da2d24734ea59c4bd4504fbf7be2e2dc Dec 02 23:16:59 crc kubenswrapper[4903]: I1202 23:16:59.110389 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bhns8"] Dec 02 23:16:59 crc kubenswrapper[4903]: I1202 23:16:59.240173 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76ff5f5497-x5nsj"] Dec 02 23:16:59 crc kubenswrapper[4903]: W1202 23:16:59.264808 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8af477b6_59d4_4909_9cb6_b9e61f75bd96.slice/crio-d8837153c3ae62ed7623c378948ba876860e8d71e1f9a55adb4a2067ddf58cc1 WatchSource:0}: Error finding container d8837153c3ae62ed7623c378948ba876860e8d71e1f9a55adb4a2067ddf58cc1: Status 404 returned error can't find the container with id d8837153c3ae62ed7623c378948ba876860e8d71e1f9a55adb4a2067ddf58cc1 Dec 02 23:16:59 crc kubenswrapper[4903]: I1202 23:16:59.668067 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-77d59fbfc6-p8t8m"] Dec 02 23:16:59 crc kubenswrapper[4903]: I1202 23:16:59.752643 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-flhtr" event={"ID":"5fdf589e-17a2-4b53-b68c-f90e884b0080","Type":"ContainerStarted","Data":"1bce8edcc376cc44be897d0c5ea65e04b8d457cc31894966db288539082aa040"} Dec 02 23:16:59 crc kubenswrapper[4903]: I1202 23:16:59.756777 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"6b000454-0ec3-4f51-ba7a-767530eaf03c","Type":"ContainerStarted","Data":"529dfaf07acc0db4111aa843b03949450df986f5cc3fed341f9273c1d7ecee4f"} Dec 02 23:16:59 crc kubenswrapper[4903]: I1202 23:16:59.759939 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"6de4a117-0c91-47f4-a80d-278debb3ea60","Type":"ContainerStarted","Data":"9642229d3deab857dde02054ecb841a3af38cf34195214e12a248261a22ea20b"} Dec 02 23:16:59 crc kubenswrapper[4903]: I1202 23:16:59.763354 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a0c7396b-f60c-437a-ad2d-f6c19b7c4570","Type":"ContainerStarted","Data":"ff4cd682768dc12d2eded8b67cc51f76bfcf5cb8ca4446a9de2565286d75ea15"} Dec 02 23:16:59 crc kubenswrapper[4903]: I1202 23:16:59.763380 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a0c7396b-f60c-437a-ad2d-f6c19b7c4570","Type":"ContainerStarted","Data":"680c09ac16d87abb0e84624fa6345d54da2d24734ea59c4bd4504fbf7be2e2dc"} Dec 02 23:16:59 crc kubenswrapper[4903]: I1202 23:16:59.765502 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bhns8" 
event={"ID":"976c6f69-733c-4046-85e0-d10c9d902a22","Type":"ContainerStarted","Data":"5c3d7d10b0a60c8b39f1ec55f86abf1ac102b636038fcd4ff5a22eedb548f872"} Dec 02 23:16:59 crc kubenswrapper[4903]: I1202 23:16:59.765527 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bhns8" event={"ID":"976c6f69-733c-4046-85e0-d10c9d902a22","Type":"ContainerStarted","Data":"e6b09e5dbb4c8aab4eda986abb486dd13b545b4901e1f46e88f85f23e101f6f7"} Dec 02 23:16:59 crc kubenswrapper[4903]: I1202 23:16:59.773488 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-flhtr" podStartSLOduration=6.082255158 podStartE2EDuration="49.773440964s" podCreationTimestamp="2025-12-02 23:16:10 +0000 UTC" firstStartedPulling="2025-12-02 23:16:14.613198919 +0000 UTC m=+1113.321753202" lastFinishedPulling="2025-12-02 23:16:58.304384725 +0000 UTC m=+1157.012939008" observedRunningTime="2025-12-02 23:16:59.771048539 +0000 UTC m=+1158.479602822" watchObservedRunningTime="2025-12-02 23:16:59.773440964 +0000 UTC m=+1158.481995287" Dec 02 23:16:59 crc kubenswrapper[4903]: I1202 23:16:59.780324 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77d59fbfc6-p8t8m" event={"ID":"b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8","Type":"ContainerStarted","Data":"0b81c9432e70708289bdb623e40b54cea53b982e059cd522ef6eac0133e68111"} Dec 02 23:16:59 crc kubenswrapper[4903]: I1202 23:16:59.804132 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-bhns8" podStartSLOduration=17.804112982 podStartE2EDuration="17.804112982s" podCreationTimestamp="2025-12-02 23:16:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:16:59.78948303 +0000 UTC m=+1158.498037313" watchObservedRunningTime="2025-12-02 23:16:59.804112982 +0000 UTC m=+1158.512667265" Dec 02 23:16:59 crc kubenswrapper[4903]: I1202 23:16:59.805990 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76ff5f5497-x5nsj" event={"ID":"8af477b6-59d4-4909-9cb6-b9e61f75bd96","Type":"ContainerStarted","Data":"6ef85280faeb898a19ebaf725c3f47ef89cc2ff15b7a801d95fdd8542b49a49b"} Dec 02 23:16:59 crc kubenswrapper[4903]: I1202 23:16:59.806032 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76ff5f5497-x5nsj" event={"ID":"8af477b6-59d4-4909-9cb6-b9e61f75bd96","Type":"ContainerStarted","Data":"d8837153c3ae62ed7623c378948ba876860e8d71e1f9a55adb4a2067ddf58cc1"} Dec 02 23:16:59 crc kubenswrapper[4903]: I1202 23:16:59.818493 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-bbljr" event={"ID":"92c0a321-b591-49c3-a9b4-bc6b8bf30820","Type":"ContainerStarted","Data":"d179cc0a9b1b2a14d63276f5a6fb227b062e6d5252e8870e3f259cdeadee12f5"} Dec 02 23:16:59 crc kubenswrapper[4903]: I1202 23:16:59.834261 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7857f5d94d-4lclz" event={"ID":"c5d26e7e-b21c-4e31-984f-768ef66e0772","Type":"ContainerStarted","Data":"ddf5d9c6ea912415e490d7645108e394a709731aee0be5fdae8a0f9a672d5ab8"} Dec 02 23:16:59 crc kubenswrapper[4903]: I1202 23:16:59.834306 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7857f5d94d-4lclz" event={"ID":"c5d26e7e-b21c-4e31-984f-768ef66e0772","Type":"ContainerStarted","Data":"bbdcd47582b8f35f467d1c87bdad0d3cde0f5aa8fac904ba4c0cd6ed1d330881"} Dec 02 23:16:59 crc 
kubenswrapper[4903]: I1202 23:16:59.851615 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerStarted","Data":"74d6b77c13dd80086b2c813f620edffd3efb1418a840277ffcbe0978aaf6798a"} Dec 02 23:16:59 crc kubenswrapper[4903]: I1202 23:16:59.885212 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-bbljr" podStartSLOduration=3.07178765 podStartE2EDuration="43.885190898s" podCreationTimestamp="2025-12-02 23:16:16 +0000 UTC" firstStartedPulling="2025-12-02 23:16:17.450998203 +0000 UTC m=+1116.159552486" lastFinishedPulling="2025-12-02 23:16:58.264401451 +0000 UTC m=+1156.972955734" observedRunningTime="2025-12-02 23:16:59.846000741 +0000 UTC m=+1158.554555024" watchObservedRunningTime="2025-12-02 23:16:59.885190898 +0000 UTC m=+1158.593745181" Dec 02 23:16:59 crc kubenswrapper[4903]: I1202 23:16:59.885969 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c5c474cdf-n98r7" event={"ID":"18dbf052-03f5-4a2c-a8a7-86740787c1dc","Type":"ContainerStarted","Data":"07b1ce671b9a16281ea2162ffa9fd441a6948a1ceda4dc0ac09f36db36ad2e28"} Dec 02 23:16:59 crc kubenswrapper[4903]: I1202 23:16:59.886019 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c5c474cdf-n98r7" event={"ID":"18dbf052-03f5-4a2c-a8a7-86740787c1dc","Type":"ContainerStarted","Data":"ed8dfc1142c8a2f2878e51da66232e57612530c7a110c040c33b4747de93d8cc"} Dec 02 23:16:59 crc kubenswrapper[4903]: I1202 23:16:59.886164 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-c5c474cdf-n98r7" podUID="18dbf052-03f5-4a2c-a8a7-86740787c1dc" containerName="horizon-log" containerID="cri-o://ed8dfc1142c8a2f2878e51da66232e57612530c7a110c040c33b4747de93d8cc" gracePeriod=30 Dec 02 23:16:59 crc kubenswrapper[4903]: I1202 23:16:59.886579 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-c5c474cdf-n98r7" podUID="18dbf052-03f5-4a2c-a8a7-86740787c1dc" containerName="horizon" containerID="cri-o://07b1ce671b9a16281ea2162ffa9fd441a6948a1ceda4dc0ac09f36db36ad2e28" gracePeriod=30 Dec 02 23:16:59 crc kubenswrapper[4903]: I1202 23:16:59.890434 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7857f5d94d-4lclz" podStartSLOduration=35.89041648 podStartE2EDuration="35.89041648s" podCreationTimestamp="2025-12-02 23:16:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:16:59.883033338 +0000 UTC m=+1158.591587641" watchObservedRunningTime="2025-12-02 23:16:59.89041648 +0000 UTC m=+1158.598970763" Dec 02 23:16:59 crc kubenswrapper[4903]: I1202 23:16:59.922753 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fd47c645b-9wf6m" event={"ID":"5b5f4367-359b-4633-80f9-0af5ac406aa4","Type":"ContainerStarted","Data":"ec86184f533730ef740f956a3a55cdef5876deb56101395eba447b01c1c43ce9"} Dec 02 23:16:59 crc kubenswrapper[4903]: I1202 23:16:59.922807 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fd47c645b-9wf6m" event={"ID":"5b5f4367-359b-4633-80f9-0af5ac406aa4","Type":"ContainerStarted","Data":"a79d667890f2549c62842ff723ef2d42ac1b121ce0e74d30f4d1ca66e7226567"} Dec 02 23:16:59 crc kubenswrapper[4903]: I1202 23:16:59.958477 4903 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/horizon-c5c474cdf-n98r7" podStartSLOduration=5.88123732 podStartE2EDuration="44.958446231s" podCreationTimestamp="2025-12-02 23:16:15 +0000 UTC" firstStartedPulling="2025-12-02 23:16:17.209727302 +0000 UTC m=+1115.918281585" lastFinishedPulling="2025-12-02 23:16:56.286936213 +0000 UTC m=+1154.995490496" observedRunningTime="2025-12-02 23:16:59.923991985 +0000 UTC m=+1158.632546268" watchObservedRunningTime="2025-12-02 23:16:59.958446231 +0000 UTC m=+1158.667000514" Dec 02 23:17:00 crc kubenswrapper[4903]: I1202 23:17:00.052786 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5fd47c645b-9wf6m" podStartSLOduration=36.052767486 podStartE2EDuration="36.052767486s" podCreationTimestamp="2025-12-02 23:16:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:16:59.94560525 +0000 UTC m=+1158.654159523" watchObservedRunningTime="2025-12-02 23:17:00.052767486 +0000 UTC m=+1158.761321769" Dec 02 23:17:00 crc kubenswrapper[4903]: I1202 23:17:00.671510 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-56f9c8dcd5-hbd9l"] Dec 02 23:17:00 crc kubenswrapper[4903]: I1202 23:17:00.673690 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56f9c8dcd5-hbd9l" Dec 02 23:17:00 crc kubenswrapper[4903]: I1202 23:17:00.679535 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 02 23:17:00 crc kubenswrapper[4903]: I1202 23:17:00.679749 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 02 23:17:00 crc kubenswrapper[4903]: I1202 23:17:00.702559 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56f9c8dcd5-hbd9l"] Dec 02 23:17:00 crc kubenswrapper[4903]: I1202 23:17:00.788358 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7517345-0440-461c-a78d-a29ef04ecf9c-internal-tls-certs\") pod \"neutron-56f9c8dcd5-hbd9l\" (UID: \"c7517345-0440-461c-a78d-a29ef04ecf9c\") " pod="openstack/neutron-56f9c8dcd5-hbd9l" Dec 02 23:17:00 crc kubenswrapper[4903]: I1202 23:17:00.788440 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95x7g\" (UniqueName: \"kubernetes.io/projected/c7517345-0440-461c-a78d-a29ef04ecf9c-kube-api-access-95x7g\") pod \"neutron-56f9c8dcd5-hbd9l\" (UID: \"c7517345-0440-461c-a78d-a29ef04ecf9c\") " pod="openstack/neutron-56f9c8dcd5-hbd9l" Dec 02 23:17:00 crc kubenswrapper[4903]: I1202 23:17:00.788526 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7517345-0440-461c-a78d-a29ef04ecf9c-ovndb-tls-certs\") pod \"neutron-56f9c8dcd5-hbd9l\" (UID: \"c7517345-0440-461c-a78d-a29ef04ecf9c\") " pod="openstack/neutron-56f9c8dcd5-hbd9l" Dec 02 23:17:00 crc kubenswrapper[4903]: I1202 23:17:00.788610 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c7517345-0440-461c-a78d-a29ef04ecf9c-config\") pod \"neutron-56f9c8dcd5-hbd9l\" (UID: \"c7517345-0440-461c-a78d-a29ef04ecf9c\") " pod="openstack/neutron-56f9c8dcd5-hbd9l" Dec 02 23:17:00 crc 
kubenswrapper[4903]: I1202 23:17:00.788670 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7517345-0440-461c-a78d-a29ef04ecf9c-combined-ca-bundle\") pod \"neutron-56f9c8dcd5-hbd9l\" (UID: \"c7517345-0440-461c-a78d-a29ef04ecf9c\") " pod="openstack/neutron-56f9c8dcd5-hbd9l" Dec 02 23:17:00 crc kubenswrapper[4903]: I1202 23:17:00.788810 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7517345-0440-461c-a78d-a29ef04ecf9c-public-tls-certs\") pod \"neutron-56f9c8dcd5-hbd9l\" (UID: \"c7517345-0440-461c-a78d-a29ef04ecf9c\") " pod="openstack/neutron-56f9c8dcd5-hbd9l" Dec 02 23:17:00 crc kubenswrapper[4903]: I1202 23:17:00.789516 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c7517345-0440-461c-a78d-a29ef04ecf9c-httpd-config\") pod \"neutron-56f9c8dcd5-hbd9l\" (UID: \"c7517345-0440-461c-a78d-a29ef04ecf9c\") " pod="openstack/neutron-56f9c8dcd5-hbd9l" Dec 02 23:17:00 crc kubenswrapper[4903]: I1202 23:17:00.891133 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c7517345-0440-461c-a78d-a29ef04ecf9c-httpd-config\") pod \"neutron-56f9c8dcd5-hbd9l\" (UID: \"c7517345-0440-461c-a78d-a29ef04ecf9c\") " pod="openstack/neutron-56f9c8dcd5-hbd9l" Dec 02 23:17:00 crc kubenswrapper[4903]: I1202 23:17:00.891487 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7517345-0440-461c-a78d-a29ef04ecf9c-internal-tls-certs\") pod \"neutron-56f9c8dcd5-hbd9l\" (UID: \"c7517345-0440-461c-a78d-a29ef04ecf9c\") " pod="openstack/neutron-56f9c8dcd5-hbd9l" Dec 02 23:17:00 crc kubenswrapper[4903]: I1202 23:17:00.891539 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95x7g\" (UniqueName: \"kubernetes.io/projected/c7517345-0440-461c-a78d-a29ef04ecf9c-kube-api-access-95x7g\") pod \"neutron-56f9c8dcd5-hbd9l\" (UID: \"c7517345-0440-461c-a78d-a29ef04ecf9c\") " pod="openstack/neutron-56f9c8dcd5-hbd9l" Dec 02 23:17:00 crc kubenswrapper[4903]: I1202 23:17:00.891587 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7517345-0440-461c-a78d-a29ef04ecf9c-ovndb-tls-certs\") pod \"neutron-56f9c8dcd5-hbd9l\" (UID: \"c7517345-0440-461c-a78d-a29ef04ecf9c\") " pod="openstack/neutron-56f9c8dcd5-hbd9l" Dec 02 23:17:00 crc kubenswrapper[4903]: I1202 23:17:00.891639 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c7517345-0440-461c-a78d-a29ef04ecf9c-config\") pod \"neutron-56f9c8dcd5-hbd9l\" (UID: \"c7517345-0440-461c-a78d-a29ef04ecf9c\") " pod="openstack/neutron-56f9c8dcd5-hbd9l" Dec 02 23:17:00 crc kubenswrapper[4903]: I1202 23:17:00.891691 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7517345-0440-461c-a78d-a29ef04ecf9c-combined-ca-bundle\") pod \"neutron-56f9c8dcd5-hbd9l\" (UID: \"c7517345-0440-461c-a78d-a29ef04ecf9c\") " pod="openstack/neutron-56f9c8dcd5-hbd9l" Dec 02 23:17:00 crc kubenswrapper[4903]: I1202 23:17:00.891751 4903 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7517345-0440-461c-a78d-a29ef04ecf9c-public-tls-certs\") pod \"neutron-56f9c8dcd5-hbd9l\" (UID: \"c7517345-0440-461c-a78d-a29ef04ecf9c\") " pod="openstack/neutron-56f9c8dcd5-hbd9l" Dec 02 23:17:00 crc kubenswrapper[4903]: I1202 23:17:00.898220 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7517345-0440-461c-a78d-a29ef04ecf9c-combined-ca-bundle\") pod \"neutron-56f9c8dcd5-hbd9l\" (UID: \"c7517345-0440-461c-a78d-a29ef04ecf9c\") " pod="openstack/neutron-56f9c8dcd5-hbd9l" Dec 02 23:17:00 crc kubenswrapper[4903]: I1202 23:17:00.902773 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c7517345-0440-461c-a78d-a29ef04ecf9c-httpd-config\") pod \"neutron-56f9c8dcd5-hbd9l\" (UID: \"c7517345-0440-461c-a78d-a29ef04ecf9c\") " pod="openstack/neutron-56f9c8dcd5-hbd9l" Dec 02 23:17:00 crc kubenswrapper[4903]: I1202 23:17:00.903239 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7517345-0440-461c-a78d-a29ef04ecf9c-ovndb-tls-certs\") pod \"neutron-56f9c8dcd5-hbd9l\" (UID: \"c7517345-0440-461c-a78d-a29ef04ecf9c\") " pod="openstack/neutron-56f9c8dcd5-hbd9l" Dec 02 23:17:00 crc kubenswrapper[4903]: I1202 23:17:00.903708 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7517345-0440-461c-a78d-a29ef04ecf9c-internal-tls-certs\") pod \"neutron-56f9c8dcd5-hbd9l\" (UID: \"c7517345-0440-461c-a78d-a29ef04ecf9c\") " pod="openstack/neutron-56f9c8dcd5-hbd9l" Dec 02 23:17:00 crc kubenswrapper[4903]: I1202 23:17:00.904960 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7517345-0440-461c-a78d-a29ef04ecf9c-public-tls-certs\") pod \"neutron-56f9c8dcd5-hbd9l\" (UID: \"c7517345-0440-461c-a78d-a29ef04ecf9c\") " pod="openstack/neutron-56f9c8dcd5-hbd9l" Dec 02 23:17:00 crc kubenswrapper[4903]: I1202 23:17:00.909432 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c7517345-0440-461c-a78d-a29ef04ecf9c-config\") pod \"neutron-56f9c8dcd5-hbd9l\" (UID: \"c7517345-0440-461c-a78d-a29ef04ecf9c\") " pod="openstack/neutron-56f9c8dcd5-hbd9l" Dec 02 23:17:00 crc kubenswrapper[4903]: I1202 23:17:00.914218 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95x7g\" (UniqueName: \"kubernetes.io/projected/c7517345-0440-461c-a78d-a29ef04ecf9c-kube-api-access-95x7g\") pod \"neutron-56f9c8dcd5-hbd9l\" (UID: \"c7517345-0440-461c-a78d-a29ef04ecf9c\") " pod="openstack/neutron-56f9c8dcd5-hbd9l" Dec 02 23:17:00 crc kubenswrapper[4903]: I1202 23:17:00.988489 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a0c7396b-f60c-437a-ad2d-f6c19b7c4570","Type":"ContainerStarted","Data":"86ae26bea8a462ef5ecd692e761bc008aa6ef25d86a1531f8bb23e4072c22cf2"} Dec 02 23:17:00 crc kubenswrapper[4903]: I1202 23:17:00.990913 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Dec 02 23:17:01 crc kubenswrapper[4903]: I1202 23:17:01.001283 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77d59fbfc6-p8t8m" 
event={"ID":"b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8","Type":"ContainerStarted","Data":"a436f02befb85b1592a3221ab644954bf4f4e6026430927df464c7bfa7562de5"} Dec 02 23:17:01 crc kubenswrapper[4903]: I1202 23:17:01.007373 4903 generic.go:334] "Generic (PLEG): container finished" podID="8af477b6-59d4-4909-9cb6-b9e61f75bd96" containerID="6ef85280faeb898a19ebaf725c3f47ef89cc2ff15b7a801d95fdd8542b49a49b" exitCode=0 Dec 02 23:17:01 crc kubenswrapper[4903]: I1202 23:17:01.007439 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76ff5f5497-x5nsj" event={"ID":"8af477b6-59d4-4909-9cb6-b9e61f75bd96","Type":"ContainerDied","Data":"6ef85280faeb898a19ebaf725c3f47ef89cc2ff15b7a801d95fdd8542b49a49b"} Dec 02 23:17:01 crc kubenswrapper[4903]: I1202 23:17:01.007469 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76ff5f5497-x5nsj" event={"ID":"8af477b6-59d4-4909-9cb6-b9e61f75bd96","Type":"ContainerStarted","Data":"920bbcefa89e3ce771b253c8a556db24084880f816bbb44221a4d9a556596686"} Dec 02 23:17:01 crc kubenswrapper[4903]: I1202 23:17:01.007882 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76ff5f5497-x5nsj" Dec 02 23:17:01 crc kubenswrapper[4903]: I1202 23:17:01.028539 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fd47c645b-9wf6m" event={"ID":"5b5f4367-359b-4633-80f9-0af5ac406aa4","Type":"ContainerStarted","Data":"9bc1786dafda95c7a91b9d71ba47a1486b6b698cb2827edeca4b558247096a06"} Dec 02 23:17:01 crc kubenswrapper[4903]: I1202 23:17:01.033018 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7857f5d94d-4lclz" event={"ID":"c5d26e7e-b21c-4e31-984f-768ef66e0772","Type":"ContainerStarted","Data":"ae17b0b332034cdba884e5f577a312731139cee69c955f1e16db32245c91d577"} Dec 02 23:17:01 crc kubenswrapper[4903]: I1202 23:17:01.051050 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-56f9c8dcd5-hbd9l" Dec 02 23:17:01 crc kubenswrapper[4903]: I1202 23:17:01.097475 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=23.097445893 podStartE2EDuration="23.097445893s" podCreationTimestamp="2025-12-02 23:16:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:17:01.01779211 +0000 UTC m=+1159.726346393" watchObservedRunningTime="2025-12-02 23:17:01.097445893 +0000 UTC m=+1159.806000176" Dec 02 23:17:01 crc kubenswrapper[4903]: I1202 23:17:01.106893 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76ff5f5497-x5nsj" podStartSLOduration=4.106870163 podStartE2EDuration="4.106870163s" podCreationTimestamp="2025-12-02 23:16:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:17:01.060743535 +0000 UTC m=+1159.769297828" watchObservedRunningTime="2025-12-02 23:17:01.106870163 +0000 UTC m=+1159.815424446" Dec 02 23:17:03 crc kubenswrapper[4903]: I1202 23:17:03.057460 4903 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 23:17:03 crc kubenswrapper[4903]: I1202 23:17:03.914939 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Dec 02 23:17:04 crc kubenswrapper[4903]: I1202 23:17:04.065405 4903 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 23:17:04 crc kubenswrapper[4903]: I1202 23:17:04.751286 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7857f5d94d-4lclz" Dec 02 23:17:04 crc kubenswrapper[4903]: I1202 23:17:04.752698 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7857f5d94d-4lclz" Dec 02 23:17:04 crc kubenswrapper[4903]: I1202 23:17:04.870862 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Dec 02 23:17:04 crc kubenswrapper[4903]: I1202 23:17:04.985816 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5fd47c645b-9wf6m" Dec 02 23:17:04 crc kubenswrapper[4903]: I1202 23:17:04.986161 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5fd47c645b-9wf6m" Dec 02 23:17:06 crc kubenswrapper[4903]: I1202 23:17:06.397484 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-c5c474cdf-n98r7" Dec 02 23:17:07 crc kubenswrapper[4903]: I1202 23:17:07.102571 4903 generic.go:334] "Generic (PLEG): container finished" podID="92c0a321-b591-49c3-a9b4-bc6b8bf30820" containerID="d179cc0a9b1b2a14d63276f5a6fb227b062e6d5252e8870e3f259cdeadee12f5" exitCode=0 Dec 02 23:17:07 crc kubenswrapper[4903]: I1202 23:17:07.102629 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-bbljr" event={"ID":"92c0a321-b591-49c3-a9b4-bc6b8bf30820","Type":"ContainerDied","Data":"d179cc0a9b1b2a14d63276f5a6fb227b062e6d5252e8870e3f259cdeadee12f5"} Dec 02 23:17:08 crc kubenswrapper[4903]: I1202 23:17:08.116052 4903 generic.go:334] "Generic (PLEG): container finished" podID="976c6f69-733c-4046-85e0-d10c9d902a22" containerID="5c3d7d10b0a60c8b39f1ec55f86abf1ac102b636038fcd4ff5a22eedb548f872" exitCode=0 Dec 02 23:17:08 crc kubenswrapper[4903]: I1202 
23:17:08.116132 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bhns8" event={"ID":"976c6f69-733c-4046-85e0-d10c9d902a22","Type":"ContainerDied","Data":"5c3d7d10b0a60c8b39f1ec55f86abf1ac102b636038fcd4ff5a22eedb548f872"} Dec 02 23:17:08 crc kubenswrapper[4903]: I1202 23:17:08.120782 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76ff5f5497-x5nsj" Dec 02 23:17:08 crc kubenswrapper[4903]: I1202 23:17:08.213118 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6db6798dff-kcjng"] Dec 02 23:17:08 crc kubenswrapper[4903]: I1202 23:17:08.213407 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6db6798dff-kcjng" podUID="2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba" containerName="dnsmasq-dns" containerID="cri-o://8fd0af82da0677e3ab04f4fb685da1c72b385d4fa53ae4cbbc9ea5a6bd6331e9" gracePeriod=10 Dec 02 23:17:08 crc kubenswrapper[4903]: I1202 23:17:08.921837 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Dec 02 23:17:08 crc kubenswrapper[4903]: I1202 23:17:08.950028 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-bbljr" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.059017 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92c0a321-b591-49c3-a9b4-bc6b8bf30820-scripts\") pod \"92c0a321-b591-49c3-a9b4-bc6b8bf30820\" (UID: \"92c0a321-b591-49c3-a9b4-bc6b8bf30820\") " Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.059302 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92c0a321-b591-49c3-a9b4-bc6b8bf30820-logs\") pod \"92c0a321-b591-49c3-a9b4-bc6b8bf30820\" (UID: \"92c0a321-b591-49c3-a9b4-bc6b8bf30820\") " Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.059357 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92c0a321-b591-49c3-a9b4-bc6b8bf30820-combined-ca-bundle\") pod \"92c0a321-b591-49c3-a9b4-bc6b8bf30820\" (UID: \"92c0a321-b591-49c3-a9b4-bc6b8bf30820\") " Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.059433 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92c0a321-b591-49c3-a9b4-bc6b8bf30820-config-data\") pod \"92c0a321-b591-49c3-a9b4-bc6b8bf30820\" (UID: \"92c0a321-b591-49c3-a9b4-bc6b8bf30820\") " Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.059459 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjj55\" (UniqueName: \"kubernetes.io/projected/92c0a321-b591-49c3-a9b4-bc6b8bf30820-kube-api-access-cjj55\") pod \"92c0a321-b591-49c3-a9b4-bc6b8bf30820\" (UID: \"92c0a321-b591-49c3-a9b4-bc6b8bf30820\") " Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.071248 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92c0a321-b591-49c3-a9b4-bc6b8bf30820-logs" (OuterVolumeSpecName: "logs") pod "92c0a321-b591-49c3-a9b4-bc6b8bf30820" (UID: "92c0a321-b591-49c3-a9b4-bc6b8bf30820"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.099132 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92c0a321-b591-49c3-a9b4-bc6b8bf30820-kube-api-access-cjj55" (OuterVolumeSpecName: "kube-api-access-cjj55") pod "92c0a321-b591-49c3-a9b4-bc6b8bf30820" (UID: "92c0a321-b591-49c3-a9b4-bc6b8bf30820"). InnerVolumeSpecName "kube-api-access-cjj55". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.104559 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92c0a321-b591-49c3-a9b4-bc6b8bf30820-scripts" (OuterVolumeSpecName: "scripts") pod "92c0a321-b591-49c3-a9b4-bc6b8bf30820" (UID: "92c0a321-b591-49c3-a9b4-bc6b8bf30820"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.105985 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.154351 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92c0a321-b591-49c3-a9b4-bc6b8bf30820-config-data" (OuterVolumeSpecName: "config-data") pod "92c0a321-b591-49c3-a9b4-bc6b8bf30820" (UID: "92c0a321-b591-49c3-a9b4-bc6b8bf30820"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.162434 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92c0a321-b591-49c3-a9b4-bc6b8bf30820-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.162460 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjj55\" (UniqueName: \"kubernetes.io/projected/92c0a321-b591-49c3-a9b4-bc6b8bf30820-kube-api-access-cjj55\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.162470 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92c0a321-b591-49c3-a9b4-bc6b8bf30820-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.162480 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92c0a321-b591-49c3-a9b4-bc6b8bf30820-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.170712 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92c0a321-b591-49c3-a9b4-bc6b8bf30820-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92c0a321-b591-49c3-a9b4-bc6b8bf30820" (UID: "92c0a321-b591-49c3-a9b4-bc6b8bf30820"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.181983 4903 generic.go:334] "Generic (PLEG): container finished" podID="2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba" containerID="8fd0af82da0677e3ab04f4fb685da1c72b385d4fa53ae4cbbc9ea5a6bd6331e9" exitCode=0 Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.182075 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6db6798dff-kcjng" event={"ID":"2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba","Type":"ContainerDied","Data":"8fd0af82da0677e3ab04f4fb685da1c72b385d4fa53ae4cbbc9ea5a6bd6331e9"} Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.205505 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-bbljr" event={"ID":"92c0a321-b591-49c3-a9b4-bc6b8bf30820","Type":"ContainerDied","Data":"86cd0e4d6b3695596de0101f2e57bf7d57771941aa457939ca0834bb52223f8a"} Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.205961 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86cd0e4d6b3695596de0101f2e57bf7d57771941aa457939ca0834bb52223f8a" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.208961 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-bbljr" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.251973 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.257738 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-665fcbdbd4-lvt55"] Dec 02 23:17:09 crc kubenswrapper[4903]: E1202 23:17:09.258277 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92c0a321-b591-49c3-a9b4-bc6b8bf30820" containerName="placement-db-sync" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.258296 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="92c0a321-b591-49c3-a9b4-bc6b8bf30820" containerName="placement-db-sync" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.258516 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="92c0a321-b591-49c3-a9b4-bc6b8bf30820" containerName="placement-db-sync" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.263171 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-665fcbdbd4-lvt55" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.266291 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b492cef-e99c-4d41-a42b-7377908b5eed-scripts\") pod \"placement-665fcbdbd4-lvt55\" (UID: \"4b492cef-e99c-4d41-a42b-7377908b5eed\") " pod="openstack/placement-665fcbdbd4-lvt55" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.266341 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b492cef-e99c-4d41-a42b-7377908b5eed-config-data\") pod \"placement-665fcbdbd4-lvt55\" (UID: \"4b492cef-e99c-4d41-a42b-7377908b5eed\") " pod="openstack/placement-665fcbdbd4-lvt55" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.266395 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b492cef-e99c-4d41-a42b-7377908b5eed-internal-tls-certs\") pod \"placement-665fcbdbd4-lvt55\" (UID: \"4b492cef-e99c-4d41-a42b-7377908b5eed\") " pod="openstack/placement-665fcbdbd4-lvt55" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.266489 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkd6j\" (UniqueName: \"kubernetes.io/projected/4b492cef-e99c-4d41-a42b-7377908b5eed-kube-api-access-rkd6j\") pod \"placement-665fcbdbd4-lvt55\" (UID: \"4b492cef-e99c-4d41-a42b-7377908b5eed\") " pod="openstack/placement-665fcbdbd4-lvt55" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.266510 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b492cef-e99c-4d41-a42b-7377908b5eed-public-tls-certs\") pod \"placement-665fcbdbd4-lvt55\" (UID: \"4b492cef-e99c-4d41-a42b-7377908b5eed\") " pod="openstack/placement-665fcbdbd4-lvt55" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.266540 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b492cef-e99c-4d41-a42b-7377908b5eed-logs\") pod \"placement-665fcbdbd4-lvt55\" (UID: \"4b492cef-e99c-4d41-a42b-7377908b5eed\") " pod="openstack/placement-665fcbdbd4-lvt55" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.266579 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b492cef-e99c-4d41-a42b-7377908b5eed-combined-ca-bundle\") pod \"placement-665fcbdbd4-lvt55\" (UID: \"4b492cef-e99c-4d41-a42b-7377908b5eed\") " pod="openstack/placement-665fcbdbd4-lvt55" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.266619 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92c0a321-b591-49c3-a9b4-bc6b8bf30820-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.270918 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.271071 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.271158 
4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.271235 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-9m6gd" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.271320 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.297715 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-665fcbdbd4-lvt55"] Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.368200 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b492cef-e99c-4d41-a42b-7377908b5eed-config-data\") pod \"placement-665fcbdbd4-lvt55\" (UID: \"4b492cef-e99c-4d41-a42b-7377908b5eed\") " pod="openstack/placement-665fcbdbd4-lvt55" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.368277 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b492cef-e99c-4d41-a42b-7377908b5eed-internal-tls-certs\") pod \"placement-665fcbdbd4-lvt55\" (UID: \"4b492cef-e99c-4d41-a42b-7377908b5eed\") " pod="openstack/placement-665fcbdbd4-lvt55" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.368351 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkd6j\" (UniqueName: \"kubernetes.io/projected/4b492cef-e99c-4d41-a42b-7377908b5eed-kube-api-access-rkd6j\") pod \"placement-665fcbdbd4-lvt55\" (UID: \"4b492cef-e99c-4d41-a42b-7377908b5eed\") " pod="openstack/placement-665fcbdbd4-lvt55" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.368371 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b492cef-e99c-4d41-a42b-7377908b5eed-public-tls-certs\") pod \"placement-665fcbdbd4-lvt55\" (UID: \"4b492cef-e99c-4d41-a42b-7377908b5eed\") " pod="openstack/placement-665fcbdbd4-lvt55" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.368398 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b492cef-e99c-4d41-a42b-7377908b5eed-logs\") pod \"placement-665fcbdbd4-lvt55\" (UID: \"4b492cef-e99c-4d41-a42b-7377908b5eed\") " pod="openstack/placement-665fcbdbd4-lvt55" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.368424 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b492cef-e99c-4d41-a42b-7377908b5eed-combined-ca-bundle\") pod \"placement-665fcbdbd4-lvt55\" (UID: \"4b492cef-e99c-4d41-a42b-7377908b5eed\") " pod="openstack/placement-665fcbdbd4-lvt55" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.368446 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b492cef-e99c-4d41-a42b-7377908b5eed-scripts\") pod \"placement-665fcbdbd4-lvt55\" (UID: \"4b492cef-e99c-4d41-a42b-7377908b5eed\") " pod="openstack/placement-665fcbdbd4-lvt55" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.400277 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b492cef-e99c-4d41-a42b-7377908b5eed-public-tls-certs\") pod 
\"placement-665fcbdbd4-lvt55\" (UID: \"4b492cef-e99c-4d41-a42b-7377908b5eed\") " pod="openstack/placement-665fcbdbd4-lvt55" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.400277 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b492cef-e99c-4d41-a42b-7377908b5eed-combined-ca-bundle\") pod \"placement-665fcbdbd4-lvt55\" (UID: \"4b492cef-e99c-4d41-a42b-7377908b5eed\") " pod="openstack/placement-665fcbdbd4-lvt55" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.417621 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b492cef-e99c-4d41-a42b-7377908b5eed-scripts\") pod \"placement-665fcbdbd4-lvt55\" (UID: \"4b492cef-e99c-4d41-a42b-7377908b5eed\") " pod="openstack/placement-665fcbdbd4-lvt55" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.419901 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b492cef-e99c-4d41-a42b-7377908b5eed-logs\") pod \"placement-665fcbdbd4-lvt55\" (UID: \"4b492cef-e99c-4d41-a42b-7377908b5eed\") " pod="openstack/placement-665fcbdbd4-lvt55" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.420376 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b492cef-e99c-4d41-a42b-7377908b5eed-config-data\") pod \"placement-665fcbdbd4-lvt55\" (UID: \"4b492cef-e99c-4d41-a42b-7377908b5eed\") " pod="openstack/placement-665fcbdbd4-lvt55" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.427350 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkd6j\" (UniqueName: \"kubernetes.io/projected/4b492cef-e99c-4d41-a42b-7377908b5eed-kube-api-access-rkd6j\") pod \"placement-665fcbdbd4-lvt55\" (UID: \"4b492cef-e99c-4d41-a42b-7377908b5eed\") " pod="openstack/placement-665fcbdbd4-lvt55" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.431634 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b492cef-e99c-4d41-a42b-7377908b5eed-internal-tls-certs\") pod \"placement-665fcbdbd4-lvt55\" (UID: \"4b492cef-e99c-4d41-a42b-7377908b5eed\") " pod="openstack/placement-665fcbdbd4-lvt55" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.489205 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6db6798dff-kcjng" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.585835 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba-ovsdbserver-sb\") pod \"2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba\" (UID: \"2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba\") " Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.586155 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4l5hf\" (UniqueName: \"kubernetes.io/projected/2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba-kube-api-access-4l5hf\") pod \"2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba\" (UID: \"2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba\") " Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.600817 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba-kube-api-access-4l5hf" (OuterVolumeSpecName: "kube-api-access-4l5hf") pod "2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba" (UID: "2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba"). InnerVolumeSpecName "kube-api-access-4l5hf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.650427 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba" (UID: "2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.687221 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba-dns-swift-storage-0\") pod \"2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba\" (UID: \"2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba\") " Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.687274 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba-dns-svc\") pod \"2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba\" (UID: \"2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba\") " Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.687341 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba-config\") pod \"2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba\" (UID: \"2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba\") " Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.687473 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba-ovsdbserver-nb\") pod \"2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba\" (UID: \"2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba\") " Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.687818 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4l5hf\" (UniqueName: \"kubernetes.io/projected/2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba-kube-api-access-4l5hf\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.687833 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.700084 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-665fcbdbd4-lvt55" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.733227 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bhns8" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.799052 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56f9c8dcd5-hbd9l"] Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.807665 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba-config" (OuterVolumeSpecName: "config") pod "2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba" (UID: "2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.824619 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba" (UID: "2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.833964 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba" (UID: "2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.894528 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/976c6f69-733c-4046-85e0-d10c9d902a22-scripts\") pod \"976c6f69-733c-4046-85e0-d10c9d902a22\" (UID: \"976c6f69-733c-4046-85e0-d10c9d902a22\") " Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.894743 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrbn7\" (UniqueName: \"kubernetes.io/projected/976c6f69-733c-4046-85e0-d10c9d902a22-kube-api-access-xrbn7\") pod \"976c6f69-733c-4046-85e0-d10c9d902a22\" (UID: \"976c6f69-733c-4046-85e0-d10c9d902a22\") " Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.894794 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/976c6f69-733c-4046-85e0-d10c9d902a22-credential-keys\") pod \"976c6f69-733c-4046-85e0-d10c9d902a22\" (UID: \"976c6f69-733c-4046-85e0-d10c9d902a22\") " Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.894862 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/976c6f69-733c-4046-85e0-d10c9d902a22-fernet-keys\") pod \"976c6f69-733c-4046-85e0-d10c9d902a22\" (UID: \"976c6f69-733c-4046-85e0-d10c9d902a22\") " Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.894907 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/976c6f69-733c-4046-85e0-d10c9d902a22-config-data\") pod \"976c6f69-733c-4046-85e0-d10c9d902a22\" (UID: \"976c6f69-733c-4046-85e0-d10c9d902a22\") " Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.894970 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976c6f69-733c-4046-85e0-d10c9d902a22-combined-ca-bundle\") pod \"976c6f69-733c-4046-85e0-d10c9d902a22\" (UID: \"976c6f69-733c-4046-85e0-d10c9d902a22\") " Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.905499 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/976c6f69-733c-4046-85e0-d10c9d902a22-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "976c6f69-733c-4046-85e0-d10c9d902a22" (UID: "976c6f69-733c-4046-85e0-d10c9d902a22"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.906584 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/976c6f69-733c-4046-85e0-d10c9d902a22-scripts" (OuterVolumeSpecName: "scripts") pod "976c6f69-733c-4046-85e0-d10c9d902a22" (UID: "976c6f69-733c-4046-85e0-d10c9d902a22"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.912827 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/976c6f69-733c-4046-85e0-d10c9d902a22-kube-api-access-xrbn7" (OuterVolumeSpecName: "kube-api-access-xrbn7") pod "976c6f69-733c-4046-85e0-d10c9d902a22" (UID: "976c6f69-733c-4046-85e0-d10c9d902a22"). InnerVolumeSpecName "kube-api-access-xrbn7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.914470 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/976c6f69-733c-4046-85e0-d10c9d902a22-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.914571 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.921071 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrbn7\" (UniqueName: \"kubernetes.io/projected/976c6f69-733c-4046-85e0-d10c9d902a22-kube-api-access-xrbn7\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.921184 4903 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/976c6f69-733c-4046-85e0-d10c9d902a22-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.921272 4903 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.921359 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.917238 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/976c6f69-733c-4046-85e0-d10c9d902a22-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "976c6f69-733c-4046-85e0-d10c9d902a22" (UID: "976c6f69-733c-4046-85e0-d10c9d902a22"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.954282 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/976c6f69-733c-4046-85e0-d10c9d902a22-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "976c6f69-733c-4046-85e0-d10c9d902a22" (UID: "976c6f69-733c-4046-85e0-d10c9d902a22"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:09 crc kubenswrapper[4903]: I1202 23:17:09.971531 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/976c6f69-733c-4046-85e0-d10c9d902a22-config-data" (OuterVolumeSpecName: "config-data") pod "976c6f69-733c-4046-85e0-d10c9d902a22" (UID: "976c6f69-733c-4046-85e0-d10c9d902a22"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.025205 4903 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/976c6f69-733c-4046-85e0-d10c9d902a22-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.025231 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/976c6f69-733c-4046-85e0-d10c9d902a22-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.025241 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976c6f69-733c-4046-85e0-d10c9d902a22-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.047343 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba" (UID: "2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.128149 4903 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.240756 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-74c5f59c6f-5gx9d"] Dec 02 23:17:10 crc kubenswrapper[4903]: E1202 23:17:10.241188 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="976c6f69-733c-4046-85e0-d10c9d902a22" containerName="keystone-bootstrap" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.241201 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="976c6f69-733c-4046-85e0-d10c9d902a22" containerName="keystone-bootstrap" Dec 02 23:17:10 crc kubenswrapper[4903]: E1202 23:17:10.241223 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba" containerName="init" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.241229 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba" containerName="init" Dec 02 23:17:10 crc kubenswrapper[4903]: E1202 23:17:10.241243 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba" containerName="dnsmasq-dns" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.241249 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba" containerName="dnsmasq-dns" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.241443 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="976c6f69-733c-4046-85e0-d10c9d902a22" containerName="keystone-bootstrap" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.241454 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba" containerName="dnsmasq-dns" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.242114 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-74c5f59c6f-5gx9d" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.246362 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.246453 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.266153 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-74c5f59c6f-5gx9d"] Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.276924 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6db6798dff-kcjng" event={"ID":"2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba","Type":"ContainerDied","Data":"21e7d076b1c427c9541c42a90a9bb983a7fd5fe98c6f86a4ca14f257a105b87b"} Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.276973 4903 scope.go:117] "RemoveContainer" containerID="8fd0af82da0677e3ab04f4fb685da1c72b385d4fa53ae4cbbc9ea5a6bd6331e9" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.277102 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6db6798dff-kcjng" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.289012 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bhns8" event={"ID":"976c6f69-733c-4046-85e0-d10c9d902a22","Type":"ContainerDied","Data":"e6b09e5dbb4c8aab4eda986abb486dd13b545b4901e1f46e88f85f23e101f6f7"} Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.289049 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6b09e5dbb4c8aab4eda986abb486dd13b545b4901e1f46e88f85f23e101f6f7" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.289110 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-bhns8" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.307984 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56f9c8dcd5-hbd9l" event={"ID":"c7517345-0440-461c-a78d-a29ef04ecf9c","Type":"ContainerStarted","Data":"7734fca46d44aab236a7525cae5cf202737b44231230c19379f7909912df7f78"} Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.308236 4903 scope.go:117] "RemoveContainer" containerID="d966108640f063665f16e98af21a0ac94b1fa731a7be38a133582d35df604846" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.324999 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77d59fbfc6-p8t8m" event={"ID":"b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8","Type":"ContainerStarted","Data":"759eb993b2e2888d698736bc62374d3e38fede53838e4781051f8547ae55ceca"} Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.326147 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-77d59fbfc6-p8t8m" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.329624 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a6cd769-825e-4700-a66b-87291af7f897","Type":"ContainerStarted","Data":"6412d768782b300da792588f987004f79d707c9b887234d2dc54944772d5f6b8"} Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.332247 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d147d6c4-c17d-4e73-b8a3-efd87eb47f76-internal-tls-certs\") pod \"keystone-74c5f59c6f-5gx9d\" (UID: \"d147d6c4-c17d-4e73-b8a3-efd87eb47f76\") " pod="openstack/keystone-74c5f59c6f-5gx9d" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.332313 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d147d6c4-c17d-4e73-b8a3-efd87eb47f76-public-tls-certs\") pod \"keystone-74c5f59c6f-5gx9d\" (UID: \"d147d6c4-c17d-4e73-b8a3-efd87eb47f76\") " pod="openstack/keystone-74c5f59c6f-5gx9d" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.332355 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twp86\" (UniqueName: \"kubernetes.io/projected/d147d6c4-c17d-4e73-b8a3-efd87eb47f76-kube-api-access-twp86\") pod \"keystone-74c5f59c6f-5gx9d\" (UID: \"d147d6c4-c17d-4e73-b8a3-efd87eb47f76\") " pod="openstack/keystone-74c5f59c6f-5gx9d" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.332387 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d147d6c4-c17d-4e73-b8a3-efd87eb47f76-credential-keys\") pod \"keystone-74c5f59c6f-5gx9d\" (UID: \"d147d6c4-c17d-4e73-b8a3-efd87eb47f76\") " pod="openstack/keystone-74c5f59c6f-5gx9d" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.332443 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d147d6c4-c17d-4e73-b8a3-efd87eb47f76-combined-ca-bundle\") pod \"keystone-74c5f59c6f-5gx9d\" (UID: \"d147d6c4-c17d-4e73-b8a3-efd87eb47f76\") " pod="openstack/keystone-74c5f59c6f-5gx9d" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.332512 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/d147d6c4-c17d-4e73-b8a3-efd87eb47f76-scripts\") pod \"keystone-74c5f59c6f-5gx9d\" (UID: \"d147d6c4-c17d-4e73-b8a3-efd87eb47f76\") " pod="openstack/keystone-74c5f59c6f-5gx9d" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.332577 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d147d6c4-c17d-4e73-b8a3-efd87eb47f76-config-data\") pod \"keystone-74c5f59c6f-5gx9d\" (UID: \"d147d6c4-c17d-4e73-b8a3-efd87eb47f76\") " pod="openstack/keystone-74c5f59c6f-5gx9d" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.332620 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d147d6c4-c17d-4e73-b8a3-efd87eb47f76-fernet-keys\") pod \"keystone-74c5f59c6f-5gx9d\" (UID: \"d147d6c4-c17d-4e73-b8a3-efd87eb47f76\") " pod="openstack/keystone-74c5f59c6f-5gx9d" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.343207 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"6b000454-0ec3-4f51-ba7a-767530eaf03c","Type":"ContainerStarted","Data":"d35151765724e30090caf661612be838e1f8eee79915da98522e214d38079b6e"} Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.353231 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"6de4a117-0c91-47f4-a80d-278debb3ea60","Type":"ContainerStarted","Data":"a9185c9636564edafae2cc91e3e2f22615a33bfde3b5cf115f5f262e3f5aae83"} Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.362569 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tv6h2" event={"ID":"98900f75-26e7-46cb-a70e-537fa0486fe8","Type":"ContainerStarted","Data":"466c9dfeffe342ed9c339eb847516a6e79456aa45aa2732dc102bda92052c254"} Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.369302 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-77d59fbfc6-p8t8m" podStartSLOduration=13.369274573 podStartE2EDuration="13.369274573s" podCreationTimestamp="2025-12-02 23:16:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:17:10.355015618 +0000 UTC m=+1169.063569921" watchObservedRunningTime="2025-12-02 23:17:10.369274573 +0000 UTC m=+1169.077828856" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.412164 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=22.506772378 podStartE2EDuration="32.412134711s" podCreationTimestamp="2025-12-02 23:16:38 +0000 UTC" firstStartedPulling="2025-12-02 23:16:58.869499719 +0000 UTC m=+1157.578054002" lastFinishedPulling="2025-12-02 23:17:08.774862052 +0000 UTC m=+1167.483416335" observedRunningTime="2025-12-02 23:17:10.392062668 +0000 UTC m=+1169.100616951" watchObservedRunningTime="2025-12-02 23:17:10.412134711 +0000 UTC m=+1169.120688984" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.422430 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=22.63780492 podStartE2EDuration="32.422413252s" podCreationTimestamp="2025-12-02 23:16:38 +0000 UTC" firstStartedPulling="2025-12-02 23:16:58.997616184 +0000 UTC m=+1157.706170467" lastFinishedPulling="2025-12-02 23:17:08.782224516 +0000 UTC 
m=+1167.490778799" observedRunningTime="2025-12-02 23:17:10.419302339 +0000 UTC m=+1169.127856622" watchObservedRunningTime="2025-12-02 23:17:10.422413252 +0000 UTC m=+1169.130967535" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.453334 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-tv6h2" podStartSLOduration=3.6922657709999998 podStartE2EDuration="55.453315338s" podCreationTimestamp="2025-12-02 23:16:15 +0000 UTC" firstStartedPulling="2025-12-02 23:16:17.214778969 +0000 UTC m=+1115.923333242" lastFinishedPulling="2025-12-02 23:17:08.975828526 +0000 UTC m=+1167.684382809" observedRunningTime="2025-12-02 23:17:10.430905291 +0000 UTC m=+1169.139459574" watchObservedRunningTime="2025-12-02 23:17:10.453315338 +0000 UTC m=+1169.161869621" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.456157 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d147d6c4-c17d-4e73-b8a3-efd87eb47f76-scripts\") pod \"keystone-74c5f59c6f-5gx9d\" (UID: \"d147d6c4-c17d-4e73-b8a3-efd87eb47f76\") " pod="openstack/keystone-74c5f59c6f-5gx9d" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.456262 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d147d6c4-c17d-4e73-b8a3-efd87eb47f76-config-data\") pod \"keystone-74c5f59c6f-5gx9d\" (UID: \"d147d6c4-c17d-4e73-b8a3-efd87eb47f76\") " pod="openstack/keystone-74c5f59c6f-5gx9d" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.456330 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d147d6c4-c17d-4e73-b8a3-efd87eb47f76-fernet-keys\") pod \"keystone-74c5f59c6f-5gx9d\" (UID: \"d147d6c4-c17d-4e73-b8a3-efd87eb47f76\") " pod="openstack/keystone-74c5f59c6f-5gx9d" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.456464 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d147d6c4-c17d-4e73-b8a3-efd87eb47f76-internal-tls-certs\") pod \"keystone-74c5f59c6f-5gx9d\" (UID: \"d147d6c4-c17d-4e73-b8a3-efd87eb47f76\") " pod="openstack/keystone-74c5f59c6f-5gx9d" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.456547 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d147d6c4-c17d-4e73-b8a3-efd87eb47f76-public-tls-certs\") pod \"keystone-74c5f59c6f-5gx9d\" (UID: \"d147d6c4-c17d-4e73-b8a3-efd87eb47f76\") " pod="openstack/keystone-74c5f59c6f-5gx9d" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.456620 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twp86\" (UniqueName: \"kubernetes.io/projected/d147d6c4-c17d-4e73-b8a3-efd87eb47f76-kube-api-access-twp86\") pod \"keystone-74c5f59c6f-5gx9d\" (UID: \"d147d6c4-c17d-4e73-b8a3-efd87eb47f76\") " pod="openstack/keystone-74c5f59c6f-5gx9d" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.457045 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d147d6c4-c17d-4e73-b8a3-efd87eb47f76-credential-keys\") pod \"keystone-74c5f59c6f-5gx9d\" (UID: \"d147d6c4-c17d-4e73-b8a3-efd87eb47f76\") " pod="openstack/keystone-74c5f59c6f-5gx9d" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.457155 4903 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d147d6c4-c17d-4e73-b8a3-efd87eb47f76-combined-ca-bundle\") pod \"keystone-74c5f59c6f-5gx9d\" (UID: \"d147d6c4-c17d-4e73-b8a3-efd87eb47f76\") " pod="openstack/keystone-74c5f59c6f-5gx9d" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.469045 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d147d6c4-c17d-4e73-b8a3-efd87eb47f76-credential-keys\") pod \"keystone-74c5f59c6f-5gx9d\" (UID: \"d147d6c4-c17d-4e73-b8a3-efd87eb47f76\") " pod="openstack/keystone-74c5f59c6f-5gx9d" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.471381 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d147d6c4-c17d-4e73-b8a3-efd87eb47f76-combined-ca-bundle\") pod \"keystone-74c5f59c6f-5gx9d\" (UID: \"d147d6c4-c17d-4e73-b8a3-efd87eb47f76\") " pod="openstack/keystone-74c5f59c6f-5gx9d" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.474813 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d147d6c4-c17d-4e73-b8a3-efd87eb47f76-public-tls-certs\") pod \"keystone-74c5f59c6f-5gx9d\" (UID: \"d147d6c4-c17d-4e73-b8a3-efd87eb47f76\") " pod="openstack/keystone-74c5f59c6f-5gx9d" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.487874 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d147d6c4-c17d-4e73-b8a3-efd87eb47f76-config-data\") pod \"keystone-74c5f59c6f-5gx9d\" (UID: \"d147d6c4-c17d-4e73-b8a3-efd87eb47f76\") " pod="openstack/keystone-74c5f59c6f-5gx9d" Dec 02 23:17:10 crc kubenswrapper[4903]: W1202 23:17:10.523206 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b492cef_e99c_4d41_a42b_7377908b5eed.slice/crio-050b422d7dc529b7cbc2378a9e236159c13a367d054a7ed553c85cb7e1aaf5ca WatchSource:0}: Error finding container 050b422d7dc529b7cbc2378a9e236159c13a367d054a7ed553c85cb7e1aaf5ca: Status 404 returned error can't find the container with id 050b422d7dc529b7cbc2378a9e236159c13a367d054a7ed553c85cb7e1aaf5ca Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.523697 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twp86\" (UniqueName: \"kubernetes.io/projected/d147d6c4-c17d-4e73-b8a3-efd87eb47f76-kube-api-access-twp86\") pod \"keystone-74c5f59c6f-5gx9d\" (UID: \"d147d6c4-c17d-4e73-b8a3-efd87eb47f76\") " pod="openstack/keystone-74c5f59c6f-5gx9d" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.523825 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6db6798dff-kcjng"] Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.523924 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d147d6c4-c17d-4e73-b8a3-efd87eb47f76-fernet-keys\") pod \"keystone-74c5f59c6f-5gx9d\" (UID: \"d147d6c4-c17d-4e73-b8a3-efd87eb47f76\") " pod="openstack/keystone-74c5f59c6f-5gx9d" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.524104 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d147d6c4-c17d-4e73-b8a3-efd87eb47f76-scripts\") pod \"keystone-74c5f59c6f-5gx9d\" (UID: 
\"d147d6c4-c17d-4e73-b8a3-efd87eb47f76\") " pod="openstack/keystone-74c5f59c6f-5gx9d" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.525471 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d147d6c4-c17d-4e73-b8a3-efd87eb47f76-internal-tls-certs\") pod \"keystone-74c5f59c6f-5gx9d\" (UID: \"d147d6c4-c17d-4e73-b8a3-efd87eb47f76\") " pod="openstack/keystone-74c5f59c6f-5gx9d" Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.557424 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6db6798dff-kcjng"] Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.568137 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-665fcbdbd4-lvt55"] Dec 02 23:17:10 crc kubenswrapper[4903]: I1202 23:17:10.594861 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-74c5f59c6f-5gx9d" Dec 02 23:17:10 crc kubenswrapper[4903]: E1202 23:17:10.597584 4903 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b6a3cf1_1dc7_4f53_a8a5_354f1c83b6ba.slice\": RecentStats: unable to find data in memory cache]" Dec 02 23:17:11 crc kubenswrapper[4903]: I1202 23:17:11.210112 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-74c5f59c6f-5gx9d"] Dec 02 23:17:11 crc kubenswrapper[4903]: I1202 23:17:11.388845 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56f9c8dcd5-hbd9l" event={"ID":"c7517345-0440-461c-a78d-a29ef04ecf9c","Type":"ContainerStarted","Data":"ebef9e0865601dbbede1d3ab47dd505755e643bbb25754b10064709599a8de94"} Dec 02 23:17:11 crc kubenswrapper[4903]: I1202 23:17:11.393938 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-665fcbdbd4-lvt55" event={"ID":"4b492cef-e99c-4d41-a42b-7377908b5eed","Type":"ContainerStarted","Data":"8bcb397947395ac82d178c3fc0996377394dbde3a190482d809626e9420945c1"} Dec 02 23:17:11 crc kubenswrapper[4903]: I1202 23:17:11.393981 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-665fcbdbd4-lvt55" event={"ID":"4b492cef-e99c-4d41-a42b-7377908b5eed","Type":"ContainerStarted","Data":"050b422d7dc529b7cbc2378a9e236159c13a367d054a7ed553c85cb7e1aaf5ca"} Dec 02 23:17:11 crc kubenswrapper[4903]: I1202 23:17:11.395212 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-74c5f59c6f-5gx9d" event={"ID":"d147d6c4-c17d-4e73-b8a3-efd87eb47f76","Type":"ContainerStarted","Data":"a2aedae5d8f4b6c4bff6b0508c55305052fb46661376571e80d24ab74312dad4"} Dec 02 23:17:11 crc kubenswrapper[4903]: I1202 23:17:11.635347 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba" path="/var/lib/kubelet/pods/2b6a3cf1-1dc7-4f53-a8a5-354f1c83b6ba/volumes" Dec 02 23:17:12 crc kubenswrapper[4903]: I1202 23:17:12.427907 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56f9c8dcd5-hbd9l" event={"ID":"c7517345-0440-461c-a78d-a29ef04ecf9c","Type":"ContainerStarted","Data":"9f7c966a265206e266f55fbf3918cce2230132c6fd621ac845bbbaccf6df7d59"} Dec 02 23:17:12 crc kubenswrapper[4903]: I1202 23:17:12.428295 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-56f9c8dcd5-hbd9l" Dec 02 23:17:12 crc kubenswrapper[4903]: I1202 23:17:12.431011 4903 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/placement-665fcbdbd4-lvt55" event={"ID":"4b492cef-e99c-4d41-a42b-7377908b5eed","Type":"ContainerStarted","Data":"79e2d84c37a48366cf4b9b02da8125fe82b9964ab3f27ef25c9905c625241ea1"} Dec 02 23:17:12 crc kubenswrapper[4903]: I1202 23:17:12.431685 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-665fcbdbd4-lvt55" Dec 02 23:17:12 crc kubenswrapper[4903]: I1202 23:17:12.431714 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-665fcbdbd4-lvt55" Dec 02 23:17:12 crc kubenswrapper[4903]: I1202 23:17:12.440768 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-74c5f59c6f-5gx9d" event={"ID":"d147d6c4-c17d-4e73-b8a3-efd87eb47f76","Type":"ContainerStarted","Data":"5ff447f9e8212ed952de7aefeee1580e8cc966d56127788f5a2d84a500532cd7"} Dec 02 23:17:12 crc kubenswrapper[4903]: I1202 23:17:12.458959 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-56f9c8dcd5-hbd9l" podStartSLOduration=12.458939823 podStartE2EDuration="12.458939823s" podCreationTimestamp="2025-12-02 23:17:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:17:12.456211919 +0000 UTC m=+1171.164766192" watchObservedRunningTime="2025-12-02 23:17:12.458939823 +0000 UTC m=+1171.167494106" Dec 02 23:17:12 crc kubenswrapper[4903]: I1202 23:17:12.476713 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-74c5f59c6f-5gx9d" podStartSLOduration=2.47669503 podStartE2EDuration="2.47669503s" podCreationTimestamp="2025-12-02 23:17:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:17:12.474229661 +0000 UTC m=+1171.182783944" watchObservedRunningTime="2025-12-02 23:17:12.47669503 +0000 UTC m=+1171.185249313" Dec 02 23:17:12 crc kubenswrapper[4903]: I1202 23:17:12.502440 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-665fcbdbd4-lvt55" podStartSLOduration=3.502426565 podStartE2EDuration="3.502426565s" podCreationTimestamp="2025-12-02 23:17:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:17:12.496757862 +0000 UTC m=+1171.205312145" watchObservedRunningTime="2025-12-02 23:17:12.502426565 +0000 UTC m=+1171.210980838" Dec 02 23:17:13 crc kubenswrapper[4903]: I1202 23:17:13.284448 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Dec 02 23:17:13 crc kubenswrapper[4903]: I1202 23:17:13.284895 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="a0c7396b-f60c-437a-ad2d-f6c19b7c4570" containerName="watcher-api-log" containerID="cri-o://ff4cd682768dc12d2eded8b67cc51f76bfcf5cb8ca4446a9de2565286d75ea15" gracePeriod=30 Dec 02 23:17:13 crc kubenswrapper[4903]: I1202 23:17:13.285018 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="a0c7396b-f60c-437a-ad2d-f6c19b7c4570" containerName="watcher-api" containerID="cri-o://86ae26bea8a462ef5ecd692e761bc008aa6ef25d86a1531f8bb23e4072c22cf2" gracePeriod=30 Dec 02 23:17:13 crc kubenswrapper[4903]: I1202 23:17:13.457759 4903 generic.go:334] "Generic (PLEG): container finished" 
podID="a0c7396b-f60c-437a-ad2d-f6c19b7c4570" containerID="ff4cd682768dc12d2eded8b67cc51f76bfcf5cb8ca4446a9de2565286d75ea15" exitCode=143 Dec 02 23:17:13 crc kubenswrapper[4903]: I1202 23:17:13.457839 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a0c7396b-f60c-437a-ad2d-f6c19b7c4570","Type":"ContainerDied","Data":"ff4cd682768dc12d2eded8b67cc51f76bfcf5cb8ca4446a9de2565286d75ea15"} Dec 02 23:17:13 crc kubenswrapper[4903]: I1202 23:17:13.460026 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nsw82" event={"ID":"ecdb8e0b-8b04-4dc5-b532-8e68e8206122","Type":"ContainerStarted","Data":"6cdff083596655b0abf2505b6ad88953f08c3bb522280a97d7f3c46d2298e096"} Dec 02 23:17:13 crc kubenswrapper[4903]: I1202 23:17:13.460746 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-74c5f59c6f-5gx9d" Dec 02 23:17:13 crc kubenswrapper[4903]: I1202 23:17:13.474171 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-nsw82" podStartSLOduration=3.806284133 podStartE2EDuration="58.474153152s" podCreationTimestamp="2025-12-02 23:16:15 +0000 UTC" firstStartedPulling="2025-12-02 23:16:17.208882842 +0000 UTC m=+1115.917437125" lastFinishedPulling="2025-12-02 23:17:11.876751831 +0000 UTC m=+1170.585306144" observedRunningTime="2025-12-02 23:17:13.473180648 +0000 UTC m=+1172.181734931" watchObservedRunningTime="2025-12-02 23:17:13.474153152 +0000 UTC m=+1172.182707435" Dec 02 23:17:14 crc kubenswrapper[4903]: I1202 23:17:14.086290 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Dec 02 23:17:15 crc kubenswrapper[4903]: I1202 23:17:15.184096 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="a0c7396b-f60c-437a-ad2d-f6c19b7c4570" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.159:9322/\": read tcp 10.217.0.2:53278->10.217.0.159:9322: read: connection reset by peer" Dec 02 23:17:15 crc kubenswrapper[4903]: I1202 23:17:15.184176 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="a0c7396b-f60c-437a-ad2d-f6c19b7c4570" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.159:9322/\": read tcp 10.217.0.2:53286->10.217.0.159:9322: read: connection reset by peer" Dec 02 23:17:15 crc kubenswrapper[4903]: I1202 23:17:15.482172 4903 generic.go:334] "Generic (PLEG): container finished" podID="98900f75-26e7-46cb-a70e-537fa0486fe8" containerID="466c9dfeffe342ed9c339eb847516a6e79456aa45aa2732dc102bda92052c254" exitCode=0 Dec 02 23:17:15 crc kubenswrapper[4903]: I1202 23:17:15.482239 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tv6h2" event={"ID":"98900f75-26e7-46cb-a70e-537fa0486fe8","Type":"ContainerDied","Data":"466c9dfeffe342ed9c339eb847516a6e79456aa45aa2732dc102bda92052c254"} Dec 02 23:17:15 crc kubenswrapper[4903]: I1202 23:17:15.484474 4903 generic.go:334] "Generic (PLEG): container finished" podID="a0c7396b-f60c-437a-ad2d-f6c19b7c4570" containerID="86ae26bea8a462ef5ecd692e761bc008aa6ef25d86a1531f8bb23e4072c22cf2" exitCode=0 Dec 02 23:17:15 crc kubenswrapper[4903]: I1202 23:17:15.484530 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a0c7396b-f60c-437a-ad2d-f6c19b7c4570","Type":"ContainerDied","Data":"86ae26bea8a462ef5ecd692e761bc008aa6ef25d86a1531f8bb23e4072c22cf2"} 
Dec 02 23:17:16 crc kubenswrapper[4903]: I1202 23:17:16.496712 4903 generic.go:334] "Generic (PLEG): container finished" podID="6de4a117-0c91-47f4-a80d-278debb3ea60" containerID="a9185c9636564edafae2cc91e3e2f22615a33bfde3b5cf115f5f262e3f5aae83" exitCode=1 Dec 02 23:17:16 crc kubenswrapper[4903]: I1202 23:17:16.497083 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"6de4a117-0c91-47f4-a80d-278debb3ea60","Type":"ContainerDied","Data":"a9185c9636564edafae2cc91e3e2f22615a33bfde3b5cf115f5f262e3f5aae83"} Dec 02 23:17:16 crc kubenswrapper[4903]: I1202 23:17:16.497726 4903 scope.go:117] "RemoveContainer" containerID="a9185c9636564edafae2cc91e3e2f22615a33bfde3b5cf115f5f262e3f5aae83" Dec 02 23:17:17 crc kubenswrapper[4903]: I1202 23:17:17.263778 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5fd47c645b-9wf6m" Dec 02 23:17:17 crc kubenswrapper[4903]: I1202 23:17:17.284785 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7857f5d94d-4lclz" Dec 02 23:17:18 crc kubenswrapper[4903]: I1202 23:17:18.915313 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="a0c7396b-f60c-437a-ad2d-f6c19b7c4570" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.159:9322/\": dial tcp 10.217.0.159:9322: connect: connection refused" Dec 02 23:17:18 crc kubenswrapper[4903]: I1202 23:17:18.915442 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="a0c7396b-f60c-437a-ad2d-f6c19b7c4570" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.159:9322/\": dial tcp 10.217.0.159:9322: connect: connection refused" Dec 02 23:17:18 crc kubenswrapper[4903]: I1202 23:17:18.966868 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 02 23:17:18 crc kubenswrapper[4903]: I1202 23:17:18.966920 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 02 23:17:19 crc kubenswrapper[4903]: I1202 23:17:19.086207 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Dec 02 23:17:19 crc kubenswrapper[4903]: I1202 23:17:19.126287 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Dec 02 23:17:19 crc kubenswrapper[4903]: I1202 23:17:19.208363 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7857f5d94d-4lclz" Dec 02 23:17:19 crc kubenswrapper[4903]: I1202 23:17:19.306131 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5fd47c645b-9wf6m"] Dec 02 23:17:19 crc kubenswrapper[4903]: I1202 23:17:19.306388 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5fd47c645b-9wf6m" podUID="5b5f4367-359b-4633-80f9-0af5ac406aa4" containerName="horizon-log" containerID="cri-o://ec86184f533730ef740f956a3a55cdef5876deb56101395eba447b01c1c43ce9" gracePeriod=30 Dec 02 23:17:19 crc kubenswrapper[4903]: I1202 23:17:19.307079 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5fd47c645b-9wf6m" podUID="5b5f4367-359b-4633-80f9-0af5ac406aa4" containerName="horizon" containerID="cri-o://9bc1786dafda95c7a91b9d71ba47a1486b6b698cb2827edeca4b558247096a06" gracePeriod=30 Dec 02 
23:17:19 crc kubenswrapper[4903]: I1202 23:17:19.310548 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5fd47c645b-9wf6m" Dec 02 23:17:19 crc kubenswrapper[4903]: I1202 23:17:19.316286 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5fd47c645b-9wf6m" podUID="5b5f4367-359b-4633-80f9-0af5ac406aa4" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.157:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Dec 02 23:17:19 crc kubenswrapper[4903]: I1202 23:17:19.554401 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Dec 02 23:17:20 crc kubenswrapper[4903]: I1202 23:17:20.511341 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-tv6h2" Dec 02 23:17:20 crc kubenswrapper[4903]: I1202 23:17:20.565869 4903 generic.go:334] "Generic (PLEG): container finished" podID="5b5f4367-359b-4633-80f9-0af5ac406aa4" containerID="9bc1786dafda95c7a91b9d71ba47a1486b6b698cb2827edeca4b558247096a06" exitCode=0 Dec 02 23:17:20 crc kubenswrapper[4903]: I1202 23:17:20.565968 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fd47c645b-9wf6m" event={"ID":"5b5f4367-359b-4633-80f9-0af5ac406aa4","Type":"ContainerDied","Data":"9bc1786dafda95c7a91b9d71ba47a1486b6b698cb2827edeca4b558247096a06"} Dec 02 23:17:20 crc kubenswrapper[4903]: I1202 23:17:20.567492 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tv6h2" event={"ID":"98900f75-26e7-46cb-a70e-537fa0486fe8","Type":"ContainerDied","Data":"513fd57ec16f6d43b48da136d487108de5995d5d00fed3e9babd1888a553ad59"} Dec 02 23:17:20 crc kubenswrapper[4903]: I1202 23:17:20.567532 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="513fd57ec16f6d43b48da136d487108de5995d5d00fed3e9babd1888a553ad59" Dec 02 23:17:20 crc kubenswrapper[4903]: I1202 23:17:20.567555 4903 util.go:48] "No ready sandbox for pod can be found. 
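The prober.go failures above ("connection reset by peer", "connection refused", "EOF") are plain HTTP GET probes against the container's endpoint; any transport error or a 4xx/5xx status marks the probe attempt as failed. A rough sketch of that check, assuming an unencrypted HTTP target (this is an illustration of the pattern, not kubelet's prober implementation):

    package main

    import (
        "fmt"
        "net/http"
        "time"
    )

    // probeHTTP returns nil if the endpoint answers with a 2xx/3xx status
    // within the timeout; any transport error (refused, reset, EOF) or a
    // failure status counts as a failed readiness probe.
    func probeHTTP(url string, timeout time.Duration) error {
        client := &http.Client{Timeout: timeout}
        resp, err := client.Get(url)
        if err != nil {
            return err // e.g. "connection refused", "connection reset by peer"
        }
        defer resp.Body.Close()
        if resp.StatusCode >= 400 {
            return fmt.Errorf("probe failed: status %d", resp.StatusCode)
        }
        return nil
    }

    func main() {
        // Endpoint taken from the log lines above; it only resolves
        // inside the cluster's pod network.
        if err := probeHTTP("http://10.217.0.159:9322/", 5*time.Second); err != nil {
            fmt.Println("Probe failed:", err)
        }
    }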
Need to start a new one" pod="openstack/barbican-db-sync-tv6h2" Dec 02 23:17:20 crc kubenswrapper[4903]: I1202 23:17:20.625899 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98900f75-26e7-46cb-a70e-537fa0486fe8-combined-ca-bundle\") pod \"98900f75-26e7-46cb-a70e-537fa0486fe8\" (UID: \"98900f75-26e7-46cb-a70e-537fa0486fe8\") " Dec 02 23:17:20 crc kubenswrapper[4903]: I1202 23:17:20.626022 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/98900f75-26e7-46cb-a70e-537fa0486fe8-db-sync-config-data\") pod \"98900f75-26e7-46cb-a70e-537fa0486fe8\" (UID: \"98900f75-26e7-46cb-a70e-537fa0486fe8\") " Dec 02 23:17:20 crc kubenswrapper[4903]: I1202 23:17:20.626282 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzbg4\" (UniqueName: \"kubernetes.io/projected/98900f75-26e7-46cb-a70e-537fa0486fe8-kube-api-access-pzbg4\") pod \"98900f75-26e7-46cb-a70e-537fa0486fe8\" (UID: \"98900f75-26e7-46cb-a70e-537fa0486fe8\") " Dec 02 23:17:20 crc kubenswrapper[4903]: I1202 23:17:20.636399 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98900f75-26e7-46cb-a70e-537fa0486fe8-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "98900f75-26e7-46cb-a70e-537fa0486fe8" (UID: "98900f75-26e7-46cb-a70e-537fa0486fe8"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:20 crc kubenswrapper[4903]: I1202 23:17:20.662207 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98900f75-26e7-46cb-a70e-537fa0486fe8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "98900f75-26e7-46cb-a70e-537fa0486fe8" (UID: "98900f75-26e7-46cb-a70e-537fa0486fe8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:20 crc kubenswrapper[4903]: I1202 23:17:20.668933 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98900f75-26e7-46cb-a70e-537fa0486fe8-kube-api-access-pzbg4" (OuterVolumeSpecName: "kube-api-access-pzbg4") pod "98900f75-26e7-46cb-a70e-537fa0486fe8" (UID: "98900f75-26e7-46cb-a70e-537fa0486fe8"). InnerVolumeSpecName "kube-api-access-pzbg4". 
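The reconciler entries around this point show the three-step teardown every volume goes through when a pod such as barbican-db-sync-tv6h2 is deleted: UnmountVolume is started for each mounted volume, the plugin-specific TearDown runs, and only then is the volume reported detached from the node. A compressed sketch of that ordering (the types and function names here are illustrative stand-ins, not the kubelet's operationexecutor API):

    package main

    import "fmt"

    // volume is a stand-in for the reconciler's mounted-volume record.
    type volume struct{ name, plugin string }

    // teardown mimics the ordering visible in the log: unmount is started,
    // the plugin's TearDown runs, then the volume is marked detached.
    func teardown(vols []volume) {
        for _, v := range vols {
            fmt.Printf("UnmountVolume started for volume %q\n", v.name)
            // the plugin's TearDown would unmount and clean the mount point here
            fmt.Printf("UnmountVolume.TearDown succeeded for %s volume %q\n", v.plugin, v.name)
        }
        for _, v := range vols {
            fmt.Printf("Volume detached for volume %q\n", v.name)
        }
    }

    func main() {
        teardown([]volume{
            {"combined-ca-bundle", "kubernetes.io/secret"},
            {"db-sync-config-data", "kubernetes.io/secret"},
            {"kube-api-access-pzbg4", "kubernetes.io/projected"},
        })
    }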
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:17:20 crc kubenswrapper[4903]: I1202 23:17:20.728829 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzbg4\" (UniqueName: \"kubernetes.io/projected/98900f75-26e7-46cb-a70e-537fa0486fe8-kube-api-access-pzbg4\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:20 crc kubenswrapper[4903]: I1202 23:17:20.728881 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98900f75-26e7-46cb-a70e-537fa0486fe8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:20 crc kubenswrapper[4903]: I1202 23:17:20.728890 4903 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/98900f75-26e7-46cb-a70e-537fa0486fe8-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:20 crc kubenswrapper[4903]: E1202 23:17:20.888464 4903 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5fdf589e_17a2_4b53_b68c_f90e884b0080.slice/crio-1bce8edcc376cc44be897d0c5ea65e04b8d457cc31894966db288539082aa040.scope\": RecentStats: unable to find data in memory cache]" Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.142212 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.238797 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0c7396b-f60c-437a-ad2d-f6c19b7c4570-config-data\") pod \"a0c7396b-f60c-437a-ad2d-f6c19b7c4570\" (UID: \"a0c7396b-f60c-437a-ad2d-f6c19b7c4570\") " Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.238907 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjlst\" (UniqueName: \"kubernetes.io/projected/a0c7396b-f60c-437a-ad2d-f6c19b7c4570-kube-api-access-tjlst\") pod \"a0c7396b-f60c-437a-ad2d-f6c19b7c4570\" (UID: \"a0c7396b-f60c-437a-ad2d-f6c19b7c4570\") " Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.238964 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0c7396b-f60c-437a-ad2d-f6c19b7c4570-combined-ca-bundle\") pod \"a0c7396b-f60c-437a-ad2d-f6c19b7c4570\" (UID: \"a0c7396b-f60c-437a-ad2d-f6c19b7c4570\") " Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.239104 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0c7396b-f60c-437a-ad2d-f6c19b7c4570-logs\") pod \"a0c7396b-f60c-437a-ad2d-f6c19b7c4570\" (UID: \"a0c7396b-f60c-437a-ad2d-f6c19b7c4570\") " Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.239145 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a0c7396b-f60c-437a-ad2d-f6c19b7c4570-custom-prometheus-ca\") pod \"a0c7396b-f60c-437a-ad2d-f6c19b7c4570\" (UID: \"a0c7396b-f60c-437a-ad2d-f6c19b7c4570\") " Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.240024 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0c7396b-f60c-437a-ad2d-f6c19b7c4570-logs" (OuterVolumeSpecName: "logs") pod "a0c7396b-f60c-437a-ad2d-f6c19b7c4570" (UID: 
"a0c7396b-f60c-437a-ad2d-f6c19b7c4570"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.243779 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0c7396b-f60c-437a-ad2d-f6c19b7c4570-kube-api-access-tjlst" (OuterVolumeSpecName: "kube-api-access-tjlst") pod "a0c7396b-f60c-437a-ad2d-f6c19b7c4570" (UID: "a0c7396b-f60c-437a-ad2d-f6c19b7c4570"). InnerVolumeSpecName "kube-api-access-tjlst". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.282791 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0c7396b-f60c-437a-ad2d-f6c19b7c4570-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0c7396b-f60c-437a-ad2d-f6c19b7c4570" (UID: "a0c7396b-f60c-437a-ad2d-f6c19b7c4570"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.286756 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0c7396b-f60c-437a-ad2d-f6c19b7c4570-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "a0c7396b-f60c-437a-ad2d-f6c19b7c4570" (UID: "a0c7396b-f60c-437a-ad2d-f6c19b7c4570"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.325090 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0c7396b-f60c-437a-ad2d-f6c19b7c4570-config-data" (OuterVolumeSpecName: "config-data") pod "a0c7396b-f60c-437a-ad2d-f6c19b7c4570" (UID: "a0c7396b-f60c-437a-ad2d-f6c19b7c4570"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.341866 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0c7396b-f60c-437a-ad2d-f6c19b7c4570-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.341905 4903 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a0c7396b-f60c-437a-ad2d-f6c19b7c4570-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.341947 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0c7396b-f60c-437a-ad2d-f6c19b7c4570-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.341957 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjlst\" (UniqueName: \"kubernetes.io/projected/a0c7396b-f60c-437a-ad2d-f6c19b7c4570-kube-api-access-tjlst\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.345111 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0c7396b-f60c-437a-ad2d-f6c19b7c4570-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:21 crc kubenswrapper[4903]: E1202 23:17:21.443889 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"ceilometer-notification-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack/ceilometer-0" podUID="1a6cd769-825e-4700-a66b-87291af7f897" Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.629385 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"6de4a117-0c91-47f4-a80d-278debb3ea60","Type":"ContainerStarted","Data":"df15c3c36b70884fde2426f34900864d205c0624b857ec4d7f83ed6fd31393ec"} Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.688928 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a0c7396b-f60c-437a-ad2d-f6c19b7c4570","Type":"ContainerDied","Data":"680c09ac16d87abb0e84624fa6345d54da2d24734ea59c4bd4504fbf7be2e2dc"} Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.688978 4903 scope.go:117] "RemoveContainer" containerID="86ae26bea8a462ef5ecd692e761bc008aa6ef25d86a1531f8bb23e4072c22cf2" Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.689096 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.730923 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a6cd769-825e-4700-a66b-87291af7f897","Type":"ContainerStarted","Data":"d7b40ae1ffb6fee794791193bc19605db9253a28869cbb77f58e1491d7d1083e"} Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.731107 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1a6cd769-825e-4700-a66b-87291af7f897" containerName="sg-core" containerID="cri-o://6412d768782b300da792588f987004f79d707c9b887234d2dc54944772d5f6b8" gracePeriod=30 Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.731334 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.731390 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1a6cd769-825e-4700-a66b-87291af7f897" containerName="proxy-httpd" containerID="cri-o://d7b40ae1ffb6fee794791193bc19605db9253a28869cbb77f58e1491d7d1083e" gracePeriod=30 Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.747838 4903 scope.go:117] "RemoveContainer" containerID="ff4cd682768dc12d2eded8b67cc51f76bfcf5cb8ca4446a9de2565286d75ea15" Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.758750 4903 generic.go:334] "Generic (PLEG): container finished" podID="5fdf589e-17a2-4b53-b68c-f90e884b0080" containerID="1bce8edcc376cc44be897d0c5ea65e04b8d457cc31894966db288539082aa040" exitCode=0 Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.758816 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-flhtr" event={"ID":"5fdf589e-17a2-4b53-b68c-f90e884b0080","Type":"ContainerDied","Data":"1bce8edcc376cc44be897d0c5ea65e04b8d457cc31894966db288539082aa040"} Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.761736 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.792867 4903 generic.go:334] "Generic (PLEG): container finished" podID="ecdb8e0b-8b04-4dc5-b532-8e68e8206122" containerID="6cdff083596655b0abf2505b6ad88953f08c3bb522280a97d7f3c46d2298e096" exitCode=0 Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.792909 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nsw82" event={"ID":"ecdb8e0b-8b04-4dc5-b532-8e68e8206122","Type":"ContainerDied","Data":"6cdff083596655b0abf2505b6ad88953f08c3bb522280a97d7f3c46d2298e096"} Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.794859 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.811320 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Dec 02 23:17:21 crc kubenswrapper[4903]: E1202 23:17:21.811877 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c7396b-f60c-437a-ad2d-f6c19b7c4570" containerName="watcher-api" Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.811891 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c7396b-f60c-437a-ad2d-f6c19b7c4570" containerName="watcher-api" Dec 02 23:17:21 crc kubenswrapper[4903]: E1202 23:17:21.811924 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98900f75-26e7-46cb-a70e-537fa0486fe8" containerName="barbican-db-sync" Dec 02 23:17:21 crc 
kubenswrapper[4903]: I1202 23:17:21.811931 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="98900f75-26e7-46cb-a70e-537fa0486fe8" containerName="barbican-db-sync" Dec 02 23:17:21 crc kubenswrapper[4903]: E1202 23:17:21.811944 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c7396b-f60c-437a-ad2d-f6c19b7c4570" containerName="watcher-api-log" Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.811949 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c7396b-f60c-437a-ad2d-f6c19b7c4570" containerName="watcher-api-log" Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.812123 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c7396b-f60c-437a-ad2d-f6c19b7c4570" containerName="watcher-api-log" Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.812146 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c7396b-f60c-437a-ad2d-f6c19b7c4570" containerName="watcher-api" Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.812155 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="98900f75-26e7-46cb-a70e-537fa0486fe8" containerName="barbican-db-sync" Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.815836 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.819186 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.819347 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.821394 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc" Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.821526 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.943770 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-65cf4c8457-6ff7v"] Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.965324 4903 util.go:30] "No sandbox for pod can be found. 
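The cpu_manager, state_mem, and memory_manager lines just above fire because watcher-api-0 was deleted and re-added under a new UID: before admitting the new pod, the resource managers drop any checkpointed assignments still keyed to containers of the old UID. A simplified sketch of that cleanup, keyed by (podUID, containerName) as the log entries are (the state layout here is an assumption, not the kubelet's state_mem checkpoint format):

    package main

    import "fmt"

    // key identifies a container's resource assignment, one entry per
    // (podUID, containerName) pair, as in the cpu_manager's state.
    type key struct{ podUID, container string }

    // removeStaleState drops assignments whose pod is no longer active,
    // mirroring the "RemoveStaleState: removing container" /
    // "Deleted CPUSet assignment" pairs in the log above.
    func removeStaleState(state map[key]string, active map[string]bool) {
        for k := range state {
            if !active[k.podUID] {
                fmt.Printf("removing stale assignment for %s/%s\n", k.podUID, k.container)
                delete(state, k)
            }
        }
    }

    func main() {
        state := map[key]string{
            {"a0c7396b", "watcher-api"}:      "cpus 0-1",
            {"a0c7396b", "watcher-api-log"}:  "cpus 0-1",
            {"98900f75", "barbican-db-sync"}: "cpus 2",
        }
        // The old watcher-api-0 and barbican-db-sync pods are gone; only
        // a hypothetical surviving pod UID remains in the active set.
        removeStaleState(state, map[string]bool{"57a9a701": true})
        fmt.Println("remaining assignments:", len(state))
    }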
Need to start a new one" pod="openstack/barbican-worker-65cf4c8457-6ff7v" Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.966342 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a9a701-de78-4dc2-b8a7-365cd41a5693-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"57a9a701-de78-4dc2-b8a7-365cd41a5693\") " pod="openstack/watcher-api-0" Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.966595 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57a9a701-de78-4dc2-b8a7-365cd41a5693-logs\") pod \"watcher-api-0\" (UID: \"57a9a701-de78-4dc2-b8a7-365cd41a5693\") " pod="openstack/watcher-api-0" Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.966887 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v29pq\" (UniqueName: \"kubernetes.io/projected/57a9a701-de78-4dc2-b8a7-365cd41a5693-kube-api-access-v29pq\") pod \"watcher-api-0\" (UID: \"57a9a701-de78-4dc2-b8a7-365cd41a5693\") " pod="openstack/watcher-api-0" Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.967018 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57a9a701-de78-4dc2-b8a7-365cd41a5693-config-data\") pod \"watcher-api-0\" (UID: \"57a9a701-de78-4dc2-b8a7-365cd41a5693\") " pod="openstack/watcher-api-0" Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.967175 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57a9a701-de78-4dc2-b8a7-365cd41a5693-public-tls-certs\") pod \"watcher-api-0\" (UID: \"57a9a701-de78-4dc2-b8a7-365cd41a5693\") " pod="openstack/watcher-api-0" Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.969755 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/57a9a701-de78-4dc2-b8a7-365cd41a5693-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"57a9a701-de78-4dc2-b8a7-365cd41a5693\") " pod="openstack/watcher-api-0" Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.969989 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57a9a701-de78-4dc2-b8a7-365cd41a5693-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"57a9a701-de78-4dc2-b8a7-365cd41a5693\") " pod="openstack/watcher-api-0" Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.977818 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.978557 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.978774 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-f94dq" Dec 02 23:17:21 crc kubenswrapper[4903]: I1202 23:17:21.986947 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-65cf4c8457-6ff7v"] Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.066356 4903 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/barbican-keystone-listener-9559fbfd6-k4fwk"] Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.067973 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-9559fbfd6-k4fwk" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.071878 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57a9a701-de78-4dc2-b8a7-365cd41a5693-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"57a9a701-de78-4dc2-b8a7-365cd41a5693\") " pod="openstack/watcher-api-0" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.072046 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbn45\" (UniqueName: \"kubernetes.io/projected/6ea83627-fed8-458c-a39b-f73e682799d3-kube-api-access-zbn45\") pod \"barbican-worker-65cf4c8457-6ff7v\" (UID: \"6ea83627-fed8-458c-a39b-f73e682799d3\") " pod="openstack/barbican-worker-65cf4c8457-6ff7v" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.072333 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ea83627-fed8-458c-a39b-f73e682799d3-config-data-custom\") pod \"barbican-worker-65cf4c8457-6ff7v\" (UID: \"6ea83627-fed8-458c-a39b-f73e682799d3\") " pod="openstack/barbican-worker-65cf4c8457-6ff7v" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.072416 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a9a701-de78-4dc2-b8a7-365cd41a5693-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"57a9a701-de78-4dc2-b8a7-365cd41a5693\") " pod="openstack/watcher-api-0" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.072484 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57a9a701-de78-4dc2-b8a7-365cd41a5693-logs\") pod \"watcher-api-0\" (UID: \"57a9a701-de78-4dc2-b8a7-365cd41a5693\") " pod="openstack/watcher-api-0" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.072603 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v29pq\" (UniqueName: \"kubernetes.io/projected/57a9a701-de78-4dc2-b8a7-365cd41a5693-kube-api-access-v29pq\") pod \"watcher-api-0\" (UID: \"57a9a701-de78-4dc2-b8a7-365cd41a5693\") " pod="openstack/watcher-api-0" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.072689 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57a9a701-de78-4dc2-b8a7-365cd41a5693-config-data\") pod \"watcher-api-0\" (UID: \"57a9a701-de78-4dc2-b8a7-365cd41a5693\") " pod="openstack/watcher-api-0" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.072737 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.072808 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ea83627-fed8-458c-a39b-f73e682799d3-logs\") pod \"barbican-worker-65cf4c8457-6ff7v\" (UID: \"6ea83627-fed8-458c-a39b-f73e682799d3\") " pod="openstack/barbican-worker-65cf4c8457-6ff7v" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.072875 4903 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57a9a701-de78-4dc2-b8a7-365cd41a5693-public-tls-certs\") pod \"watcher-api-0\" (UID: \"57a9a701-de78-4dc2-b8a7-365cd41a5693\") " pod="openstack/watcher-api-0" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.072950 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ea83627-fed8-458c-a39b-f73e682799d3-config-data\") pod \"barbican-worker-65cf4c8457-6ff7v\" (UID: \"6ea83627-fed8-458c-a39b-f73e682799d3\") " pod="openstack/barbican-worker-65cf4c8457-6ff7v" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.073287 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/57a9a701-de78-4dc2-b8a7-365cd41a5693-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"57a9a701-de78-4dc2-b8a7-365cd41a5693\") " pod="openstack/watcher-api-0" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.073388 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ea83627-fed8-458c-a39b-f73e682799d3-combined-ca-bundle\") pod \"barbican-worker-65cf4c8457-6ff7v\" (UID: \"6ea83627-fed8-458c-a39b-f73e682799d3\") " pod="openstack/barbican-worker-65cf4c8457-6ff7v" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.073793 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57a9a701-de78-4dc2-b8a7-365cd41a5693-logs\") pod \"watcher-api-0\" (UID: \"57a9a701-de78-4dc2-b8a7-365cd41a5693\") " pod="openstack/watcher-api-0" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.082642 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57a9a701-de78-4dc2-b8a7-365cd41a5693-config-data\") pod \"watcher-api-0\" (UID: \"57a9a701-de78-4dc2-b8a7-365cd41a5693\") " pod="openstack/watcher-api-0" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.086010 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a9a701-de78-4dc2-b8a7-365cd41a5693-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"57a9a701-de78-4dc2-b8a7-365cd41a5693\") " pod="openstack/watcher-api-0" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.089380 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57a9a701-de78-4dc2-b8a7-365cd41a5693-public-tls-certs\") pod \"watcher-api-0\" (UID: \"57a9a701-de78-4dc2-b8a7-365cd41a5693\") " pod="openstack/watcher-api-0" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.091052 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57a9a701-de78-4dc2-b8a7-365cd41a5693-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"57a9a701-de78-4dc2-b8a7-365cd41a5693\") " pod="openstack/watcher-api-0" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.091153 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/57a9a701-de78-4dc2-b8a7-365cd41a5693-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"57a9a701-de78-4dc2-b8a7-365cd41a5693\") " 
pod="openstack/watcher-api-0" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.099106 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v29pq\" (UniqueName: \"kubernetes.io/projected/57a9a701-de78-4dc2-b8a7-365cd41a5693-kube-api-access-v29pq\") pod \"watcher-api-0\" (UID: \"57a9a701-de78-4dc2-b8a7-365cd41a5693\") " pod="openstack/watcher-api-0" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.126912 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-9559fbfd6-k4fwk"] Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.160603 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-599797ccb9-gvj5v"] Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.164621 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-599797ccb9-gvj5v" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.176231 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbn45\" (UniqueName: \"kubernetes.io/projected/6ea83627-fed8-458c-a39b-f73e682799d3-kube-api-access-zbn45\") pod \"barbican-worker-65cf4c8457-6ff7v\" (UID: \"6ea83627-fed8-458c-a39b-f73e682799d3\") " pod="openstack/barbican-worker-65cf4c8457-6ff7v" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.176273 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ea83627-fed8-458c-a39b-f73e682799d3-config-data-custom\") pod \"barbican-worker-65cf4c8457-6ff7v\" (UID: \"6ea83627-fed8-458c-a39b-f73e682799d3\") " pod="openstack/barbican-worker-65cf4c8457-6ff7v" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.176297 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c180d7c5-ad61-4190-b709-6efe6a9a2434-config-data\") pod \"barbican-keystone-listener-9559fbfd6-k4fwk\" (UID: \"c180d7c5-ad61-4190-b709-6efe6a9a2434\") " pod="openstack/barbican-keystone-listener-9559fbfd6-k4fwk" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.176330 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c180d7c5-ad61-4190-b709-6efe6a9a2434-combined-ca-bundle\") pod \"barbican-keystone-listener-9559fbfd6-k4fwk\" (UID: \"c180d7c5-ad61-4190-b709-6efe6a9a2434\") " pod="openstack/barbican-keystone-listener-9559fbfd6-k4fwk" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.176348 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zbfz\" (UniqueName: \"kubernetes.io/projected/c180d7c5-ad61-4190-b709-6efe6a9a2434-kube-api-access-6zbfz\") pod \"barbican-keystone-listener-9559fbfd6-k4fwk\" (UID: \"c180d7c5-ad61-4190-b709-6efe6a9a2434\") " pod="openstack/barbican-keystone-listener-9559fbfd6-k4fwk" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.176394 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ea83627-fed8-458c-a39b-f73e682799d3-logs\") pod \"barbican-worker-65cf4c8457-6ff7v\" (UID: \"6ea83627-fed8-458c-a39b-f73e682799d3\") " pod="openstack/barbican-worker-65cf4c8457-6ff7v" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.176432 4903 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ea83627-fed8-458c-a39b-f73e682799d3-config-data\") pod \"barbican-worker-65cf4c8457-6ff7v\" (UID: \"6ea83627-fed8-458c-a39b-f73e682799d3\") " pod="openstack/barbican-worker-65cf4c8457-6ff7v" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.176469 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c180d7c5-ad61-4190-b709-6efe6a9a2434-config-data-custom\") pod \"barbican-keystone-listener-9559fbfd6-k4fwk\" (UID: \"c180d7c5-ad61-4190-b709-6efe6a9a2434\") " pod="openstack/barbican-keystone-listener-9559fbfd6-k4fwk" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.176492 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ea83627-fed8-458c-a39b-f73e682799d3-combined-ca-bundle\") pod \"barbican-worker-65cf4c8457-6ff7v\" (UID: \"6ea83627-fed8-458c-a39b-f73e682799d3\") " pod="openstack/barbican-worker-65cf4c8457-6ff7v" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.176506 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c180d7c5-ad61-4190-b709-6efe6a9a2434-logs\") pod \"barbican-keystone-listener-9559fbfd6-k4fwk\" (UID: \"c180d7c5-ad61-4190-b709-6efe6a9a2434\") " pod="openstack/barbican-keystone-listener-9559fbfd6-k4fwk" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.178901 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ea83627-fed8-458c-a39b-f73e682799d3-logs\") pod \"barbican-worker-65cf4c8457-6ff7v\" (UID: \"6ea83627-fed8-458c-a39b-f73e682799d3\") " pod="openstack/barbican-worker-65cf4c8457-6ff7v" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.184816 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ea83627-fed8-458c-a39b-f73e682799d3-config-data-custom\") pod \"barbican-worker-65cf4c8457-6ff7v\" (UID: \"6ea83627-fed8-458c-a39b-f73e682799d3\") " pod="openstack/barbican-worker-65cf4c8457-6ff7v" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.195297 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ea83627-fed8-458c-a39b-f73e682799d3-combined-ca-bundle\") pod \"barbican-worker-65cf4c8457-6ff7v\" (UID: \"6ea83627-fed8-458c-a39b-f73e682799d3\") " pod="openstack/barbican-worker-65cf4c8457-6ff7v" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.196677 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ea83627-fed8-458c-a39b-f73e682799d3-config-data\") pod \"barbican-worker-65cf4c8457-6ff7v\" (UID: \"6ea83627-fed8-458c-a39b-f73e682799d3\") " pod="openstack/barbican-worker-65cf4c8457-6ff7v" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.200608 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-599797ccb9-gvj5v"] Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.209395 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbn45\" (UniqueName: \"kubernetes.io/projected/6ea83627-fed8-458c-a39b-f73e682799d3-kube-api-access-zbn45\") pod \"barbican-worker-65cf4c8457-6ff7v\" (UID: 
\"6ea83627-fed8-458c-a39b-f73e682799d3\") " pod="openstack/barbican-worker-65cf4c8457-6ff7v" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.212162 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.234919 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-86bbdbcfd-94wnf"] Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.236644 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-86bbdbcfd-94wnf" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.238520 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.279856 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3aae219f-2594-44f2-9f58-2b3149e8edbb-dns-swift-storage-0\") pod \"dnsmasq-dns-599797ccb9-gvj5v\" (UID: \"3aae219f-2594-44f2-9f58-2b3149e8edbb\") " pod="openstack/dnsmasq-dns-599797ccb9-gvj5v" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.280132 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3aae219f-2594-44f2-9f58-2b3149e8edbb-ovsdbserver-nb\") pod \"dnsmasq-dns-599797ccb9-gvj5v\" (UID: \"3aae219f-2594-44f2-9f58-2b3149e8edbb\") " pod="openstack/dnsmasq-dns-599797ccb9-gvj5v" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.280149 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3aae219f-2594-44f2-9f58-2b3149e8edbb-config\") pod \"dnsmasq-dns-599797ccb9-gvj5v\" (UID: \"3aae219f-2594-44f2-9f58-2b3149e8edbb\") " pod="openstack/dnsmasq-dns-599797ccb9-gvj5v" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.280179 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3aae219f-2594-44f2-9f58-2b3149e8edbb-dns-svc\") pod \"dnsmasq-dns-599797ccb9-gvj5v\" (UID: \"3aae219f-2594-44f2-9f58-2b3149e8edbb\") " pod="openstack/dnsmasq-dns-599797ccb9-gvj5v" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.280213 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c180d7c5-ad61-4190-b709-6efe6a9a2434-config-data-custom\") pod \"barbican-keystone-listener-9559fbfd6-k4fwk\" (UID: \"c180d7c5-ad61-4190-b709-6efe6a9a2434\") " pod="openstack/barbican-keystone-listener-9559fbfd6-k4fwk" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.280251 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c180d7c5-ad61-4190-b709-6efe6a9a2434-logs\") pod \"barbican-keystone-listener-9559fbfd6-k4fwk\" (UID: \"c180d7c5-ad61-4190-b709-6efe6a9a2434\") " pod="openstack/barbican-keystone-listener-9559fbfd6-k4fwk" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.280279 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cpkr\" (UniqueName: \"kubernetes.io/projected/3aae219f-2594-44f2-9f58-2b3149e8edbb-kube-api-access-6cpkr\") pod \"dnsmasq-dns-599797ccb9-gvj5v\" (UID: 
\"3aae219f-2594-44f2-9f58-2b3149e8edbb\") " pod="openstack/dnsmasq-dns-599797ccb9-gvj5v" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.280339 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c180d7c5-ad61-4190-b709-6efe6a9a2434-config-data\") pod \"barbican-keystone-listener-9559fbfd6-k4fwk\" (UID: \"c180d7c5-ad61-4190-b709-6efe6a9a2434\") " pod="openstack/barbican-keystone-listener-9559fbfd6-k4fwk" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.280370 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c180d7c5-ad61-4190-b709-6efe6a9a2434-combined-ca-bundle\") pod \"barbican-keystone-listener-9559fbfd6-k4fwk\" (UID: \"c180d7c5-ad61-4190-b709-6efe6a9a2434\") " pod="openstack/barbican-keystone-listener-9559fbfd6-k4fwk" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.280387 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zbfz\" (UniqueName: \"kubernetes.io/projected/c180d7c5-ad61-4190-b709-6efe6a9a2434-kube-api-access-6zbfz\") pod \"barbican-keystone-listener-9559fbfd6-k4fwk\" (UID: \"c180d7c5-ad61-4190-b709-6efe6a9a2434\") " pod="openstack/barbican-keystone-listener-9559fbfd6-k4fwk" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.280423 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3aae219f-2594-44f2-9f58-2b3149e8edbb-ovsdbserver-sb\") pod \"dnsmasq-dns-599797ccb9-gvj5v\" (UID: \"3aae219f-2594-44f2-9f58-2b3149e8edbb\") " pod="openstack/dnsmasq-dns-599797ccb9-gvj5v" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.281845 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-86bbdbcfd-94wnf"] Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.282521 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c180d7c5-ad61-4190-b709-6efe6a9a2434-logs\") pod \"barbican-keystone-listener-9559fbfd6-k4fwk\" (UID: \"c180d7c5-ad61-4190-b709-6efe6a9a2434\") " pod="openstack/barbican-keystone-listener-9559fbfd6-k4fwk" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.288882 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c180d7c5-ad61-4190-b709-6efe6a9a2434-config-data\") pod \"barbican-keystone-listener-9559fbfd6-k4fwk\" (UID: \"c180d7c5-ad61-4190-b709-6efe6a9a2434\") " pod="openstack/barbican-keystone-listener-9559fbfd6-k4fwk" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.291440 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c180d7c5-ad61-4190-b709-6efe6a9a2434-combined-ca-bundle\") pod \"barbican-keystone-listener-9559fbfd6-k4fwk\" (UID: \"c180d7c5-ad61-4190-b709-6efe6a9a2434\") " pod="openstack/barbican-keystone-listener-9559fbfd6-k4fwk" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.291908 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c180d7c5-ad61-4190-b709-6efe6a9a2434-config-data-custom\") pod \"barbican-keystone-listener-9559fbfd6-k4fwk\" (UID: \"c180d7c5-ad61-4190-b709-6efe6a9a2434\") " pod="openstack/barbican-keystone-listener-9559fbfd6-k4fwk" Dec 02 23:17:22 
crc kubenswrapper[4903]: I1202 23:17:22.298109 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-65cf4c8457-6ff7v" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.303574 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zbfz\" (UniqueName: \"kubernetes.io/projected/c180d7c5-ad61-4190-b709-6efe6a9a2434-kube-api-access-6zbfz\") pod \"barbican-keystone-listener-9559fbfd6-k4fwk\" (UID: \"c180d7c5-ad61-4190-b709-6efe6a9a2434\") " pod="openstack/barbican-keystone-listener-9559fbfd6-k4fwk" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.318903 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-9559fbfd6-k4fwk" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.382841 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9f65286-f806-4579-b98a-6a88a6dc8839-config-data\") pod \"barbican-api-86bbdbcfd-94wnf\" (UID: \"c9f65286-f806-4579-b98a-6a88a6dc8839\") " pod="openstack/barbican-api-86bbdbcfd-94wnf" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.382887 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9f65286-f806-4579-b98a-6a88a6dc8839-combined-ca-bundle\") pod \"barbican-api-86bbdbcfd-94wnf\" (UID: \"c9f65286-f806-4579-b98a-6a88a6dc8839\") " pod="openstack/barbican-api-86bbdbcfd-94wnf" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.382943 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3aae219f-2594-44f2-9f58-2b3149e8edbb-ovsdbserver-sb\") pod \"dnsmasq-dns-599797ccb9-gvj5v\" (UID: \"3aae219f-2594-44f2-9f58-2b3149e8edbb\") " pod="openstack/dnsmasq-dns-599797ccb9-gvj5v" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.382983 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3aae219f-2594-44f2-9f58-2b3149e8edbb-dns-swift-storage-0\") pod \"dnsmasq-dns-599797ccb9-gvj5v\" (UID: \"3aae219f-2594-44f2-9f58-2b3149e8edbb\") " pod="openstack/dnsmasq-dns-599797ccb9-gvj5v" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.383002 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3aae219f-2594-44f2-9f58-2b3149e8edbb-ovsdbserver-nb\") pod \"dnsmasq-dns-599797ccb9-gvj5v\" (UID: \"3aae219f-2594-44f2-9f58-2b3149e8edbb\") " pod="openstack/dnsmasq-dns-599797ccb9-gvj5v" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.383019 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9f65286-f806-4579-b98a-6a88a6dc8839-logs\") pod \"barbican-api-86bbdbcfd-94wnf\" (UID: \"c9f65286-f806-4579-b98a-6a88a6dc8839\") " pod="openstack/barbican-api-86bbdbcfd-94wnf" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.383037 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3aae219f-2594-44f2-9f58-2b3149e8edbb-config\") pod \"dnsmasq-dns-599797ccb9-gvj5v\" (UID: \"3aae219f-2594-44f2-9f58-2b3149e8edbb\") " pod="openstack/dnsmasq-dns-599797ccb9-gvj5v" Dec 02 
23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.383062 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3aae219f-2594-44f2-9f58-2b3149e8edbb-dns-svc\") pod \"dnsmasq-dns-599797ccb9-gvj5v\" (UID: \"3aae219f-2594-44f2-9f58-2b3149e8edbb\") " pod="openstack/dnsmasq-dns-599797ccb9-gvj5v" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.383101 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msqr7\" (UniqueName: \"kubernetes.io/projected/c9f65286-f806-4579-b98a-6a88a6dc8839-kube-api-access-msqr7\") pod \"barbican-api-86bbdbcfd-94wnf\" (UID: \"c9f65286-f806-4579-b98a-6a88a6dc8839\") " pod="openstack/barbican-api-86bbdbcfd-94wnf" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.383127 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c9f65286-f806-4579-b98a-6a88a6dc8839-config-data-custom\") pod \"barbican-api-86bbdbcfd-94wnf\" (UID: \"c9f65286-f806-4579-b98a-6a88a6dc8839\") " pod="openstack/barbican-api-86bbdbcfd-94wnf" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.383151 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cpkr\" (UniqueName: \"kubernetes.io/projected/3aae219f-2594-44f2-9f58-2b3149e8edbb-kube-api-access-6cpkr\") pod \"dnsmasq-dns-599797ccb9-gvj5v\" (UID: \"3aae219f-2594-44f2-9f58-2b3149e8edbb\") " pod="openstack/dnsmasq-dns-599797ccb9-gvj5v" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.384610 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3aae219f-2594-44f2-9f58-2b3149e8edbb-ovsdbserver-sb\") pod \"dnsmasq-dns-599797ccb9-gvj5v\" (UID: \"3aae219f-2594-44f2-9f58-2b3149e8edbb\") " pod="openstack/dnsmasq-dns-599797ccb9-gvj5v" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.385136 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3aae219f-2594-44f2-9f58-2b3149e8edbb-dns-swift-storage-0\") pod \"dnsmasq-dns-599797ccb9-gvj5v\" (UID: \"3aae219f-2594-44f2-9f58-2b3149e8edbb\") " pod="openstack/dnsmasq-dns-599797ccb9-gvj5v" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.385609 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3aae219f-2594-44f2-9f58-2b3149e8edbb-ovsdbserver-nb\") pod \"dnsmasq-dns-599797ccb9-gvj5v\" (UID: \"3aae219f-2594-44f2-9f58-2b3149e8edbb\") " pod="openstack/dnsmasq-dns-599797ccb9-gvj5v" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.386432 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3aae219f-2594-44f2-9f58-2b3149e8edbb-config\") pod \"dnsmasq-dns-599797ccb9-gvj5v\" (UID: \"3aae219f-2594-44f2-9f58-2b3149e8edbb\") " pod="openstack/dnsmasq-dns-599797ccb9-gvj5v" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.387176 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3aae219f-2594-44f2-9f58-2b3149e8edbb-dns-svc\") pod \"dnsmasq-dns-599797ccb9-gvj5v\" (UID: \"3aae219f-2594-44f2-9f58-2b3149e8edbb\") " pod="openstack/dnsmasq-dns-599797ccb9-gvj5v" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.423343 
4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cpkr\" (UniqueName: \"kubernetes.io/projected/3aae219f-2594-44f2-9f58-2b3149e8edbb-kube-api-access-6cpkr\") pod \"dnsmasq-dns-599797ccb9-gvj5v\" (UID: \"3aae219f-2594-44f2-9f58-2b3149e8edbb\") " pod="openstack/dnsmasq-dns-599797ccb9-gvj5v" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.499212 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9f65286-f806-4579-b98a-6a88a6dc8839-logs\") pod \"barbican-api-86bbdbcfd-94wnf\" (UID: \"c9f65286-f806-4579-b98a-6a88a6dc8839\") " pod="openstack/barbican-api-86bbdbcfd-94wnf" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.499374 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msqr7\" (UniqueName: \"kubernetes.io/projected/c9f65286-f806-4579-b98a-6a88a6dc8839-kube-api-access-msqr7\") pod \"barbican-api-86bbdbcfd-94wnf\" (UID: \"c9f65286-f806-4579-b98a-6a88a6dc8839\") " pod="openstack/barbican-api-86bbdbcfd-94wnf" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.499421 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c9f65286-f806-4579-b98a-6a88a6dc8839-config-data-custom\") pod \"barbican-api-86bbdbcfd-94wnf\" (UID: \"c9f65286-f806-4579-b98a-6a88a6dc8839\") " pod="openstack/barbican-api-86bbdbcfd-94wnf" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.499553 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9f65286-f806-4579-b98a-6a88a6dc8839-config-data\") pod \"barbican-api-86bbdbcfd-94wnf\" (UID: \"c9f65286-f806-4579-b98a-6a88a6dc8839\") " pod="openstack/barbican-api-86bbdbcfd-94wnf" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.499609 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9f65286-f806-4579-b98a-6a88a6dc8839-combined-ca-bundle\") pod \"barbican-api-86bbdbcfd-94wnf\" (UID: \"c9f65286-f806-4579-b98a-6a88a6dc8839\") " pod="openstack/barbican-api-86bbdbcfd-94wnf" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.500806 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9f65286-f806-4579-b98a-6a88a6dc8839-logs\") pod \"barbican-api-86bbdbcfd-94wnf\" (UID: \"c9f65286-f806-4579-b98a-6a88a6dc8839\") " pod="openstack/barbican-api-86bbdbcfd-94wnf" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.506106 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c9f65286-f806-4579-b98a-6a88a6dc8839-config-data-custom\") pod \"barbican-api-86bbdbcfd-94wnf\" (UID: \"c9f65286-f806-4579-b98a-6a88a6dc8839\") " pod="openstack/barbican-api-86bbdbcfd-94wnf" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.519161 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msqr7\" (UniqueName: \"kubernetes.io/projected/c9f65286-f806-4579-b98a-6a88a6dc8839-kube-api-access-msqr7\") pod \"barbican-api-86bbdbcfd-94wnf\" (UID: \"c9f65286-f806-4579-b98a-6a88a6dc8839\") " pod="openstack/barbican-api-86bbdbcfd-94wnf" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.520793 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9f65286-f806-4579-b98a-6a88a6dc8839-combined-ca-bundle\") pod \"barbican-api-86bbdbcfd-94wnf\" (UID: \"c9f65286-f806-4579-b98a-6a88a6dc8839\") " pod="openstack/barbican-api-86bbdbcfd-94wnf" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.535021 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9f65286-f806-4579-b98a-6a88a6dc8839-config-data\") pod \"barbican-api-86bbdbcfd-94wnf\" (UID: \"c9f65286-f806-4579-b98a-6a88a6dc8839\") " pod="openstack/barbican-api-86bbdbcfd-94wnf" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.633190 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-599797ccb9-gvj5v" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.644359 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-86bbdbcfd-94wnf" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.815773 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.833603 4903 generic.go:334] "Generic (PLEG): container finished" podID="1a6cd769-825e-4700-a66b-87291af7f897" containerID="d7b40ae1ffb6fee794791193bc19605db9253a28869cbb77f58e1491d7d1083e" exitCode=0 Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.833636 4903 generic.go:334] "Generic (PLEG): container finished" podID="1a6cd769-825e-4700-a66b-87291af7f897" containerID="6412d768782b300da792588f987004f79d707c9b887234d2dc54944772d5f6b8" exitCode=2 Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.833701 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a6cd769-825e-4700-a66b-87291af7f897","Type":"ContainerDied","Data":"d7b40ae1ffb6fee794791193bc19605db9253a28869cbb77f58e1491d7d1083e"} Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.833733 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a6cd769-825e-4700-a66b-87291af7f897","Type":"ContainerDied","Data":"6412d768782b300da792588f987004f79d707c9b887234d2dc54944772d5f6b8"} Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.833748 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a6cd769-825e-4700-a66b-87291af7f897","Type":"ContainerDied","Data":"af95a279857dfebc01610e9c9a62a286492fe617a703f7e026a40e70bee45b01"} Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.833766 4903 scope.go:117] "RemoveContainer" containerID="d7b40ae1ffb6fee794791193bc19605db9253a28869cbb77f58e1491d7d1083e" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.833921 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.859817 4903 scope.go:117] "RemoveContainer" containerID="6412d768782b300da792588f987004f79d707c9b887234d2dc54944772d5f6b8" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.891790 4903 scope.go:117] "RemoveContainer" containerID="d7b40ae1ffb6fee794791193bc19605db9253a28869cbb77f58e1491d7d1083e" Dec 02 23:17:22 crc kubenswrapper[4903]: E1202 23:17:22.896614 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7b40ae1ffb6fee794791193bc19605db9253a28869cbb77f58e1491d7d1083e\": container with ID starting with d7b40ae1ffb6fee794791193bc19605db9253a28869cbb77f58e1491d7d1083e not found: ID does not exist" containerID="d7b40ae1ffb6fee794791193bc19605db9253a28869cbb77f58e1491d7d1083e" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.896658 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7b40ae1ffb6fee794791193bc19605db9253a28869cbb77f58e1491d7d1083e"} err="failed to get container status \"d7b40ae1ffb6fee794791193bc19605db9253a28869cbb77f58e1491d7d1083e\": rpc error: code = NotFound desc = could not find container \"d7b40ae1ffb6fee794791193bc19605db9253a28869cbb77f58e1491d7d1083e\": container with ID starting with d7b40ae1ffb6fee794791193bc19605db9253a28869cbb77f58e1491d7d1083e not found: ID does not exist" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.896679 4903 scope.go:117] "RemoveContainer" containerID="6412d768782b300da792588f987004f79d707c9b887234d2dc54944772d5f6b8" Dec 02 23:17:22 crc kubenswrapper[4903]: E1202 23:17:22.896948 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6412d768782b300da792588f987004f79d707c9b887234d2dc54944772d5f6b8\": container with ID starting with 6412d768782b300da792588f987004f79d707c9b887234d2dc54944772d5f6b8 not found: ID does not exist" containerID="6412d768782b300da792588f987004f79d707c9b887234d2dc54944772d5f6b8" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.896967 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6412d768782b300da792588f987004f79d707c9b887234d2dc54944772d5f6b8"} err="failed to get container status \"6412d768782b300da792588f987004f79d707c9b887234d2dc54944772d5f6b8\": rpc error: code = NotFound desc = could not find container \"6412d768782b300da792588f987004f79d707c9b887234d2dc54944772d5f6b8\": container with ID starting with 6412d768782b300da792588f987004f79d707c9b887234d2dc54944772d5f6b8 not found: ID does not exist" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.896981 4903 scope.go:117] "RemoveContainer" containerID="d7b40ae1ffb6fee794791193bc19605db9253a28869cbb77f58e1491d7d1083e" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.897195 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7b40ae1ffb6fee794791193bc19605db9253a28869cbb77f58e1491d7d1083e"} err="failed to get container status \"d7b40ae1ffb6fee794791193bc19605db9253a28869cbb77f58e1491d7d1083e\": rpc error: code = NotFound desc = could not find container \"d7b40ae1ffb6fee794791193bc19605db9253a28869cbb77f58e1491d7d1083e\": container with ID starting with d7b40ae1ffb6fee794791193bc19605db9253a28869cbb77f58e1491d7d1083e not found: ID does not exist" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.897210 4903 
scope.go:117] "RemoveContainer" containerID="6412d768782b300da792588f987004f79d707c9b887234d2dc54944772d5f6b8" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.897414 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6412d768782b300da792588f987004f79d707c9b887234d2dc54944772d5f6b8"} err="failed to get container status \"6412d768782b300da792588f987004f79d707c9b887234d2dc54944772d5f6b8\": rpc error: code = NotFound desc = could not find container \"6412d768782b300da792588f987004f79d707c9b887234d2dc54944772d5f6b8\": container with ID starting with 6412d768782b300da792588f987004f79d707c9b887234d2dc54944772d5f6b8 not found: ID does not exist" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.906757 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a6cd769-825e-4700-a66b-87291af7f897-config-data\") pod \"1a6cd769-825e-4700-a66b-87291af7f897\" (UID: \"1a6cd769-825e-4700-a66b-87291af7f897\") " Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.906820 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a6cd769-825e-4700-a66b-87291af7f897-sg-core-conf-yaml\") pod \"1a6cd769-825e-4700-a66b-87291af7f897\" (UID: \"1a6cd769-825e-4700-a66b-87291af7f897\") " Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.906862 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a6cd769-825e-4700-a66b-87291af7f897-scripts\") pod \"1a6cd769-825e-4700-a66b-87291af7f897\" (UID: \"1a6cd769-825e-4700-a66b-87291af7f897\") " Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.906896 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a6cd769-825e-4700-a66b-87291af7f897-run-httpd\") pod \"1a6cd769-825e-4700-a66b-87291af7f897\" (UID: \"1a6cd769-825e-4700-a66b-87291af7f897\") " Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.906949 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a6cd769-825e-4700-a66b-87291af7f897-log-httpd\") pod \"1a6cd769-825e-4700-a66b-87291af7f897\" (UID: \"1a6cd769-825e-4700-a66b-87291af7f897\") " Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.907015 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a6cd769-825e-4700-a66b-87291af7f897-combined-ca-bundle\") pod \"1a6cd769-825e-4700-a66b-87291af7f897\" (UID: \"1a6cd769-825e-4700-a66b-87291af7f897\") " Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.907071 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9nmt\" (UniqueName: \"kubernetes.io/projected/1a6cd769-825e-4700-a66b-87291af7f897-kube-api-access-l9nmt\") pod \"1a6cd769-825e-4700-a66b-87291af7f897\" (UID: \"1a6cd769-825e-4700-a66b-87291af7f897\") " Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.908127 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a6cd769-825e-4700-a66b-87291af7f897-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1a6cd769-825e-4700-a66b-87291af7f897" (UID: "1a6cd769-825e-4700-a66b-87291af7f897"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.908445 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a6cd769-825e-4700-a66b-87291af7f897-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1a6cd769-825e-4700-a66b-87291af7f897" (UID: "1a6cd769-825e-4700-a66b-87291af7f897"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.911799 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a6cd769-825e-4700-a66b-87291af7f897-kube-api-access-l9nmt" (OuterVolumeSpecName: "kube-api-access-l9nmt") pod "1a6cd769-825e-4700-a66b-87291af7f897" (UID: "1a6cd769-825e-4700-a66b-87291af7f897"). InnerVolumeSpecName "kube-api-access-l9nmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.912576 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a6cd769-825e-4700-a66b-87291af7f897-scripts" (OuterVolumeSpecName: "scripts") pod "1a6cd769-825e-4700-a66b-87291af7f897" (UID: "1a6cd769-825e-4700-a66b-87291af7f897"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.945805 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a6cd769-825e-4700-a66b-87291af7f897-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1a6cd769-825e-4700-a66b-87291af7f897" (UID: "1a6cd769-825e-4700-a66b-87291af7f897"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.949565 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.959631 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a6cd769-825e-4700-a66b-87291af7f897-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a6cd769-825e-4700-a66b-87291af7f897" (UID: "1a6cd769-825e-4700-a66b-87291af7f897"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:22 crc kubenswrapper[4903]: I1202 23:17:22.995265 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a6cd769-825e-4700-a66b-87291af7f897-config-data" (OuterVolumeSpecName: "config-data") pod "1a6cd769-825e-4700-a66b-87291af7f897" (UID: "1a6cd769-825e-4700-a66b-87291af7f897"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.009052 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a6cd769-825e-4700-a66b-87291af7f897-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.009083 4903 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a6cd769-825e-4700-a66b-87291af7f897-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.009094 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a6cd769-825e-4700-a66b-87291af7f897-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.009102 4903 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a6cd769-825e-4700-a66b-87291af7f897-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.009110 4903 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a6cd769-825e-4700-a66b-87291af7f897-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.009120 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a6cd769-825e-4700-a66b-87291af7f897-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.009130 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9nmt\" (UniqueName: \"kubernetes.io/projected/1a6cd769-825e-4700-a66b-87291af7f897-kube-api-access-l9nmt\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.044053 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-65cf4c8457-6ff7v"] Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.054499 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-9559fbfd6-k4fwk"] Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.198824 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.208597 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.224874 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:17:23 crc kubenswrapper[4903]: E1202 23:17:23.225226 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a6cd769-825e-4700-a66b-87291af7f897" containerName="proxy-httpd" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.225237 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a6cd769-825e-4700-a66b-87291af7f897" containerName="proxy-httpd" Dec 02 23:17:23 crc kubenswrapper[4903]: E1202 23:17:23.225271 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a6cd769-825e-4700-a66b-87291af7f897" containerName="sg-core" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.225277 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a6cd769-825e-4700-a66b-87291af7f897" containerName="sg-core" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.225437 4903 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="1a6cd769-825e-4700-a66b-87291af7f897" containerName="proxy-httpd" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.225457 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a6cd769-825e-4700-a66b-87291af7f897" containerName="sg-core" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.229505 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.233796 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.233966 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.253215 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.326536 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp6fc\" (UniqueName: \"kubernetes.io/projected/8a0cf0be-7840-4684-ada4-16ea2d6351f3-kube-api-access-lp6fc\") pod \"ceilometer-0\" (UID: \"8a0cf0be-7840-4684-ada4-16ea2d6351f3\") " pod="openstack/ceilometer-0" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.326726 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a0cf0be-7840-4684-ada4-16ea2d6351f3-run-httpd\") pod \"ceilometer-0\" (UID: \"8a0cf0be-7840-4684-ada4-16ea2d6351f3\") " pod="openstack/ceilometer-0" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.326794 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a0cf0be-7840-4684-ada4-16ea2d6351f3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a0cf0be-7840-4684-ada4-16ea2d6351f3\") " pod="openstack/ceilometer-0" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.326871 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a0cf0be-7840-4684-ada4-16ea2d6351f3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a0cf0be-7840-4684-ada4-16ea2d6351f3\") " pod="openstack/ceilometer-0" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.326906 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a0cf0be-7840-4684-ada4-16ea2d6351f3-config-data\") pod \"ceilometer-0\" (UID: \"8a0cf0be-7840-4684-ada4-16ea2d6351f3\") " pod="openstack/ceilometer-0" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.326960 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a0cf0be-7840-4684-ada4-16ea2d6351f3-scripts\") pod \"ceilometer-0\" (UID: \"8a0cf0be-7840-4684-ada4-16ea2d6351f3\") " pod="openstack/ceilometer-0" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.327048 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a0cf0be-7840-4684-ada4-16ea2d6351f3-log-httpd\") pod \"ceilometer-0\" (UID: \"8a0cf0be-7840-4684-ada4-16ea2d6351f3\") " 
pod="openstack/ceilometer-0" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.432926 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a0cf0be-7840-4684-ada4-16ea2d6351f3-run-httpd\") pod \"ceilometer-0\" (UID: \"8a0cf0be-7840-4684-ada4-16ea2d6351f3\") " pod="openstack/ceilometer-0" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.432984 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a0cf0be-7840-4684-ada4-16ea2d6351f3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a0cf0be-7840-4684-ada4-16ea2d6351f3\") " pod="openstack/ceilometer-0" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.433020 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a0cf0be-7840-4684-ada4-16ea2d6351f3-config-data\") pod \"ceilometer-0\" (UID: \"8a0cf0be-7840-4684-ada4-16ea2d6351f3\") " pod="openstack/ceilometer-0" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.433054 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a0cf0be-7840-4684-ada4-16ea2d6351f3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a0cf0be-7840-4684-ada4-16ea2d6351f3\") " pod="openstack/ceilometer-0" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.433090 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a0cf0be-7840-4684-ada4-16ea2d6351f3-scripts\") pod \"ceilometer-0\" (UID: \"8a0cf0be-7840-4684-ada4-16ea2d6351f3\") " pod="openstack/ceilometer-0" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.435202 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a0cf0be-7840-4684-ada4-16ea2d6351f3-log-httpd\") pod \"ceilometer-0\" (UID: \"8a0cf0be-7840-4684-ada4-16ea2d6351f3\") " pod="openstack/ceilometer-0" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.435314 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp6fc\" (UniqueName: \"kubernetes.io/projected/8a0cf0be-7840-4684-ada4-16ea2d6351f3-kube-api-access-lp6fc\") pod \"ceilometer-0\" (UID: \"8a0cf0be-7840-4684-ada4-16ea2d6351f3\") " pod="openstack/ceilometer-0" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.436239 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a0cf0be-7840-4684-ada4-16ea2d6351f3-run-httpd\") pod \"ceilometer-0\" (UID: \"8a0cf0be-7840-4684-ada4-16ea2d6351f3\") " pod="openstack/ceilometer-0" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.436296 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a0cf0be-7840-4684-ada4-16ea2d6351f3-log-httpd\") pod \"ceilometer-0\" (UID: \"8a0cf0be-7840-4684-ada4-16ea2d6351f3\") " pod="openstack/ceilometer-0" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.447598 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a0cf0be-7840-4684-ada4-16ea2d6351f3-scripts\") pod \"ceilometer-0\" (UID: \"8a0cf0be-7840-4684-ada4-16ea2d6351f3\") " pod="openstack/ceilometer-0" Dec 02 23:17:23 crc kubenswrapper[4903]: 
I1202 23:17:23.450555 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a0cf0be-7840-4684-ada4-16ea2d6351f3-config-data\") pod \"ceilometer-0\" (UID: \"8a0cf0be-7840-4684-ada4-16ea2d6351f3\") " pod="openstack/ceilometer-0" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.451840 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a0cf0be-7840-4684-ada4-16ea2d6351f3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a0cf0be-7840-4684-ada4-16ea2d6351f3\") " pod="openstack/ceilometer-0" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.465123 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp6fc\" (UniqueName: \"kubernetes.io/projected/8a0cf0be-7840-4684-ada4-16ea2d6351f3-kube-api-access-lp6fc\") pod \"ceilometer-0\" (UID: \"8a0cf0be-7840-4684-ada4-16ea2d6351f3\") " pod="openstack/ceilometer-0" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.477484 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a0cf0be-7840-4684-ada4-16ea2d6351f3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a0cf0be-7840-4684-ada4-16ea2d6351f3\") " pod="openstack/ceilometer-0" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.481425 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-599797ccb9-gvj5v"] Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.522485 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-86bbdbcfd-94wnf"] Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.562553 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.627069 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a6cd769-825e-4700-a66b-87291af7f897" path="/var/lib/kubelet/pods/1a6cd769-825e-4700-a66b-87291af7f897/volumes" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.627734 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0c7396b-f60c-437a-ad2d-f6c19b7c4570" path="/var/lib/kubelet/pods/a0c7396b-f60c-437a-ad2d-f6c19b7c4570/volumes" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.720327 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-flhtr" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.757490 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-nsw82" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.842703 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fdf589e-17a2-4b53-b68c-f90e884b0080-combined-ca-bundle\") pod \"5fdf589e-17a2-4b53-b68c-f90e884b0080\" (UID: \"5fdf589e-17a2-4b53-b68c-f90e884b0080\") " Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.842752 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecdb8e0b-8b04-4dc5-b532-8e68e8206122-combined-ca-bundle\") pod \"ecdb8e0b-8b04-4dc5-b532-8e68e8206122\" (UID: \"ecdb8e0b-8b04-4dc5-b532-8e68e8206122\") " Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.842777 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecdb8e0b-8b04-4dc5-b532-8e68e8206122-scripts\") pod \"ecdb8e0b-8b04-4dc5-b532-8e68e8206122\" (UID: \"ecdb8e0b-8b04-4dc5-b532-8e68e8206122\") " Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.842867 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mclrt\" (UniqueName: \"kubernetes.io/projected/5fdf589e-17a2-4b53-b68c-f90e884b0080-kube-api-access-mclrt\") pod \"5fdf589e-17a2-4b53-b68c-f90e884b0080\" (UID: \"5fdf589e-17a2-4b53-b68c-f90e884b0080\") " Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.842934 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctbjx\" (UniqueName: \"kubernetes.io/projected/ecdb8e0b-8b04-4dc5-b532-8e68e8206122-kube-api-access-ctbjx\") pod \"ecdb8e0b-8b04-4dc5-b532-8e68e8206122\" (UID: \"ecdb8e0b-8b04-4dc5-b532-8e68e8206122\") " Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.842993 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fdf589e-17a2-4b53-b68c-f90e884b0080-config-data\") pod \"5fdf589e-17a2-4b53-b68c-f90e884b0080\" (UID: \"5fdf589e-17a2-4b53-b68c-f90e884b0080\") " Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.843041 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecdb8e0b-8b04-4dc5-b532-8e68e8206122-config-data\") pod \"ecdb8e0b-8b04-4dc5-b532-8e68e8206122\" (UID: \"ecdb8e0b-8b04-4dc5-b532-8e68e8206122\") " Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.843065 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5fdf589e-17a2-4b53-b68c-f90e884b0080-db-sync-config-data\") pod \"5fdf589e-17a2-4b53-b68c-f90e884b0080\" (UID: \"5fdf589e-17a2-4b53-b68c-f90e884b0080\") " Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.843113 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ecdb8e0b-8b04-4dc5-b532-8e68e8206122-etc-machine-id\") pod \"ecdb8e0b-8b04-4dc5-b532-8e68e8206122\" (UID: \"ecdb8e0b-8b04-4dc5-b532-8e68e8206122\") " Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.843188 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ecdb8e0b-8b04-4dc5-b532-8e68e8206122-db-sync-config-data\") pod 
\"ecdb8e0b-8b04-4dc5-b532-8e68e8206122\" (UID: \"ecdb8e0b-8b04-4dc5-b532-8e68e8206122\") " Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.851791 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ecdb8e0b-8b04-4dc5-b532-8e68e8206122-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ecdb8e0b-8b04-4dc5-b532-8e68e8206122" (UID: "ecdb8e0b-8b04-4dc5-b532-8e68e8206122"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.869919 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fdf589e-17a2-4b53-b68c-f90e884b0080-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5fdf589e-17a2-4b53-b68c-f90e884b0080" (UID: "5fdf589e-17a2-4b53-b68c-f90e884b0080"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.870326 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecdb8e0b-8b04-4dc5-b532-8e68e8206122-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ecdb8e0b-8b04-4dc5-b532-8e68e8206122" (UID: "ecdb8e0b-8b04-4dc5-b532-8e68e8206122"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.878971 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecdb8e0b-8b04-4dc5-b532-8e68e8206122-kube-api-access-ctbjx" (OuterVolumeSpecName: "kube-api-access-ctbjx") pod "ecdb8e0b-8b04-4dc5-b532-8e68e8206122" (UID: "ecdb8e0b-8b04-4dc5-b532-8e68e8206122"). InnerVolumeSpecName "kube-api-access-ctbjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.887607 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65cf4c8457-6ff7v" event={"ID":"6ea83627-fed8-458c-a39b-f73e682799d3","Type":"ContainerStarted","Data":"d93d62e92e521ae472fd48863ffbb775ce5c13d7f5320bdaa00d15373af442cd"} Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.892293 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-9559fbfd6-k4fwk" event={"ID":"c180d7c5-ad61-4190-b709-6efe6a9a2434","Type":"ContainerStarted","Data":"45553e355e616fbde1a8532e46ea35bfdb6598bb02d44d1b33acadcb179147ef"} Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.899023 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-flhtr" event={"ID":"5fdf589e-17a2-4b53-b68c-f90e884b0080","Type":"ContainerDied","Data":"d82c740942108715eb9de6acb2fc514b8fb7b59445145895894c094acf733e95"} Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.899069 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d82c740942108715eb9de6acb2fc514b8fb7b59445145895894c094acf733e95" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.899129 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-flhtr" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.904156 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecdb8e0b-8b04-4dc5-b532-8e68e8206122-scripts" (OuterVolumeSpecName: "scripts") pod "ecdb8e0b-8b04-4dc5-b532-8e68e8206122" (UID: "ecdb8e0b-8b04-4dc5-b532-8e68e8206122"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.911279 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fdf589e-17a2-4b53-b68c-f90e884b0080-kube-api-access-mclrt" (OuterVolumeSpecName: "kube-api-access-mclrt") pod "5fdf589e-17a2-4b53-b68c-f90e884b0080" (UID: "5fdf589e-17a2-4b53-b68c-f90e884b0080"). InnerVolumeSpecName "kube-api-access-mclrt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.914941 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nsw82" event={"ID":"ecdb8e0b-8b04-4dc5-b532-8e68e8206122","Type":"ContainerDied","Data":"a31c12eb0abbe5bc90138c9c3ce6ecd8e3dbcc71f6f7c7ec46d35df8c0ce1c3f"} Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.914980 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a31c12eb0abbe5bc90138c9c3ce6ecd8e3dbcc71f6f7c7ec46d35df8c0ce1c3f" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.914949 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-nsw82" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.922862 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"57a9a701-de78-4dc2-b8a7-365cd41a5693","Type":"ContainerStarted","Data":"a6f1f9ea4790955e8f3a5bc7b22a779a6721003774aab27f29b0f917186ed503"} Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.922921 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"57a9a701-de78-4dc2-b8a7-365cd41a5693","Type":"ContainerStarted","Data":"137442cc39a9b3fa2b4b9ffd51c3871f3d0b4c54206c34b4e3bb1e9e8f381638"} Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.927352 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86bbdbcfd-94wnf" event={"ID":"c9f65286-f806-4579-b98a-6a88a6dc8839","Type":"ContainerStarted","Data":"6ba8e20791a0ae9152c8f20f7c8fc9681a9d41da237dc1b7e3439ed320afbc9a"} Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.928707 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-599797ccb9-gvj5v" event={"ID":"3aae219f-2594-44f2-9f58-2b3149e8edbb","Type":"ContainerStarted","Data":"03f753b0f7eb11325b001e25fb351f70bc85e5e663bd9f94e6f8215b70a4fee1"} Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.951549 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mclrt\" (UniqueName: \"kubernetes.io/projected/5fdf589e-17a2-4b53-b68c-f90e884b0080-kube-api-access-mclrt\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.951586 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctbjx\" (UniqueName: \"kubernetes.io/projected/ecdb8e0b-8b04-4dc5-b532-8e68e8206122-kube-api-access-ctbjx\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.951595 4903 reconciler_common.go:293] "Volume detached for 
volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5fdf589e-17a2-4b53-b68c-f90e884b0080-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.951603 4903 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ecdb8e0b-8b04-4dc5-b532-8e68e8206122-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.951614 4903 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ecdb8e0b-8b04-4dc5-b532-8e68e8206122-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.951622 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecdb8e0b-8b04-4dc5-b532-8e68e8206122-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:23 crc kubenswrapper[4903]: I1202 23:17:23.981217 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fdf589e-17a2-4b53-b68c-f90e884b0080-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5fdf589e-17a2-4b53-b68c-f90e884b0080" (UID: "5fdf589e-17a2-4b53-b68c-f90e884b0080"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.018192 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecdb8e0b-8b04-4dc5-b532-8e68e8206122-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ecdb8e0b-8b04-4dc5-b532-8e68e8206122" (UID: "ecdb8e0b-8b04-4dc5-b532-8e68e8206122"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.053034 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fdf589e-17a2-4b53-b68c-f90e884b0080-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.053079 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecdb8e0b-8b04-4dc5-b532-8e68e8206122-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.079801 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecdb8e0b-8b04-4dc5-b532-8e68e8206122-config-data" (OuterVolumeSpecName: "config-data") pod "ecdb8e0b-8b04-4dc5-b532-8e68e8206122" (UID: "ecdb8e0b-8b04-4dc5-b532-8e68e8206122"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.114916 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fdf589e-17a2-4b53-b68c-f90e884b0080-config-data" (OuterVolumeSpecName: "config-data") pod "5fdf589e-17a2-4b53-b68c-f90e884b0080" (UID: "5fdf589e-17a2-4b53-b68c-f90e884b0080"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.152486 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.154957 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fdf589e-17a2-4b53-b68c-f90e884b0080-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.155002 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecdb8e0b-8b04-4dc5-b532-8e68e8206122-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.402741 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-599797ccb9-gvj5v"] Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.453703 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 23:17:24 crc kubenswrapper[4903]: E1202 23:17:24.454099 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fdf589e-17a2-4b53-b68c-f90e884b0080" containerName="glance-db-sync" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.454115 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fdf589e-17a2-4b53-b68c-f90e884b0080" containerName="glance-db-sync" Dec 02 23:17:24 crc kubenswrapper[4903]: E1202 23:17:24.454150 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecdb8e0b-8b04-4dc5-b532-8e68e8206122" containerName="cinder-db-sync" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.454158 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecdb8e0b-8b04-4dc5-b532-8e68e8206122" containerName="cinder-db-sync" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.454331 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecdb8e0b-8b04-4dc5-b532-8e68e8206122" containerName="cinder-db-sync" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.454346 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fdf589e-17a2-4b53-b68c-f90e884b0080" containerName="glance-db-sync" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.455264 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.460859 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84cbc79f6c-kc9ss"] Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.462367 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84cbc79f6c-kc9ss" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.465885 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-pdxtg" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.466027 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.466125 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.466248 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.474910 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.487493 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84cbc79f6c-kc9ss"] Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.571435 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrns8\" (UniqueName: \"kubernetes.io/projected/c35ad387-fed0-4cbc-9912-c17aab93860a-kube-api-access-mrns8\") pod \"cinder-scheduler-0\" (UID: \"c35ad387-fed0-4cbc-9912-c17aab93860a\") " pod="openstack/cinder-scheduler-0" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.571535 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c35ad387-fed0-4cbc-9912-c17aab93860a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c35ad387-fed0-4cbc-9912-c17aab93860a\") " pod="openstack/cinder-scheduler-0" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.571580 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c35ad387-fed0-4cbc-9912-c17aab93860a-config-data\") pod \"cinder-scheduler-0\" (UID: \"c35ad387-fed0-4cbc-9912-c17aab93860a\") " pod="openstack/cinder-scheduler-0" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.571609 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c35ad387-fed0-4cbc-9912-c17aab93860a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c35ad387-fed0-4cbc-9912-c17aab93860a\") " pod="openstack/cinder-scheduler-0" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.571674 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69a45323-9438-46ad-928b-6ce05cfe3b9d-config\") pod \"dnsmasq-dns-84cbc79f6c-kc9ss\" (UID: \"69a45323-9438-46ad-928b-6ce05cfe3b9d\") " pod="openstack/dnsmasq-dns-84cbc79f6c-kc9ss" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.571696 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c35ad387-fed0-4cbc-9912-c17aab93860a-scripts\") pod \"cinder-scheduler-0\" (UID: \"c35ad387-fed0-4cbc-9912-c17aab93860a\") " pod="openstack/cinder-scheduler-0" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.571726 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-t6v7v\" (UniqueName: \"kubernetes.io/projected/69a45323-9438-46ad-928b-6ce05cfe3b9d-kube-api-access-t6v7v\") pod \"dnsmasq-dns-84cbc79f6c-kc9ss\" (UID: \"69a45323-9438-46ad-928b-6ce05cfe3b9d\") " pod="openstack/dnsmasq-dns-84cbc79f6c-kc9ss" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.571780 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69a45323-9438-46ad-928b-6ce05cfe3b9d-dns-svc\") pod \"dnsmasq-dns-84cbc79f6c-kc9ss\" (UID: \"69a45323-9438-46ad-928b-6ce05cfe3b9d\") " pod="openstack/dnsmasq-dns-84cbc79f6c-kc9ss" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.571820 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69a45323-9438-46ad-928b-6ce05cfe3b9d-dns-swift-storage-0\") pod \"dnsmasq-dns-84cbc79f6c-kc9ss\" (UID: \"69a45323-9438-46ad-928b-6ce05cfe3b9d\") " pod="openstack/dnsmasq-dns-84cbc79f6c-kc9ss" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.571839 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69a45323-9438-46ad-928b-6ce05cfe3b9d-ovsdbserver-sb\") pod \"dnsmasq-dns-84cbc79f6c-kc9ss\" (UID: \"69a45323-9438-46ad-928b-6ce05cfe3b9d\") " pod="openstack/dnsmasq-dns-84cbc79f6c-kc9ss" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.571862 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69a45323-9438-46ad-928b-6ce05cfe3b9d-ovsdbserver-nb\") pod \"dnsmasq-dns-84cbc79f6c-kc9ss\" (UID: \"69a45323-9438-46ad-928b-6ce05cfe3b9d\") " pod="openstack/dnsmasq-dns-84cbc79f6c-kc9ss" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.571896 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c35ad387-fed0-4cbc-9912-c17aab93860a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c35ad387-fed0-4cbc-9912-c17aab93860a\") " pod="openstack/cinder-scheduler-0" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.622115 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84cbc79f6c-kc9ss"] Dec 02 23:17:24 crc kubenswrapper[4903]: E1202 23:17:24.626895 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc dns-swift-storage-0 kube-api-access-t6v7v ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-84cbc79f6c-kc9ss" podUID="69a45323-9438-46ad-928b-6ce05cfe3b9d" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.654536 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6ffc74bd45-2xt7m"] Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.656734 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6ffc74bd45-2xt7m" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.674970 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69a45323-9438-46ad-928b-6ce05cfe3b9d-dns-swift-storage-0\") pod \"dnsmasq-dns-84cbc79f6c-kc9ss\" (UID: \"69a45323-9438-46ad-928b-6ce05cfe3b9d\") " pod="openstack/dnsmasq-dns-84cbc79f6c-kc9ss" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.675007 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69a45323-9438-46ad-928b-6ce05cfe3b9d-ovsdbserver-sb\") pod \"dnsmasq-dns-84cbc79f6c-kc9ss\" (UID: \"69a45323-9438-46ad-928b-6ce05cfe3b9d\") " pod="openstack/dnsmasq-dns-84cbc79f6c-kc9ss" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.675033 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69a45323-9438-46ad-928b-6ce05cfe3b9d-ovsdbserver-nb\") pod \"dnsmasq-dns-84cbc79f6c-kc9ss\" (UID: \"69a45323-9438-46ad-928b-6ce05cfe3b9d\") " pod="openstack/dnsmasq-dns-84cbc79f6c-kc9ss" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.675068 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c35ad387-fed0-4cbc-9912-c17aab93860a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c35ad387-fed0-4cbc-9912-c17aab93860a\") " pod="openstack/cinder-scheduler-0" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.675089 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrns8\" (UniqueName: \"kubernetes.io/projected/c35ad387-fed0-4cbc-9912-c17aab93860a-kube-api-access-mrns8\") pod \"cinder-scheduler-0\" (UID: \"c35ad387-fed0-4cbc-9912-c17aab93860a\") " pod="openstack/cinder-scheduler-0" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.675132 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c35ad387-fed0-4cbc-9912-c17aab93860a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c35ad387-fed0-4cbc-9912-c17aab93860a\") " pod="openstack/cinder-scheduler-0" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.675167 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c35ad387-fed0-4cbc-9912-c17aab93860a-config-data\") pod \"cinder-scheduler-0\" (UID: \"c35ad387-fed0-4cbc-9912-c17aab93860a\") " pod="openstack/cinder-scheduler-0" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.675186 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c35ad387-fed0-4cbc-9912-c17aab93860a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c35ad387-fed0-4cbc-9912-c17aab93860a\") " pod="openstack/cinder-scheduler-0" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.675225 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69a45323-9438-46ad-928b-6ce05cfe3b9d-config\") pod \"dnsmasq-dns-84cbc79f6c-kc9ss\" (UID: \"69a45323-9438-46ad-928b-6ce05cfe3b9d\") " pod="openstack/dnsmasq-dns-84cbc79f6c-kc9ss" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.675247 4903 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c35ad387-fed0-4cbc-9912-c17aab93860a-scripts\") pod \"cinder-scheduler-0\" (UID: \"c35ad387-fed0-4cbc-9912-c17aab93860a\") " pod="openstack/cinder-scheduler-0" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.675267 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6v7v\" (UniqueName: \"kubernetes.io/projected/69a45323-9438-46ad-928b-6ce05cfe3b9d-kube-api-access-t6v7v\") pod \"dnsmasq-dns-84cbc79f6c-kc9ss\" (UID: \"69a45323-9438-46ad-928b-6ce05cfe3b9d\") " pod="openstack/dnsmasq-dns-84cbc79f6c-kc9ss" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.675312 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69a45323-9438-46ad-928b-6ce05cfe3b9d-dns-svc\") pod \"dnsmasq-dns-84cbc79f6c-kc9ss\" (UID: \"69a45323-9438-46ad-928b-6ce05cfe3b9d\") " pod="openstack/dnsmasq-dns-84cbc79f6c-kc9ss" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.675797 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c35ad387-fed0-4cbc-9912-c17aab93860a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c35ad387-fed0-4cbc-9912-c17aab93860a\") " pod="openstack/cinder-scheduler-0" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.676280 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69a45323-9438-46ad-928b-6ce05cfe3b9d-dns-svc\") pod \"dnsmasq-dns-84cbc79f6c-kc9ss\" (UID: \"69a45323-9438-46ad-928b-6ce05cfe3b9d\") " pod="openstack/dnsmasq-dns-84cbc79f6c-kc9ss" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.676501 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69a45323-9438-46ad-928b-6ce05cfe3b9d-dns-swift-storage-0\") pod \"dnsmasq-dns-84cbc79f6c-kc9ss\" (UID: \"69a45323-9438-46ad-928b-6ce05cfe3b9d\") " pod="openstack/dnsmasq-dns-84cbc79f6c-kc9ss" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.677086 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69a45323-9438-46ad-928b-6ce05cfe3b9d-ovsdbserver-sb\") pod \"dnsmasq-dns-84cbc79f6c-kc9ss\" (UID: \"69a45323-9438-46ad-928b-6ce05cfe3b9d\") " pod="openstack/dnsmasq-dns-84cbc79f6c-kc9ss" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.677607 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69a45323-9438-46ad-928b-6ce05cfe3b9d-ovsdbserver-nb\") pod \"dnsmasq-dns-84cbc79f6c-kc9ss\" (UID: \"69a45323-9438-46ad-928b-6ce05cfe3b9d\") " pod="openstack/dnsmasq-dns-84cbc79f6c-kc9ss" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.678076 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69a45323-9438-46ad-928b-6ce05cfe3b9d-config\") pod \"dnsmasq-dns-84cbc79f6c-kc9ss\" (UID: \"69a45323-9438-46ad-928b-6ce05cfe3b9d\") " pod="openstack/dnsmasq-dns-84cbc79f6c-kc9ss" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.688022 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c35ad387-fed0-4cbc-9912-c17aab93860a-config-data\") pod 
\"cinder-scheduler-0\" (UID: \"c35ad387-fed0-4cbc-9912-c17aab93860a\") " pod="openstack/cinder-scheduler-0" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.690859 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c35ad387-fed0-4cbc-9912-c17aab93860a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c35ad387-fed0-4cbc-9912-c17aab93860a\") " pod="openstack/cinder-scheduler-0" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.692815 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c35ad387-fed0-4cbc-9912-c17aab93860a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c35ad387-fed0-4cbc-9912-c17aab93860a\") " pod="openstack/cinder-scheduler-0" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.696120 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c35ad387-fed0-4cbc-9912-c17aab93860a-scripts\") pod \"cinder-scheduler-0\" (UID: \"c35ad387-fed0-4cbc-9912-c17aab93860a\") " pod="openstack/cinder-scheduler-0" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.708412 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6ffc74bd45-2xt7m"] Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.744053 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6v7v\" (UniqueName: \"kubernetes.io/projected/69a45323-9438-46ad-928b-6ce05cfe3b9d-kube-api-access-t6v7v\") pod \"dnsmasq-dns-84cbc79f6c-kc9ss\" (UID: \"69a45323-9438-46ad-928b-6ce05cfe3b9d\") " pod="openstack/dnsmasq-dns-84cbc79f6c-kc9ss" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.755741 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrns8\" (UniqueName: \"kubernetes.io/projected/c35ad387-fed0-4cbc-9912-c17aab93860a-kube-api-access-mrns8\") pod \"cinder-scheduler-0\" (UID: \"c35ad387-fed0-4cbc-9912-c17aab93860a\") " pod="openstack/cinder-scheduler-0" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.779957 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgbpz\" (UniqueName: \"kubernetes.io/projected/c9ae404e-88a5-4e23-a7f8-e2e1198cfc27-kube-api-access-kgbpz\") pod \"dnsmasq-dns-6ffc74bd45-2xt7m\" (UID: \"c9ae404e-88a5-4e23-a7f8-e2e1198cfc27\") " pod="openstack/dnsmasq-dns-6ffc74bd45-2xt7m" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.780020 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c9ae404e-88a5-4e23-a7f8-e2e1198cfc27-ovsdbserver-sb\") pod \"dnsmasq-dns-6ffc74bd45-2xt7m\" (UID: \"c9ae404e-88a5-4e23-a7f8-e2e1198cfc27\") " pod="openstack/dnsmasq-dns-6ffc74bd45-2xt7m" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.780132 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c9ae404e-88a5-4e23-a7f8-e2e1198cfc27-dns-swift-storage-0\") pod \"dnsmasq-dns-6ffc74bd45-2xt7m\" (UID: \"c9ae404e-88a5-4e23-a7f8-e2e1198cfc27\") " pod="openstack/dnsmasq-dns-6ffc74bd45-2xt7m" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.780201 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c9ae404e-88a5-4e23-a7f8-e2e1198cfc27-config\") pod \"dnsmasq-dns-6ffc74bd45-2xt7m\" (UID: \"c9ae404e-88a5-4e23-a7f8-e2e1198cfc27\") " pod="openstack/dnsmasq-dns-6ffc74bd45-2xt7m" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.780219 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c9ae404e-88a5-4e23-a7f8-e2e1198cfc27-ovsdbserver-nb\") pod \"dnsmasq-dns-6ffc74bd45-2xt7m\" (UID: \"c9ae404e-88a5-4e23-a7f8-e2e1198cfc27\") " pod="openstack/dnsmasq-dns-6ffc74bd45-2xt7m" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.780304 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c9ae404e-88a5-4e23-a7f8-e2e1198cfc27-dns-svc\") pod \"dnsmasq-dns-6ffc74bd45-2xt7m\" (UID: \"c9ae404e-88a5-4e23-a7f8-e2e1198cfc27\") " pod="openstack/dnsmasq-dns-6ffc74bd45-2xt7m" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.835755 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.837540 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.842697 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.859017 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.881417 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9ae404e-88a5-4e23-a7f8-e2e1198cfc27-config\") pod \"dnsmasq-dns-6ffc74bd45-2xt7m\" (UID: \"c9ae404e-88a5-4e23-a7f8-e2e1198cfc27\") " pod="openstack/dnsmasq-dns-6ffc74bd45-2xt7m" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.881457 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c9ae404e-88a5-4e23-a7f8-e2e1198cfc27-ovsdbserver-nb\") pod \"dnsmasq-dns-6ffc74bd45-2xt7m\" (UID: \"c9ae404e-88a5-4e23-a7f8-e2e1198cfc27\") " pod="openstack/dnsmasq-dns-6ffc74bd45-2xt7m" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.881526 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c9ae404e-88a5-4e23-a7f8-e2e1198cfc27-dns-svc\") pod \"dnsmasq-dns-6ffc74bd45-2xt7m\" (UID: \"c9ae404e-88a5-4e23-a7f8-e2e1198cfc27\") " pod="openstack/dnsmasq-dns-6ffc74bd45-2xt7m" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.881596 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgbpz\" (UniqueName: \"kubernetes.io/projected/c9ae404e-88a5-4e23-a7f8-e2e1198cfc27-kube-api-access-kgbpz\") pod \"dnsmasq-dns-6ffc74bd45-2xt7m\" (UID: \"c9ae404e-88a5-4e23-a7f8-e2e1198cfc27\") " pod="openstack/dnsmasq-dns-6ffc74bd45-2xt7m" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.881622 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c9ae404e-88a5-4e23-a7f8-e2e1198cfc27-ovsdbserver-sb\") pod \"dnsmasq-dns-6ffc74bd45-2xt7m\" (UID: \"c9ae404e-88a5-4e23-a7f8-e2e1198cfc27\") " 
pod="openstack/dnsmasq-dns-6ffc74bd45-2xt7m" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.881684 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c9ae404e-88a5-4e23-a7f8-e2e1198cfc27-dns-swift-storage-0\") pod \"dnsmasq-dns-6ffc74bd45-2xt7m\" (UID: \"c9ae404e-88a5-4e23-a7f8-e2e1198cfc27\") " pod="openstack/dnsmasq-dns-6ffc74bd45-2xt7m" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.882702 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c9ae404e-88a5-4e23-a7f8-e2e1198cfc27-dns-svc\") pod \"dnsmasq-dns-6ffc74bd45-2xt7m\" (UID: \"c9ae404e-88a5-4e23-a7f8-e2e1198cfc27\") " pod="openstack/dnsmasq-dns-6ffc74bd45-2xt7m" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.882814 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c9ae404e-88a5-4e23-a7f8-e2e1198cfc27-dns-swift-storage-0\") pod \"dnsmasq-dns-6ffc74bd45-2xt7m\" (UID: \"c9ae404e-88a5-4e23-a7f8-e2e1198cfc27\") " pod="openstack/dnsmasq-dns-6ffc74bd45-2xt7m" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.882828 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9ae404e-88a5-4e23-a7f8-e2e1198cfc27-config\") pod \"dnsmasq-dns-6ffc74bd45-2xt7m\" (UID: \"c9ae404e-88a5-4e23-a7f8-e2e1198cfc27\") " pod="openstack/dnsmasq-dns-6ffc74bd45-2xt7m" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.882912 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c9ae404e-88a5-4e23-a7f8-e2e1198cfc27-ovsdbserver-sb\") pod \"dnsmasq-dns-6ffc74bd45-2xt7m\" (UID: \"c9ae404e-88a5-4e23-a7f8-e2e1198cfc27\") " pod="openstack/dnsmasq-dns-6ffc74bd45-2xt7m" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.883148 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.884081 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c9ae404e-88a5-4e23-a7f8-e2e1198cfc27-ovsdbserver-nb\") pod \"dnsmasq-dns-6ffc74bd45-2xt7m\" (UID: \"c9ae404e-88a5-4e23-a7f8-e2e1198cfc27\") " pod="openstack/dnsmasq-dns-6ffc74bd45-2xt7m" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.914407 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgbpz\" (UniqueName: \"kubernetes.io/projected/c9ae404e-88a5-4e23-a7f8-e2e1198cfc27-kube-api-access-kgbpz\") pod \"dnsmasq-dns-6ffc74bd45-2xt7m\" (UID: \"c9ae404e-88a5-4e23-a7f8-e2e1198cfc27\") " pod="openstack/dnsmasq-dns-6ffc74bd45-2xt7m" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.946907 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86bbdbcfd-94wnf" event={"ID":"c9f65286-f806-4579-b98a-6a88a6dc8839","Type":"ContainerStarted","Data":"06c700a4fc42d3fa44fbb7ca68913f255e39c5293e1806aeb8effea15d97b173"} Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.950743 4903 generic.go:334] "Generic (PLEG): container finished" podID="3aae219f-2594-44f2-9f58-2b3149e8edbb" containerID="07f1231c8c6eb86f8ebc21af7dcee27778c48877bc0b2d6b5eb2ba30f737ba44" exitCode=0 Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.950791 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-599797ccb9-gvj5v" event={"ID":"3aae219f-2594-44f2-9f58-2b3149e8edbb","Type":"ContainerDied","Data":"07f1231c8c6eb86f8ebc21af7dcee27778c48877bc0b2d6b5eb2ba30f737ba44"} Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.977185 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84cbc79f6c-kc9ss" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.977669 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"57a9a701-de78-4dc2-b8a7-365cd41a5693","Type":"ContainerStarted","Data":"0453471d4a929a8d9876c77e926dcd3a4de17cf4a5f703adfa3c6440918a63d4"} Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.978193 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.983768 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="57a9a701-de78-4dc2-b8a7-365cd41a5693" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.168:9322/\": dial tcp 10.217.0.168:9322: connect: connection refused" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.984854 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a5779cf-3693-4b42-8726-363b548c4071-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7a5779cf-3693-4b42-8726-363b548c4071\") " pod="openstack/cinder-api-0" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.984907 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzm6g\" (UniqueName: \"kubernetes.io/projected/7a5779cf-3693-4b42-8726-363b548c4071-kube-api-access-tzm6g\") pod \"cinder-api-0\" (UID: \"7a5779cf-3693-4b42-8726-363b548c4071\") " pod="openstack/cinder-api-0" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.984956 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a5779cf-3693-4b42-8726-363b548c4071-config-data-custom\") pod \"cinder-api-0\" (UID: \"7a5779cf-3693-4b42-8726-363b548c4071\") " pod="openstack/cinder-api-0" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.985009 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a5779cf-3693-4b42-8726-363b548c4071-logs\") pod \"cinder-api-0\" (UID: \"7a5779cf-3693-4b42-8726-363b548c4071\") " pod="openstack/cinder-api-0" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.985066 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a5779cf-3693-4b42-8726-363b548c4071-scripts\") pod \"cinder-api-0\" (UID: \"7a5779cf-3693-4b42-8726-363b548c4071\") " pod="openstack/cinder-api-0" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.985095 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a5779cf-3693-4b42-8726-363b548c4071-config-data\") pod \"cinder-api-0\" (UID: \"7a5779cf-3693-4b42-8726-363b548c4071\") " pod="openstack/cinder-api-0" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.985186 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a5779cf-3693-4b42-8726-363b548c4071-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7a5779cf-3693-4b42-8726-363b548c4071\") " pod="openstack/cinder-api-0" Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.986129 
Dec 02 23:17:24 crc kubenswrapper[4903]: I1202 23:17:24.999001 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=3.998983689 podStartE2EDuration="3.998983689s" podCreationTimestamp="2025-12-02 23:17:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:17:24.996150462 +0000 UTC m=+1183.704704765" watchObservedRunningTime="2025-12-02 23:17:24.998983689 +0000 UTC m=+1183.707537972"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.009768 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84cbc79f6c-kc9ss"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.086580 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69a45323-9438-46ad-928b-6ce05cfe3b9d-dns-swift-storage-0\") pod \"69a45323-9438-46ad-928b-6ce05cfe3b9d\" (UID: \"69a45323-9438-46ad-928b-6ce05cfe3b9d\") "
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.086706 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69a45323-9438-46ad-928b-6ce05cfe3b9d-ovsdbserver-nb\") pod \"69a45323-9438-46ad-928b-6ce05cfe3b9d\" (UID: \"69a45323-9438-46ad-928b-6ce05cfe3b9d\") "
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.086734 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6v7v\" (UniqueName: \"kubernetes.io/projected/69a45323-9438-46ad-928b-6ce05cfe3b9d-kube-api-access-t6v7v\") pod \"69a45323-9438-46ad-928b-6ce05cfe3b9d\" (UID: \"69a45323-9438-46ad-928b-6ce05cfe3b9d\") "
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.086861 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69a45323-9438-46ad-928b-6ce05cfe3b9d-ovsdbserver-sb\") pod \"69a45323-9438-46ad-928b-6ce05cfe3b9d\" (UID: \"69a45323-9438-46ad-928b-6ce05cfe3b9d\") "
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.086966 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69a45323-9438-46ad-928b-6ce05cfe3b9d-dns-svc\") pod \"69a45323-9438-46ad-928b-6ce05cfe3b9d\" (UID: \"69a45323-9438-46ad-928b-6ce05cfe3b9d\") "
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.087042 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69a45323-9438-46ad-928b-6ce05cfe3b9d-config\") pod \"69a45323-9438-46ad-928b-6ce05cfe3b9d\" (UID: \"69a45323-9438-46ad-928b-6ce05cfe3b9d\") "
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.087511 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a5779cf-3693-4b42-8726-363b548c4071-config-data-custom\") pod \"cinder-api-0\" (UID: \"7a5779cf-3693-4b42-8726-363b548c4071\") " pod="openstack/cinder-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.088681 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69a45323-9438-46ad-928b-6ce05cfe3b9d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "69a45323-9438-46ad-928b-6ce05cfe3b9d" (UID: "69a45323-9438-46ad-928b-6ce05cfe3b9d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.088956 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69a45323-9438-46ad-928b-6ce05cfe3b9d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "69a45323-9438-46ad-928b-6ce05cfe3b9d" (UID: "69a45323-9438-46ad-928b-6ce05cfe3b9d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.089504 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69a45323-9438-46ad-928b-6ce05cfe3b9d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "69a45323-9438-46ad-928b-6ce05cfe3b9d" (UID: "69a45323-9438-46ad-928b-6ce05cfe3b9d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.089744 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69a45323-9438-46ad-928b-6ce05cfe3b9d-config" (OuterVolumeSpecName: "config") pod "69a45323-9438-46ad-928b-6ce05cfe3b9d" (UID: "69a45323-9438-46ad-928b-6ce05cfe3b9d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.090316 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a5779cf-3693-4b42-8726-363b548c4071-logs\") pod \"cinder-api-0\" (UID: \"7a5779cf-3693-4b42-8726-363b548c4071\") " pod="openstack/cinder-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.090802 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69a45323-9438-46ad-928b-6ce05cfe3b9d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "69a45323-9438-46ad-928b-6ce05cfe3b9d" (UID: "69a45323-9438-46ad-928b-6ce05cfe3b9d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.096011 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69a45323-9438-46ad-928b-6ce05cfe3b9d-kube-api-access-t6v7v" (OuterVolumeSpecName: "kube-api-access-t6v7v") pod "69a45323-9438-46ad-928b-6ce05cfe3b9d" (UID: "69a45323-9438-46ad-928b-6ce05cfe3b9d"). InnerVolumeSpecName "kube-api-access-t6v7v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.096329 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a5779cf-3693-4b42-8726-363b548c4071-config-data-custom\") pod \"cinder-api-0\" (UID: \"7a5779cf-3693-4b42-8726-363b548c4071\") " pod="openstack/cinder-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.101680 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a5779cf-3693-4b42-8726-363b548c4071-logs\") pod \"cinder-api-0\" (UID: \"7a5779cf-3693-4b42-8726-363b548c4071\") " pod="openstack/cinder-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.102048 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a5779cf-3693-4b42-8726-363b548c4071-scripts\") pod \"cinder-api-0\" (UID: \"7a5779cf-3693-4b42-8726-363b548c4071\") " pod="openstack/cinder-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.102145 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a5779cf-3693-4b42-8726-363b548c4071-config-data\") pod \"cinder-api-0\" (UID: \"7a5779cf-3693-4b42-8726-363b548c4071\") " pod="openstack/cinder-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.102791 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a5779cf-3693-4b42-8726-363b548c4071-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7a5779cf-3693-4b42-8726-363b548c4071\") " pod="openstack/cinder-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.106178 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a5779cf-3693-4b42-8726-363b548c4071-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7a5779cf-3693-4b42-8726-363b548c4071\") " pod="openstack/cinder-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.106237 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzm6g\" (UniqueName: \"kubernetes.io/projected/7a5779cf-3693-4b42-8726-363b548c4071-kube-api-access-tzm6g\") pod \"cinder-api-0\" (UID: \"7a5779cf-3693-4b42-8726-363b548c4071\") " pod="openstack/cinder-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.106611 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a5779cf-3693-4b42-8726-363b548c4071-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7a5779cf-3693-4b42-8726-363b548c4071\") " pod="openstack/cinder-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.107086 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a5779cf-3693-4b42-8726-363b548c4071-scripts\") pod \"cinder-api-0\" (UID: \"7a5779cf-3693-4b42-8726-363b548c4071\") " pod="openstack/cinder-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.107149 4903 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69a45323-9438-46ad-928b-6ce05cfe3b9d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.107186 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69a45323-9438-46ad-928b-6ce05cfe3b9d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.107200 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6v7v\" (UniqueName: \"kubernetes.io/projected/69a45323-9438-46ad-928b-6ce05cfe3b9d-kube-api-access-t6v7v\") on node \"crc\" DevicePath \"\""
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.107473 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69a45323-9438-46ad-928b-6ce05cfe3b9d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.107514 4903 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69a45323-9438-46ad-928b-6ce05cfe3b9d-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.107526 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69a45323-9438-46ad-928b-6ce05cfe3b9d-config\") on node \"crc\" DevicePath \"\""
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.108168 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a5779cf-3693-4b42-8726-363b548c4071-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7a5779cf-3693-4b42-8726-363b548c4071\") " pod="openstack/cinder-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.115325 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a5779cf-3693-4b42-8726-363b548c4071-config-data\") pod \"cinder-api-0\" (UID: \"7a5779cf-3693-4b42-8726-363b548c4071\") " pod="openstack/cinder-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.128534 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzm6g\" (UniqueName: \"kubernetes.io/projected/7a5779cf-3693-4b42-8726-363b548c4071-kube-api-access-tzm6g\") pod \"cinder-api-0\" (UID: \"7a5779cf-3693-4b42-8726-363b548c4071\") " pod="openstack/cinder-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.163234 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ffc74bd45-2xt7m"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.188095 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.226733 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.228424 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.231539 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-h4s5l"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.231784 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.231902 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.256336 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.324055 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"7fbb83df-23bf-40a5-a3a7-ceafac9d783e\") " pod="openstack/glance-default-external-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.324190 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shngg\" (UniqueName: \"kubernetes.io/projected/7fbb83df-23bf-40a5-a3a7-ceafac9d783e-kube-api-access-shngg\") pod \"glance-default-external-api-0\" (UID: \"7fbb83df-23bf-40a5-a3a7-ceafac9d783e\") " pod="openstack/glance-default-external-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.324230 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fbb83df-23bf-40a5-a3a7-ceafac9d783e-config-data\") pod \"glance-default-external-api-0\" (UID: \"7fbb83df-23bf-40a5-a3a7-ceafac9d783e\") " pod="openstack/glance-default-external-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.324269 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fbb83df-23bf-40a5-a3a7-ceafac9d783e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7fbb83df-23bf-40a5-a3a7-ceafac9d783e\") " pod="openstack/glance-default-external-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.324302 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fbb83df-23bf-40a5-a3a7-ceafac9d783e-logs\") pod \"glance-default-external-api-0\" (UID: \"7fbb83df-23bf-40a5-a3a7-ceafac9d783e\") " pod="openstack/glance-default-external-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.324351 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fbb83df-23bf-40a5-a3a7-ceafac9d783e-scripts\") pod \"glance-default-external-api-0\" (UID: \"7fbb83df-23bf-40a5-a3a7-ceafac9d783e\") " pod="openstack/glance-default-external-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.324468 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7fbb83df-23bf-40a5-a3a7-ceafac9d783e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7fbb83df-23bf-40a5-a3a7-ceafac9d783e\") " pod="openstack/glance-default-external-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.426739 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fbb83df-23bf-40a5-a3a7-ceafac9d783e-logs\") pod \"glance-default-external-api-0\" (UID: \"7fbb83df-23bf-40a5-a3a7-ceafac9d783e\") " pod="openstack/glance-default-external-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.426816 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fbb83df-23bf-40a5-a3a7-ceafac9d783e-scripts\") pod \"glance-default-external-api-0\" (UID: \"7fbb83df-23bf-40a5-a3a7-ceafac9d783e\") " pod="openstack/glance-default-external-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.426900 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7fbb83df-23bf-40a5-a3a7-ceafac9d783e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7fbb83df-23bf-40a5-a3a7-ceafac9d783e\") " pod="openstack/glance-default-external-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.426959 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"7fbb83df-23bf-40a5-a3a7-ceafac9d783e\") " pod="openstack/glance-default-external-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.426999 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shngg\" (UniqueName: \"kubernetes.io/projected/7fbb83df-23bf-40a5-a3a7-ceafac9d783e-kube-api-access-shngg\") pod \"glance-default-external-api-0\" (UID: \"7fbb83df-23bf-40a5-a3a7-ceafac9d783e\") " pod="openstack/glance-default-external-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.427043 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fbb83df-23bf-40a5-a3a7-ceafac9d783e-config-data\") pod \"glance-default-external-api-0\" (UID: \"7fbb83df-23bf-40a5-a3a7-ceafac9d783e\") " pod="openstack/glance-default-external-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.427061 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fbb83df-23bf-40a5-a3a7-ceafac9d783e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7fbb83df-23bf-40a5-a3a7-ceafac9d783e\") " pod="openstack/glance-default-external-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.430961 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7fbb83df-23bf-40a5-a3a7-ceafac9d783e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7fbb83df-23bf-40a5-a3a7-ceafac9d783e\") " pod="openstack/glance-default-external-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.431043 4903 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"7fbb83df-23bf-40a5-a3a7-ceafac9d783e\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.431350 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fbb83df-23bf-40a5-a3a7-ceafac9d783e-logs\") pod \"glance-default-external-api-0\" (UID: \"7fbb83df-23bf-40a5-a3a7-ceafac9d783e\") " pod="openstack/glance-default-external-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.437618 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fbb83df-23bf-40a5-a3a7-ceafac9d783e-scripts\") pod \"glance-default-external-api-0\" (UID: \"7fbb83df-23bf-40a5-a3a7-ceafac9d783e\") " pod="openstack/glance-default-external-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.437669 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fbb83df-23bf-40a5-a3a7-ceafac9d783e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7fbb83df-23bf-40a5-a3a7-ceafac9d783e\") " pod="openstack/glance-default-external-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.441997 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fbb83df-23bf-40a5-a3a7-ceafac9d783e-config-data\") pod \"glance-default-external-api-0\" (UID: \"7fbb83df-23bf-40a5-a3a7-ceafac9d783e\") " pod="openstack/glance-default-external-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.451510 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shngg\" (UniqueName: \"kubernetes.io/projected/7fbb83df-23bf-40a5-a3a7-ceafac9d783e-kube-api-access-shngg\") pod \"glance-default-external-api-0\" (UID: \"7fbb83df-23bf-40a5-a3a7-ceafac9d783e\") " pod="openstack/glance-default-external-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.466799 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"7fbb83df-23bf-40a5-a3a7-ceafac9d783e\") " pod="openstack/glance-default-external-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.593298 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.780147 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.781780 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.787116 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.794788 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.936571 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf18444a-84f7-4d6a-85ac-7b0a75776ebc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cf18444a-84f7-4d6a-85ac-7b0a75776ebc\") " pod="openstack/glance-default-internal-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.936719 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"cf18444a-84f7-4d6a-85ac-7b0a75776ebc\") " pod="openstack/glance-default-internal-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.937197 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf18444a-84f7-4d6a-85ac-7b0a75776ebc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cf18444a-84f7-4d6a-85ac-7b0a75776ebc\") " pod="openstack/glance-default-internal-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.937595 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhzrs\" (UniqueName: \"kubernetes.io/projected/cf18444a-84f7-4d6a-85ac-7b0a75776ebc-kube-api-access-jhzrs\") pod \"glance-default-internal-api-0\" (UID: \"cf18444a-84f7-4d6a-85ac-7b0a75776ebc\") " pod="openstack/glance-default-internal-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.937705 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf18444a-84f7-4d6a-85ac-7b0a75776ebc-logs\") pod \"glance-default-internal-api-0\" (UID: \"cf18444a-84f7-4d6a-85ac-7b0a75776ebc\") " pod="openstack/glance-default-internal-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.937764 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf18444a-84f7-4d6a-85ac-7b0a75776ebc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cf18444a-84f7-4d6a-85ac-7b0a75776ebc\") " pod="openstack/glance-default-internal-api-0"
Dec 02 23:17:25 crc kubenswrapper[4903]: I1202 23:17:25.937832 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf18444a-84f7-4d6a-85ac-7b0a75776ebc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cf18444a-84f7-4d6a-85ac-7b0a75776ebc\") " pod="openstack/glance-default-internal-api-0"
Need to start a new one" pod="openstack/dnsmasq-dns-84cbc79f6c-kc9ss" Dec 02 23:17:26 crc kubenswrapper[4903]: I1202 23:17:26.033479 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84cbc79f6c-kc9ss"] Dec 02 23:17:26 crc kubenswrapper[4903]: I1202 23:17:26.040070 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf18444a-84f7-4d6a-85ac-7b0a75776ebc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cf18444a-84f7-4d6a-85ac-7b0a75776ebc\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:17:26 crc kubenswrapper[4903]: I1202 23:17:26.040165 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"cf18444a-84f7-4d6a-85ac-7b0a75776ebc\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:17:26 crc kubenswrapper[4903]: I1202 23:17:26.040251 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf18444a-84f7-4d6a-85ac-7b0a75776ebc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cf18444a-84f7-4d6a-85ac-7b0a75776ebc\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:17:26 crc kubenswrapper[4903]: I1202 23:17:26.040400 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhzrs\" (UniqueName: \"kubernetes.io/projected/cf18444a-84f7-4d6a-85ac-7b0a75776ebc-kube-api-access-jhzrs\") pod \"glance-default-internal-api-0\" (UID: \"cf18444a-84f7-4d6a-85ac-7b0a75776ebc\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:17:26 crc kubenswrapper[4903]: I1202 23:17:26.040437 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf18444a-84f7-4d6a-85ac-7b0a75776ebc-logs\") pod \"glance-default-internal-api-0\" (UID: \"cf18444a-84f7-4d6a-85ac-7b0a75776ebc\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:17:26 crc kubenswrapper[4903]: I1202 23:17:26.040472 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf18444a-84f7-4d6a-85ac-7b0a75776ebc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cf18444a-84f7-4d6a-85ac-7b0a75776ebc\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:17:26 crc kubenswrapper[4903]: I1202 23:17:26.040523 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf18444a-84f7-4d6a-85ac-7b0a75776ebc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cf18444a-84f7-4d6a-85ac-7b0a75776ebc\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:17:26 crc kubenswrapper[4903]: I1202 23:17:26.041117 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf18444a-84f7-4d6a-85ac-7b0a75776ebc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cf18444a-84f7-4d6a-85ac-7b0a75776ebc\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:17:26 crc kubenswrapper[4903]: I1202 23:17:26.041742 4903 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" 
(UID: \"cf18444a-84f7-4d6a-85ac-7b0a75776ebc\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Dec 02 23:17:26 crc kubenswrapper[4903]: I1202 23:17:26.042231 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf18444a-84f7-4d6a-85ac-7b0a75776ebc-logs\") pod \"glance-default-internal-api-0\" (UID: \"cf18444a-84f7-4d6a-85ac-7b0a75776ebc\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:17:26 crc kubenswrapper[4903]: I1202 23:17:26.048445 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf18444a-84f7-4d6a-85ac-7b0a75776ebc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cf18444a-84f7-4d6a-85ac-7b0a75776ebc\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:17:26 crc kubenswrapper[4903]: I1202 23:17:26.049898 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84cbc79f6c-kc9ss"] Dec 02 23:17:26 crc kubenswrapper[4903]: I1202 23:17:26.051832 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf18444a-84f7-4d6a-85ac-7b0a75776ebc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cf18444a-84f7-4d6a-85ac-7b0a75776ebc\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:17:26 crc kubenswrapper[4903]: I1202 23:17:26.052345 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf18444a-84f7-4d6a-85ac-7b0a75776ebc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cf18444a-84f7-4d6a-85ac-7b0a75776ebc\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:17:26 crc kubenswrapper[4903]: I1202 23:17:26.059963 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhzrs\" (UniqueName: \"kubernetes.io/projected/cf18444a-84f7-4d6a-85ac-7b0a75776ebc-kube-api-access-jhzrs\") pod \"glance-default-internal-api-0\" (UID: \"cf18444a-84f7-4d6a-85ac-7b0a75776ebc\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:17:26 crc kubenswrapper[4903]: I1202 23:17:26.083702 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"cf18444a-84f7-4d6a-85ac-7b0a75776ebc\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:17:26 crc kubenswrapper[4903]: I1202 23:17:26.120910 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 23:17:26 crc kubenswrapper[4903]: I1202 23:17:26.682998 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6ffc74bd45-2xt7m"] Dec 02 23:17:26 crc kubenswrapper[4903]: W1202 23:17:26.715140 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9ae404e_88a5_4e23_a7f8_e2e1198cfc27.slice/crio-9860acf549d083691ebe1ad6e3a08184125470eb8f1ed3335a55b825ddf99c13 WatchSource:0}: Error finding container 9860acf549d083691ebe1ad6e3a08184125470eb8f1ed3335a55b825ddf99c13: Status 404 returned error can't find the container with id 9860acf549d083691ebe1ad6e3a08184125470eb8f1ed3335a55b825ddf99c13 Dec 02 23:17:26 crc kubenswrapper[4903]: I1202 23:17:26.925716 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 02 23:17:26 crc kubenswrapper[4903]: I1202 23:17:26.945958 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 23:17:27 crc kubenswrapper[4903]: I1202 23:17:27.009510 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffc74bd45-2xt7m" event={"ID":"c9ae404e-88a5-4e23-a7f8-e2e1198cfc27","Type":"ContainerStarted","Data":"9860acf549d083691ebe1ad6e3a08184125470eb8f1ed3335a55b825ddf99c13"} Dec 02 23:17:27 crc kubenswrapper[4903]: I1202 23:17:27.015794 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a0cf0be-7840-4684-ada4-16ea2d6351f3","Type":"ContainerStarted","Data":"779059428bd547a3eafc17465247cbc45952ca7ca3e2507e08d1480a6ec5ff78"} Dec 02 23:17:27 crc kubenswrapper[4903]: I1202 23:17:27.016388 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 23:17:27 crc kubenswrapper[4903]: I1202 23:17:27.022414 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7a5779cf-3693-4b42-8726-363b548c4071","Type":"ContainerStarted","Data":"bb1e9b0b5dc665fd127a4253fbf40e3901c1b7ba5e2fb1fc09297cebbb951d47"} Dec 02 23:17:27 crc kubenswrapper[4903]: I1202 23:17:27.030374 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86bbdbcfd-94wnf" event={"ID":"c9f65286-f806-4579-b98a-6a88a6dc8839","Type":"ContainerStarted","Data":"79c834c82ba85613ce9c953f0e3876400f0d14d0c1a29b707525256cb9663d68"} Dec 02 23:17:27 crc kubenswrapper[4903]: I1202 23:17:27.032216 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-86bbdbcfd-94wnf" Dec 02 23:17:27 crc kubenswrapper[4903]: I1202 23:17:27.032266 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-86bbdbcfd-94wnf" Dec 02 23:17:27 crc kubenswrapper[4903]: I1202 23:17:27.061643 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-599797ccb9-gvj5v" event={"ID":"3aae219f-2594-44f2-9f58-2b3149e8edbb","Type":"ContainerStarted","Data":"c02ac9548211b7e04f40376bc0fa9d4c8b23cc06e015a759e3f05605a724e2a4"} Dec 02 23:17:27 crc kubenswrapper[4903]: I1202 23:17:27.061910 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-599797ccb9-gvj5v" podUID="3aae219f-2594-44f2-9f58-2b3149e8edbb" containerName="dnsmasq-dns" containerID="cri-o://c02ac9548211b7e04f40376bc0fa9d4c8b23cc06e015a759e3f05605a724e2a4" gracePeriod=10 Dec 02 23:17:27 crc kubenswrapper[4903]: I1202 
Dec 02 23:17:27 crc kubenswrapper[4903]: I1202 23:17:27.061936 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-599797ccb9-gvj5v"
Dec 02 23:17:27 crc kubenswrapper[4903]: I1202 23:17:27.065437 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-86bbdbcfd-94wnf" podStartSLOduration=5.065417382 podStartE2EDuration="5.065417382s" podCreationTimestamp="2025-12-02 23:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:17:27.052521279 +0000 UTC m=+1185.761075562" watchObservedRunningTime="2025-12-02 23:17:27.065417382 +0000 UTC m=+1185.773971665"
Dec 02 23:17:27 crc kubenswrapper[4903]: I1202 23:17:27.073265 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c35ad387-fed0-4cbc-9912-c17aab93860a","Type":"ContainerStarted","Data":"9719688f6d232c8c0923e688ee0723c6eec00b5c07cd9a49faf0dbd99a9564b2"}
Dec 02 23:17:27 crc kubenswrapper[4903]: I1202 23:17:27.087090 4903 generic.go:334] "Generic (PLEG): container finished" podID="6de4a117-0c91-47f4-a80d-278debb3ea60" containerID="df15c3c36b70884fde2426f34900864d205c0624b857ec4d7f83ed6fd31393ec" exitCode=1
Dec 02 23:17:27 crc kubenswrapper[4903]: I1202 23:17:27.087351 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"6de4a117-0c91-47f4-a80d-278debb3ea60","Type":"ContainerDied","Data":"df15c3c36b70884fde2426f34900864d205c0624b857ec4d7f83ed6fd31393ec"}
Dec 02 23:17:27 crc kubenswrapper[4903]: I1202 23:17:27.087675 4903 scope.go:117] "RemoveContainer" containerID="a9185c9636564edafae2cc91e3e2f22615a33bfde3b5cf115f5f262e3f5aae83"
Dec 02 23:17:27 crc kubenswrapper[4903]: I1202 23:17:27.095892 4903 scope.go:117] "RemoveContainer" containerID="df15c3c36b70884fde2426f34900864d205c0624b857ec4d7f83ed6fd31393ec"
Dec 02 23:17:27 crc kubenswrapper[4903]: E1202 23:17:27.096251 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(6de4a117-0c91-47f4-a80d-278debb3ea60)\"" pod="openstack/watcher-decision-engine-0" podUID="6de4a117-0c91-47f4-a80d-278debb3ea60"
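"back-off 10s" is the first step of kubelet's CrashLoopBackOff schedule: the restart delay starts at 10 seconds and doubles on each subsequent crash until it is capped at five minutes (upstream defaults). In sketch form:

```python
# CrashLoopBackOff delay schedule as kubelet applies it: start at 10s,
# double per failed restart, cap at 5 minutes (upstream default values).
def crashloop_delays(restarts, base=10, cap=300):
    delay = base
    for _ in range(restarts):
        yield delay
        delay = min(delay * 2, cap)

print(list(crashloop_delays(6)))   # [10, 20, 40, 80, 160, 300]
```

A container that keeps exiting with code 1, like watcher-decision-engine above, therefore reaches the 5-minute ceiling after roughly five consecutive crashes.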
Dec 02 23:17:27 crc kubenswrapper[4903]: I1202 23:17:27.103406 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-599797ccb9-gvj5v" podStartSLOduration=5.103385045 podStartE2EDuration="5.103385045s" podCreationTimestamp="2025-12-02 23:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:17:27.087450299 +0000 UTC m=+1185.796004572" watchObservedRunningTime="2025-12-02 23:17:27.103385045 +0000 UTC m=+1185.811939318"
Dec 02 23:17:27 crc kubenswrapper[4903]: I1202 23:17:27.212932 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Dec 02 23:17:27 crc kubenswrapper[4903]: I1202 23:17:27.266005 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 02 23:17:27 crc kubenswrapper[4903]: I1202 23:17:27.666029 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69a45323-9438-46ad-928b-6ce05cfe3b9d" path="/var/lib/kubelet/pods/69a45323-9438-46ad-928b-6ce05cfe3b9d/volumes"
Dec 02 23:17:27 crc kubenswrapper[4903]: I1202 23:17:27.868524 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-599797ccb9-gvj5v"
Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.012076 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3aae219f-2594-44f2-9f58-2b3149e8edbb-dns-swift-storage-0\") pod \"3aae219f-2594-44f2-9f58-2b3149e8edbb\" (UID: \"3aae219f-2594-44f2-9f58-2b3149e8edbb\") "
Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.012163 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3aae219f-2594-44f2-9f58-2b3149e8edbb-ovsdbserver-sb\") pod \"3aae219f-2594-44f2-9f58-2b3149e8edbb\" (UID: \"3aae219f-2594-44f2-9f58-2b3149e8edbb\") "
Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.012199 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3aae219f-2594-44f2-9f58-2b3149e8edbb-dns-svc\") pod \"3aae219f-2594-44f2-9f58-2b3149e8edbb\" (UID: \"3aae219f-2594-44f2-9f58-2b3149e8edbb\") "
Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.012300 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3aae219f-2594-44f2-9f58-2b3149e8edbb-config\") pod \"3aae219f-2594-44f2-9f58-2b3149e8edbb\" (UID: \"3aae219f-2594-44f2-9f58-2b3149e8edbb\") "
Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.012340 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cpkr\" (UniqueName: \"kubernetes.io/projected/3aae219f-2594-44f2-9f58-2b3149e8edbb-kube-api-access-6cpkr\") pod \"3aae219f-2594-44f2-9f58-2b3149e8edbb\" (UID: \"3aae219f-2594-44f2-9f58-2b3149e8edbb\") "
Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.012359 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3aae219f-2594-44f2-9f58-2b3149e8edbb-ovsdbserver-nb\") pod \"3aae219f-2594-44f2-9f58-2b3149e8edbb\" (UID: \"3aae219f-2594-44f2-9f58-2b3149e8edbb\") "
Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.037494 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aae219f-2594-44f2-9f58-2b3149e8edbb-kube-api-access-6cpkr" (OuterVolumeSpecName: "kube-api-access-6cpkr") pod "3aae219f-2594-44f2-9f58-2b3149e8edbb" (UID: "3aae219f-2594-44f2-9f58-2b3149e8edbb"). InnerVolumeSpecName "kube-api-access-6cpkr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.114199 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cpkr\" (UniqueName: \"kubernetes.io/projected/3aae219f-2594-44f2-9f58-2b3149e8edbb-kube-api-access-6cpkr\") on node \"crc\" DevicePath \"\""
Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.193280 4903 generic.go:334] "Generic (PLEG): container finished" podID="c9ae404e-88a5-4e23-a7f8-e2e1198cfc27" containerID="7f69c54f786868ef97487714669bb35609cd8c4bc557c408c66bff746462c71f" exitCode=0
Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.193503 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffc74bd45-2xt7m" event={"ID":"c9ae404e-88a5-4e23-a7f8-e2e1198cfc27","Type":"ContainerDied","Data":"7f69c54f786868ef97487714669bb35609cd8c4bc557c408c66bff746462c71f"}
Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.208357 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a0cf0be-7840-4684-ada4-16ea2d6351f3","Type":"ContainerStarted","Data":"b2956aa78e92818d64de5d67f0c0517f302aaf63f55ca8762f89d1a5d077363b"}
Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.217488 4903 generic.go:334] "Generic (PLEG): container finished" podID="3aae219f-2594-44f2-9f58-2b3149e8edbb" containerID="c02ac9548211b7e04f40376bc0fa9d4c8b23cc06e015a759e3f05605a724e2a4" exitCode=0
Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.217557 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-599797ccb9-gvj5v" event={"ID":"3aae219f-2594-44f2-9f58-2b3149e8edbb","Type":"ContainerDied","Data":"c02ac9548211b7e04f40376bc0fa9d4c8b23cc06e015a759e3f05605a724e2a4"}
Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.217585 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-599797ccb9-gvj5v" event={"ID":"3aae219f-2594-44f2-9f58-2b3149e8edbb","Type":"ContainerDied","Data":"03f753b0f7eb11325b001e25fb351f70bc85e5e663bd9f94e6f8215b70a4fee1"}
Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.217601 4903 scope.go:117] "RemoveContainer" containerID="c02ac9548211b7e04f40376bc0fa9d4c8b23cc06e015a759e3f05605a724e2a4"
Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.217937 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-599797ccb9-gvj5v"
Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.221817 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cf18444a-84f7-4d6a-85ac-7b0a75776ebc","Type":"ContainerStarted","Data":"781986d9e1b57fac36ed65f5e48411a57596f59d79920da9750ddaffb2dea1a9"}
Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.224157 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7fbb83df-23bf-40a5-a3a7-ceafac9d783e","Type":"ContainerStarted","Data":"498d547e8f4f7c23385adae389a1612dbbcf63f2702330f41cab110b10b282d5"}
Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.226699 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65cf4c8457-6ff7v" event={"ID":"6ea83627-fed8-458c-a39b-f73e682799d3","Type":"ContainerStarted","Data":"3b40d634c8616d2cd1b8dd15caf4dbd3cbe4d93e23ec8a54cbae35a6934ebf70"}
Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.235239 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-9559fbfd6-k4fwk" event={"ID":"c180d7c5-ad61-4190-b709-6efe6a9a2434","Type":"ContainerStarted","Data":"1acb4a8a8c9422c7aba9b504cdb3dc09e76c91cc2b78258f66be0a2b7d325abd"}
Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.253460 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-77d59fbfc6-p8t8m"
Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.259112 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-65cf4c8457-6ff7v" podStartSLOduration=4.05651055 podStartE2EDuration="7.259086045s" podCreationTimestamp="2025-12-02 23:17:21 +0000 UTC" firstStartedPulling="2025-12-02 23:17:23.061845623 +0000 UTC m=+1181.770399906" lastFinishedPulling="2025-12-02 23:17:26.264421118 +0000 UTC m=+1184.972975401" observedRunningTime="2025-12-02 23:17:28.245753232 +0000 UTC m=+1186.954307515" watchObservedRunningTime="2025-12-02 23:17:28.259086045 +0000 UTC m=+1186.967640328"
Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.296248 4903 scope.go:117] "RemoveContainer" containerID="07f1231c8c6eb86f8ebc21af7dcee27778c48877bc0b2d6b5eb2ba30f737ba44"
Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.354039 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-9559fbfd6-k4fwk" podStartSLOduration=4.229073695 podStartE2EDuration="7.354008995s" podCreationTimestamp="2025-12-02 23:17:21 +0000 UTC" firstStartedPulling="2025-12-02 23:17:23.110168909 +0000 UTC m=+1181.818723182" lastFinishedPulling="2025-12-02 23:17:26.235104209 +0000 UTC m=+1184.943658482" observedRunningTime="2025-12-02 23:17:28.302050574 +0000 UTC m=+1187.010604857" watchObservedRunningTime="2025-12-02 23:17:28.354008995 +0000 UTC m=+1187.062563278"
containerID="c02ac9548211b7e04f40376bc0fa9d4c8b23cc06e015a759e3f05605a724e2a4" Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.439679 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c02ac9548211b7e04f40376bc0fa9d4c8b23cc06e015a759e3f05605a724e2a4"} err="failed to get container status \"c02ac9548211b7e04f40376bc0fa9d4c8b23cc06e015a759e3f05605a724e2a4\": rpc error: code = NotFound desc = could not find container \"c02ac9548211b7e04f40376bc0fa9d4c8b23cc06e015a759e3f05605a724e2a4\": container with ID starting with c02ac9548211b7e04f40376bc0fa9d4c8b23cc06e015a759e3f05605a724e2a4 not found: ID does not exist" Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.439704 4903 scope.go:117] "RemoveContainer" containerID="07f1231c8c6eb86f8ebc21af7dcee27778c48877bc0b2d6b5eb2ba30f737ba44" Dec 02 23:17:28 crc kubenswrapper[4903]: E1202 23:17:28.444054 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07f1231c8c6eb86f8ebc21af7dcee27778c48877bc0b2d6b5eb2ba30f737ba44\": container with ID starting with 07f1231c8c6eb86f8ebc21af7dcee27778c48877bc0b2d6b5eb2ba30f737ba44 not found: ID does not exist" containerID="07f1231c8c6eb86f8ebc21af7dcee27778c48877bc0b2d6b5eb2ba30f737ba44" Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.444098 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07f1231c8c6eb86f8ebc21af7dcee27778c48877bc0b2d6b5eb2ba30f737ba44"} err="failed to get container status \"07f1231c8c6eb86f8ebc21af7dcee27778c48877bc0b2d6b5eb2ba30f737ba44\": rpc error: code = NotFound desc = could not find container \"07f1231c8c6eb86f8ebc21af7dcee27778c48877bc0b2d6b5eb2ba30f737ba44\": container with ID starting with 07f1231c8c6eb86f8ebc21af7dcee27778c48877bc0b2d6b5eb2ba30f737ba44 not found: ID does not exist" Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.569880 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3aae219f-2594-44f2-9f58-2b3149e8edbb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3aae219f-2594-44f2-9f58-2b3149e8edbb" (UID: "3aae219f-2594-44f2-9f58-2b3149e8edbb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.635934 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3aae219f-2594-44f2-9f58-2b3149e8edbb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.728951 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3aae219f-2594-44f2-9f58-2b3149e8edbb-config" (OuterVolumeSpecName: "config") pod "3aae219f-2594-44f2-9f58-2b3149e8edbb" (UID: "3aae219f-2594-44f2-9f58-2b3149e8edbb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.740946 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3aae219f-2594-44f2-9f58-2b3149e8edbb-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.753356 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3aae219f-2594-44f2-9f58-2b3149e8edbb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3aae219f-2594-44f2-9f58-2b3149e8edbb" (UID: "3aae219f-2594-44f2-9f58-2b3149e8edbb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.810005 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3aae219f-2594-44f2-9f58-2b3149e8edbb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3aae219f-2594-44f2-9f58-2b3149e8edbb" (UID: "3aae219f-2594-44f2-9f58-2b3149e8edbb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.821237 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3aae219f-2594-44f2-9f58-2b3149e8edbb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3aae219f-2594-44f2-9f58-2b3149e8edbb" (UID: "3aae219f-2594-44f2-9f58-2b3149e8edbb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.844270 4903 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3aae219f-2594-44f2-9f58-2b3149e8edbb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.844309 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3aae219f-2594-44f2-9f58-2b3149e8edbb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.844321 4903 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3aae219f-2594-44f2-9f58-2b3149e8edbb-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.966295 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.966459 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 02 23:17:28 crc kubenswrapper[4903]: I1202 23:17:28.967107 4903 scope.go:117] "RemoveContainer" containerID="df15c3c36b70884fde2426f34900864d205c0624b857ec4d7f83ed6fd31393ec" Dec 02 23:17:28 crc kubenswrapper[4903]: E1202 23:17:28.967378 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(6de4a117-0c91-47f4-a80d-278debb3ea60)\"" pod="openstack/watcher-decision-engine-0" podUID="6de4a117-0c91-47f4-a80d-278debb3ea60" Dec 02 23:17:29 crc kubenswrapper[4903]: I1202 23:17:29.190792 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-599797ccb9-gvj5v"] Dec 02 23:17:29 crc kubenswrapper[4903]: I1202 23:17:29.201409 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-599797ccb9-gvj5v"] Dec 02 23:17:29 crc kubenswrapper[4903]: I1202 23:17:29.247044 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65cf4c8457-6ff7v" event={"ID":"6ea83627-fed8-458c-a39b-f73e682799d3","Type":"ContainerStarted","Data":"2a26394a33f18e9acd3306e15f2b14f123eed1667c79bf73a058bb53f7ca12f9"} Dec 02 23:17:29 crc kubenswrapper[4903]: I1202 23:17:29.249479 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-9559fbfd6-k4fwk" event={"ID":"c180d7c5-ad61-4190-b709-6efe6a9a2434","Type":"ContainerStarted","Data":"817ddb40cc6f9b2e25f952282c482879aa546cc111ce2a022358571a450a5b32"} Dec 02 23:17:29 crc kubenswrapper[4903]: I1202 23:17:29.251769 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a0cf0be-7840-4684-ada4-16ea2d6351f3","Type":"ContainerStarted","Data":"603ea21ba5e80b2cea12022574c1b146204de9c1865f910639462d68e5c880f5"} Dec 02 23:17:29 crc kubenswrapper[4903]: I1202 23:17:29.529794 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Dec 02 23:17:29 crc kubenswrapper[4903]: I1202 23:17:29.622726 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3aae219f-2594-44f2-9f58-2b3149e8edbb" path="/var/lib/kubelet/pods/3aae219f-2594-44f2-9f58-2b3149e8edbb/volumes" Dec 02 23:17:30 crc kubenswrapper[4903]: I1202 23:17:30.057384 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 23:17:30 crc kubenswrapper[4903]: I1202 23:17:30.132915 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 23:17:30 crc kubenswrapper[4903]: I1202 23:17:30.986098 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-c5c474cdf-n98r7" Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.091056 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-56f9c8dcd5-hbd9l" Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.104910 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18dbf052-03f5-4a2c-a8a7-86740787c1dc-config-data\") pod \"18dbf052-03f5-4a2c-a8a7-86740787c1dc\" (UID: \"18dbf052-03f5-4a2c-a8a7-86740787c1dc\") " Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.104963 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18dbf052-03f5-4a2c-a8a7-86740787c1dc-logs\") pod \"18dbf052-03f5-4a2c-a8a7-86740787c1dc\" (UID: \"18dbf052-03f5-4a2c-a8a7-86740787c1dc\") " Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.105012 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/18dbf052-03f5-4a2c-a8a7-86740787c1dc-horizon-secret-key\") pod \"18dbf052-03f5-4a2c-a8a7-86740787c1dc\" (UID: \"18dbf052-03f5-4a2c-a8a7-86740787c1dc\") " Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.105139 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chsbq\" (UniqueName: \"kubernetes.io/projected/18dbf052-03f5-4a2c-a8a7-86740787c1dc-kube-api-access-chsbq\") pod \"18dbf052-03f5-4a2c-a8a7-86740787c1dc\" (UID: \"18dbf052-03f5-4a2c-a8a7-86740787c1dc\") " Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.105177 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18dbf052-03f5-4a2c-a8a7-86740787c1dc-scripts\") pod \"18dbf052-03f5-4a2c-a8a7-86740787c1dc\" (UID: \"18dbf052-03f5-4a2c-a8a7-86740787c1dc\") " Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.106202 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18dbf052-03f5-4a2c-a8a7-86740787c1dc-logs" (OuterVolumeSpecName: "logs") pod "18dbf052-03f5-4a2c-a8a7-86740787c1dc" (UID: "18dbf052-03f5-4a2c-a8a7-86740787c1dc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.137829 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18dbf052-03f5-4a2c-a8a7-86740787c1dc-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "18dbf052-03f5-4a2c-a8a7-86740787c1dc" (UID: "18dbf052-03f5-4a2c-a8a7-86740787c1dc"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.162375 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-77d59fbfc6-p8t8m"] Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.162586 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-77d59fbfc6-p8t8m" podUID="b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8" containerName="neutron-api" containerID="cri-o://a436f02befb85b1592a3221ab644954bf4f4e6026430927df464c7bfa7562de5" gracePeriod=30 Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.162973 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-77d59fbfc6-p8t8m" podUID="b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8" containerName="neutron-httpd" containerID="cri-o://759eb993b2e2888d698736bc62374d3e38fede53838e4781051f8547ae55ceca" gracePeriod=30 Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.170531 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18dbf052-03f5-4a2c-a8a7-86740787c1dc-kube-api-access-chsbq" (OuterVolumeSpecName: "kube-api-access-chsbq") pod "18dbf052-03f5-4a2c-a8a7-86740787c1dc" (UID: "18dbf052-03f5-4a2c-a8a7-86740787c1dc"). InnerVolumeSpecName "kube-api-access-chsbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.203337 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18dbf052-03f5-4a2c-a8a7-86740787c1dc-config-data" (OuterVolumeSpecName: "config-data") pod "18dbf052-03f5-4a2c-a8a7-86740787c1dc" (UID: "18dbf052-03f5-4a2c-a8a7-86740787c1dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.204000 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18dbf052-03f5-4a2c-a8a7-86740787c1dc-scripts" (OuterVolumeSpecName: "scripts") pod "18dbf052-03f5-4a2c-a8a7-86740787c1dc" (UID: "18dbf052-03f5-4a2c-a8a7-86740787c1dc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.207675 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18dbf052-03f5-4a2c-a8a7-86740787c1dc-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.207704 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18dbf052-03f5-4a2c-a8a7-86740787c1dc-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.207713 4903 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/18dbf052-03f5-4a2c-a8a7-86740787c1dc-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.207721 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chsbq\" (UniqueName: \"kubernetes.io/projected/18dbf052-03f5-4a2c-a8a7-86740787c1dc-kube-api-access-chsbq\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.207731 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18dbf052-03f5-4a2c-a8a7-86740787c1dc-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.345635 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7a5779cf-3693-4b42-8726-363b548c4071","Type":"ContainerStarted","Data":"6ea7b5cebcb60e59811afcaf169213334e5eb7a1a660ce946ddbef4b2b375bd1"} Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.349209 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cf18444a-84f7-4d6a-85ac-7b0a75776ebc","Type":"ContainerStarted","Data":"8b12b12b802caa2d79cb3d0053c5ca1d25e59074f9ceff0c4035111c88e1cb1b"} Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.352166 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7fbb83df-23bf-40a5-a3a7-ceafac9d783e","Type":"ContainerStarted","Data":"798b731959d138fbe3024526ec7e7e2d5c70540c2ce19e5d886f8be95441cfe1"} Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.354161 4903 generic.go:334] "Generic (PLEG): container finished" podID="18dbf052-03f5-4a2c-a8a7-86740787c1dc" containerID="07b1ce671b9a16281ea2162ffa9fd441a6948a1ceda4dc0ac09f36db36ad2e28" exitCode=137 Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.354185 4903 generic.go:334] "Generic (PLEG): container finished" podID="18dbf052-03f5-4a2c-a8a7-86740787c1dc" containerID="ed8dfc1142c8a2f2878e51da66232e57612530c7a110c040c33b4747de93d8cc" exitCode=137 Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.354215 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c5c474cdf-n98r7" event={"ID":"18dbf052-03f5-4a2c-a8a7-86740787c1dc","Type":"ContainerDied","Data":"07b1ce671b9a16281ea2162ffa9fd441a6948a1ceda4dc0ac09f36db36ad2e28"} Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.354231 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c5c474cdf-n98r7" event={"ID":"18dbf052-03f5-4a2c-a8a7-86740787c1dc","Type":"ContainerDied","Data":"ed8dfc1142c8a2f2878e51da66232e57612530c7a110c040c33b4747de93d8cc"} Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.354241 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-c5c474cdf-n98r7" event={"ID":"18dbf052-03f5-4a2c-a8a7-86740787c1dc","Type":"ContainerDied","Data":"d35771fc60f5cbd68b6e15f21b8427c2dee9c78810c8e67bbee142c6e56f8e3b"} Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.354254 4903 scope.go:117] "RemoveContainer" containerID="07b1ce671b9a16281ea2162ffa9fd441a6948a1ceda4dc0ac09f36db36ad2e28" Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.354371 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-c5c474cdf-n98r7" Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.385968 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c35ad387-fed0-4cbc-9912-c17aab93860a","Type":"ContainerStarted","Data":"409181bfb5f67d8323510af2392b3d2db0db6d349695fbf7b79a36e6587af2c5"} Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.416386 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-c5c474cdf-n98r7"] Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.419021 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffc74bd45-2xt7m" event={"ID":"c9ae404e-88a5-4e23-a7f8-e2e1198cfc27","Type":"ContainerStarted","Data":"340273657b53c6b856174a93cb9e8df1e43d62f5d0e0ffc49bf4345b3a134ccc"} Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.420369 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6ffc74bd45-2xt7m" Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.426444 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-c5c474cdf-n98r7"] Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.428101 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a0cf0be-7840-4684-ada4-16ea2d6351f3","Type":"ContainerStarted","Data":"2565f28b6d412237a204f7557b2e587c0e20d98299a180ec1bb42b6b1a38f5b7"} Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.438846 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6ffc74bd45-2xt7m" podStartSLOduration=7.438800061 podStartE2EDuration="7.438800061s" podCreationTimestamp="2025-12-02 23:17:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:17:31.435841372 +0000 UTC m=+1190.144395645" watchObservedRunningTime="2025-12-02 23:17:31.438800061 +0000 UTC m=+1190.147354344" Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.720716 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18dbf052-03f5-4a2c-a8a7-86740787c1dc" path="/var/lib/kubelet/pods/18dbf052-03f5-4a2c-a8a7-86740787c1dc/volumes" Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.720815 4903 scope.go:117] "RemoveContainer" containerID="ed8dfc1142c8a2f2878e51da66232e57612530c7a110c040c33b4747de93d8cc" Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.813372 4903 scope.go:117] "RemoveContainer" containerID="07b1ce671b9a16281ea2162ffa9fd441a6948a1ceda4dc0ac09f36db36ad2e28" Dec 02 23:17:31 crc kubenswrapper[4903]: E1202 23:17:31.815816 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07b1ce671b9a16281ea2162ffa9fd441a6948a1ceda4dc0ac09f36db36ad2e28\": container with ID starting with 07b1ce671b9a16281ea2162ffa9fd441a6948a1ceda4dc0ac09f36db36ad2e28 not found: ID does not exist" 
containerID="07b1ce671b9a16281ea2162ffa9fd441a6948a1ceda4dc0ac09f36db36ad2e28" Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.815851 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07b1ce671b9a16281ea2162ffa9fd441a6948a1ceda4dc0ac09f36db36ad2e28"} err="failed to get container status \"07b1ce671b9a16281ea2162ffa9fd441a6948a1ceda4dc0ac09f36db36ad2e28\": rpc error: code = NotFound desc = could not find container \"07b1ce671b9a16281ea2162ffa9fd441a6948a1ceda4dc0ac09f36db36ad2e28\": container with ID starting with 07b1ce671b9a16281ea2162ffa9fd441a6948a1ceda4dc0ac09f36db36ad2e28 not found: ID does not exist" Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.815872 4903 scope.go:117] "RemoveContainer" containerID="ed8dfc1142c8a2f2878e51da66232e57612530c7a110c040c33b4747de93d8cc" Dec 02 23:17:31 crc kubenswrapper[4903]: E1202 23:17:31.817670 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed8dfc1142c8a2f2878e51da66232e57612530c7a110c040c33b4747de93d8cc\": container with ID starting with ed8dfc1142c8a2f2878e51da66232e57612530c7a110c040c33b4747de93d8cc not found: ID does not exist" containerID="ed8dfc1142c8a2f2878e51da66232e57612530c7a110c040c33b4747de93d8cc" Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.817699 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed8dfc1142c8a2f2878e51da66232e57612530c7a110c040c33b4747de93d8cc"} err="failed to get container status \"ed8dfc1142c8a2f2878e51da66232e57612530c7a110c040c33b4747de93d8cc\": rpc error: code = NotFound desc = could not find container \"ed8dfc1142c8a2f2878e51da66232e57612530c7a110c040c33b4747de93d8cc\": container with ID starting with ed8dfc1142c8a2f2878e51da66232e57612530c7a110c040c33b4747de93d8cc not found: ID does not exist" Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.817714 4903 scope.go:117] "RemoveContainer" containerID="07b1ce671b9a16281ea2162ffa9fd441a6948a1ceda4dc0ac09f36db36ad2e28" Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.820477 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07b1ce671b9a16281ea2162ffa9fd441a6948a1ceda4dc0ac09f36db36ad2e28"} err="failed to get container status \"07b1ce671b9a16281ea2162ffa9fd441a6948a1ceda4dc0ac09f36db36ad2e28\": rpc error: code = NotFound desc = could not find container \"07b1ce671b9a16281ea2162ffa9fd441a6948a1ceda4dc0ac09f36db36ad2e28\": container with ID starting with 07b1ce671b9a16281ea2162ffa9fd441a6948a1ceda4dc0ac09f36db36ad2e28 not found: ID does not exist" Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.820500 4903 scope.go:117] "RemoveContainer" containerID="ed8dfc1142c8a2f2878e51da66232e57612530c7a110c040c33b4747de93d8cc" Dec 02 23:17:31 crc kubenswrapper[4903]: I1202 23:17:31.822753 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed8dfc1142c8a2f2878e51da66232e57612530c7a110c040c33b4747de93d8cc"} err="failed to get container status \"ed8dfc1142c8a2f2878e51da66232e57612530c7a110c040c33b4747de93d8cc\": rpc error: code = NotFound desc = could not find container \"ed8dfc1142c8a2f2878e51da66232e57612530c7a110c040c33b4747de93d8cc\": container with ID starting with ed8dfc1142c8a2f2878e51da66232e57612530c7a110c040c33b4747de93d8cc not found: ID does not exist" Dec 02 23:17:32 crc kubenswrapper[4903]: I1202 23:17:32.156974 4903 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/barbican-api-86bbdbcfd-94wnf" Dec 02 23:17:32 crc kubenswrapper[4903]: I1202 23:17:32.213937 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Dec 02 23:17:32 crc kubenswrapper[4903]: I1202 23:17:32.270366 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Dec 02 23:17:32 crc kubenswrapper[4903]: I1202 23:17:32.335232 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 02 23:17:32 crc kubenswrapper[4903]: I1202 23:17:32.442502 4903 generic.go:334] "Generic (PLEG): container finished" podID="b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8" containerID="759eb993b2e2888d698736bc62374d3e38fede53838e4781051f8547ae55ceca" exitCode=0 Dec 02 23:17:32 crc kubenswrapper[4903]: I1202 23:17:32.442568 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77d59fbfc6-p8t8m" event={"ID":"b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8","Type":"ContainerDied","Data":"759eb993b2e2888d698736bc62374d3e38fede53838e4781051f8547ae55ceca"} Dec 02 23:17:32 crc kubenswrapper[4903]: I1202 23:17:32.463561 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7fbb83df-23bf-40a5-a3a7-ceafac9d783e","Type":"ContainerStarted","Data":"5ce388d22f122f96271a834fea4f2f745997e7787350db009cbfa8d761d649b7"} Dec 02 23:17:32 crc kubenswrapper[4903]: I1202 23:17:32.463718 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7fbb83df-23bf-40a5-a3a7-ceafac9d783e" containerName="glance-log" containerID="cri-o://798b731959d138fbe3024526ec7e7e2d5c70540c2ce19e5d886f8be95441cfe1" gracePeriod=30 Dec 02 23:17:32 crc kubenswrapper[4903]: I1202 23:17:32.464063 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7fbb83df-23bf-40a5-a3a7-ceafac9d783e" containerName="glance-httpd" containerID="cri-o://5ce388d22f122f96271a834fea4f2f745997e7787350db009cbfa8d761d649b7" gracePeriod=30 Dec 02 23:17:32 crc kubenswrapper[4903]: I1202 23:17:32.482998 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c35ad387-fed0-4cbc-9912-c17aab93860a","Type":"ContainerStarted","Data":"b9b50eee02969239c4bea241d7e0e370e59d7dc985a050a93d269a3ea3f29d20"} Dec 02 23:17:32 crc kubenswrapper[4903]: I1202 23:17:32.492812 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.492796962 podStartE2EDuration="8.492796962s" podCreationTimestamp="2025-12-02 23:17:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:17:32.492586197 +0000 UTC m=+1191.201140480" watchObservedRunningTime="2025-12-02 23:17:32.492796962 +0000 UTC m=+1191.201351245" Dec 02 23:17:32 crc kubenswrapper[4903]: I1202 23:17:32.493285 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7a5779cf-3693-4b42-8726-363b548c4071","Type":"ContainerStarted","Data":"b5f68f7307471159e36d94d4a66f49537890f021cbae8438b8573b5666321df3"} Dec 02 23:17:32 crc kubenswrapper[4903]: I1202 23:17:32.494055 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 02 23:17:32 crc kubenswrapper[4903]: I1202 
23:17:32.498585 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="cf18444a-84f7-4d6a-85ac-7b0a75776ebc" containerName="glance-log" containerID="cri-o://8b12b12b802caa2d79cb3d0053c5ca1d25e59074f9ceff0c4035111c88e1cb1b" gracePeriod=30 Dec 02 23:17:32 crc kubenswrapper[4903]: I1202 23:17:32.498887 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cf18444a-84f7-4d6a-85ac-7b0a75776ebc","Type":"ContainerStarted","Data":"71769cbebd3e4e54201fd9fecc6bf20aee89bf6756971496d1a112d213d4291e"} Dec 02 23:17:32 crc kubenswrapper[4903]: I1202 23:17:32.499148 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="cf18444a-84f7-4d6a-85ac-7b0a75776ebc" containerName="glance-httpd" containerID="cri-o://71769cbebd3e4e54201fd9fecc6bf20aee89bf6756971496d1a112d213d4291e" gracePeriod=30 Dec 02 23:17:32 crc kubenswrapper[4903]: I1202 23:17:32.546095 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=8.017993193 podStartE2EDuration="8.546080384s" podCreationTimestamp="2025-12-02 23:17:24 +0000 UTC" firstStartedPulling="2025-12-02 23:17:26.998695374 +0000 UTC m=+1185.707249657" lastFinishedPulling="2025-12-02 23:17:27.526782565 +0000 UTC m=+1186.235336848" observedRunningTime="2025-12-02 23:17:32.52080216 +0000 UTC m=+1191.229356433" watchObservedRunningTime="2025-12-02 23:17:32.546080384 +0000 UTC m=+1191.254634667" Dec 02 23:17:32 crc kubenswrapper[4903]: I1202 23:17:32.575996 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Dec 02 23:17:32 crc kubenswrapper[4903]: I1202 23:17:32.588549 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=8.588531741 podStartE2EDuration="8.588531741s" podCreationTimestamp="2025-12-02 23:17:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:17:32.576124891 +0000 UTC m=+1191.284679164" watchObservedRunningTime="2025-12-02 23:17:32.588531741 +0000 UTC m=+1191.297086024" Dec 02 23:17:32 crc kubenswrapper[4903]: I1202 23:17:32.607902 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.607885246 podStartE2EDuration="8.607885246s" podCreationTimestamp="2025-12-02 23:17:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:17:32.606456513 +0000 UTC m=+1191.315010796" watchObservedRunningTime="2025-12-02 23:17:32.607885246 +0000 UTC m=+1191.316439529" Dec 02 23:17:32 crc kubenswrapper[4903]: I1202 23:17:32.954148 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-78cfc5fdf8-p9576"] Dec 02 23:17:32 crc kubenswrapper[4903]: E1202 23:17:32.954812 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aae219f-2594-44f2-9f58-2b3149e8edbb" containerName="dnsmasq-dns" Dec 02 23:17:32 crc kubenswrapper[4903]: I1202 23:17:32.954829 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aae219f-2594-44f2-9f58-2b3149e8edbb" containerName="dnsmasq-dns" Dec 02 23:17:32 crc kubenswrapper[4903]: E1202 23:17:32.954852 4903 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="3aae219f-2594-44f2-9f58-2b3149e8edbb" containerName="init" Dec 02 23:17:32 crc kubenswrapper[4903]: I1202 23:17:32.954858 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aae219f-2594-44f2-9f58-2b3149e8edbb" containerName="init" Dec 02 23:17:32 crc kubenswrapper[4903]: E1202 23:17:32.954880 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18dbf052-03f5-4a2c-a8a7-86740787c1dc" containerName="horizon" Dec 02 23:17:32 crc kubenswrapper[4903]: I1202 23:17:32.954887 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="18dbf052-03f5-4a2c-a8a7-86740787c1dc" containerName="horizon" Dec 02 23:17:32 crc kubenswrapper[4903]: E1202 23:17:32.954898 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18dbf052-03f5-4a2c-a8a7-86740787c1dc" containerName="horizon-log" Dec 02 23:17:32 crc kubenswrapper[4903]: I1202 23:17:32.954905 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="18dbf052-03f5-4a2c-a8a7-86740787c1dc" containerName="horizon-log" Dec 02 23:17:32 crc kubenswrapper[4903]: I1202 23:17:32.955075 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="18dbf052-03f5-4a2c-a8a7-86740787c1dc" containerName="horizon" Dec 02 23:17:32 crc kubenswrapper[4903]: I1202 23:17:32.955091 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="18dbf052-03f5-4a2c-a8a7-86740787c1dc" containerName="horizon-log" Dec 02 23:17:32 crc kubenswrapper[4903]: I1202 23:17:32.955109 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="3aae219f-2594-44f2-9f58-2b3149e8edbb" containerName="dnsmasq-dns" Dec 02 23:17:32 crc kubenswrapper[4903]: I1202 23:17:32.956096 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-78cfc5fdf8-p9576" Dec 02 23:17:32 crc kubenswrapper[4903]: I1202 23:17:32.957932 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 02 23:17:32 crc kubenswrapper[4903]: I1202 23:17:32.960401 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 02 23:17:32 crc kubenswrapper[4903]: I1202 23:17:32.971092 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-78cfc5fdf8-p9576"] Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.055605 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a80c66a-4cfd-44a2-a5e4-5a9297e63f29-public-tls-certs\") pod \"barbican-api-78cfc5fdf8-p9576\" (UID: \"1a80c66a-4cfd-44a2-a5e4-5a9297e63f29\") " pod="openstack/barbican-api-78cfc5fdf8-p9576" Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.055818 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a80c66a-4cfd-44a2-a5e4-5a9297e63f29-internal-tls-certs\") pod \"barbican-api-78cfc5fdf8-p9576\" (UID: \"1a80c66a-4cfd-44a2-a5e4-5a9297e63f29\") " pod="openstack/barbican-api-78cfc5fdf8-p9576" Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.055909 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a80c66a-4cfd-44a2-a5e4-5a9297e63f29-config-data\") pod \"barbican-api-78cfc5fdf8-p9576\" (UID: \"1a80c66a-4cfd-44a2-a5e4-5a9297e63f29\") " pod="openstack/barbican-api-78cfc5fdf8-p9576" Dec 02 
23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.056003 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a80c66a-4cfd-44a2-a5e4-5a9297e63f29-config-data-custom\") pod \"barbican-api-78cfc5fdf8-p9576\" (UID: \"1a80c66a-4cfd-44a2-a5e4-5a9297e63f29\") " pod="openstack/barbican-api-78cfc5fdf8-p9576" Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.056087 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a80c66a-4cfd-44a2-a5e4-5a9297e63f29-combined-ca-bundle\") pod \"barbican-api-78cfc5fdf8-p9576\" (UID: \"1a80c66a-4cfd-44a2-a5e4-5a9297e63f29\") " pod="openstack/barbican-api-78cfc5fdf8-p9576" Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.056216 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a80c66a-4cfd-44a2-a5e4-5a9297e63f29-logs\") pod \"barbican-api-78cfc5fdf8-p9576\" (UID: \"1a80c66a-4cfd-44a2-a5e4-5a9297e63f29\") " pod="openstack/barbican-api-78cfc5fdf8-p9576" Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.056287 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmflc\" (UniqueName: \"kubernetes.io/projected/1a80c66a-4cfd-44a2-a5e4-5a9297e63f29-kube-api-access-bmflc\") pod \"barbican-api-78cfc5fdf8-p9576\" (UID: \"1a80c66a-4cfd-44a2-a5e4-5a9297e63f29\") " pod="openstack/barbican-api-78cfc5fdf8-p9576" Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.158060 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a80c66a-4cfd-44a2-a5e4-5a9297e63f29-logs\") pod \"barbican-api-78cfc5fdf8-p9576\" (UID: \"1a80c66a-4cfd-44a2-a5e4-5a9297e63f29\") " pod="openstack/barbican-api-78cfc5fdf8-p9576" Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.158114 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmflc\" (UniqueName: \"kubernetes.io/projected/1a80c66a-4cfd-44a2-a5e4-5a9297e63f29-kube-api-access-bmflc\") pod \"barbican-api-78cfc5fdf8-p9576\" (UID: \"1a80c66a-4cfd-44a2-a5e4-5a9297e63f29\") " pod="openstack/barbican-api-78cfc5fdf8-p9576" Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.158167 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a80c66a-4cfd-44a2-a5e4-5a9297e63f29-public-tls-certs\") pod \"barbican-api-78cfc5fdf8-p9576\" (UID: \"1a80c66a-4cfd-44a2-a5e4-5a9297e63f29\") " pod="openstack/barbican-api-78cfc5fdf8-p9576" Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.158183 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a80c66a-4cfd-44a2-a5e4-5a9297e63f29-internal-tls-certs\") pod \"barbican-api-78cfc5fdf8-p9576\" (UID: \"1a80c66a-4cfd-44a2-a5e4-5a9297e63f29\") " pod="openstack/barbican-api-78cfc5fdf8-p9576" Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.158199 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a80c66a-4cfd-44a2-a5e4-5a9297e63f29-config-data\") pod \"barbican-api-78cfc5fdf8-p9576\" (UID: \"1a80c66a-4cfd-44a2-a5e4-5a9297e63f29\") " 
pod="openstack/barbican-api-78cfc5fdf8-p9576" Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.158239 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a80c66a-4cfd-44a2-a5e4-5a9297e63f29-config-data-custom\") pod \"barbican-api-78cfc5fdf8-p9576\" (UID: \"1a80c66a-4cfd-44a2-a5e4-5a9297e63f29\") " pod="openstack/barbican-api-78cfc5fdf8-p9576" Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.158265 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a80c66a-4cfd-44a2-a5e4-5a9297e63f29-combined-ca-bundle\") pod \"barbican-api-78cfc5fdf8-p9576\" (UID: \"1a80c66a-4cfd-44a2-a5e4-5a9297e63f29\") " pod="openstack/barbican-api-78cfc5fdf8-p9576" Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.160976 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a80c66a-4cfd-44a2-a5e4-5a9297e63f29-logs\") pod \"barbican-api-78cfc5fdf8-p9576\" (UID: \"1a80c66a-4cfd-44a2-a5e4-5a9297e63f29\") " pod="openstack/barbican-api-78cfc5fdf8-p9576" Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.164545 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a80c66a-4cfd-44a2-a5e4-5a9297e63f29-config-data\") pod \"barbican-api-78cfc5fdf8-p9576\" (UID: \"1a80c66a-4cfd-44a2-a5e4-5a9297e63f29\") " pod="openstack/barbican-api-78cfc5fdf8-p9576" Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.170324 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a80c66a-4cfd-44a2-a5e4-5a9297e63f29-combined-ca-bundle\") pod \"barbican-api-78cfc5fdf8-p9576\" (UID: \"1a80c66a-4cfd-44a2-a5e4-5a9297e63f29\") " pod="openstack/barbican-api-78cfc5fdf8-p9576" Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.170391 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a80c66a-4cfd-44a2-a5e4-5a9297e63f29-public-tls-certs\") pod \"barbican-api-78cfc5fdf8-p9576\" (UID: \"1a80c66a-4cfd-44a2-a5e4-5a9297e63f29\") " pod="openstack/barbican-api-78cfc5fdf8-p9576" Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.170923 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a80c66a-4cfd-44a2-a5e4-5a9297e63f29-config-data-custom\") pod \"barbican-api-78cfc5fdf8-p9576\" (UID: \"1a80c66a-4cfd-44a2-a5e4-5a9297e63f29\") " pod="openstack/barbican-api-78cfc5fdf8-p9576" Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.174217 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a80c66a-4cfd-44a2-a5e4-5a9297e63f29-internal-tls-certs\") pod \"barbican-api-78cfc5fdf8-p9576\" (UID: \"1a80c66a-4cfd-44a2-a5e4-5a9297e63f29\") " pod="openstack/barbican-api-78cfc5fdf8-p9576" Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.179389 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmflc\" (UniqueName: \"kubernetes.io/projected/1a80c66a-4cfd-44a2-a5e4-5a9297e63f29-kube-api-access-bmflc\") pod \"barbican-api-78cfc5fdf8-p9576\" (UID: \"1a80c66a-4cfd-44a2-a5e4-5a9297e63f29\") " pod="openstack/barbican-api-78cfc5fdf8-p9576" Dec 02 23:17:33 crc 
kubenswrapper[4903]: I1202 23:17:33.284273 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-78cfc5fdf8-p9576" Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.624934 4903 generic.go:334] "Generic (PLEG): container finished" podID="cf18444a-84f7-4d6a-85ac-7b0a75776ebc" containerID="71769cbebd3e4e54201fd9fecc6bf20aee89bf6756971496d1a112d213d4291e" exitCode=0 Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.624966 4903 generic.go:334] "Generic (PLEG): container finished" podID="cf18444a-84f7-4d6a-85ac-7b0a75776ebc" containerID="8b12b12b802caa2d79cb3d0053c5ca1d25e59074f9ceff0c4035111c88e1cb1b" exitCode=143 Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.696049 4903 generic.go:334] "Generic (PLEG): container finished" podID="7fbb83df-23bf-40a5-a3a7-ceafac9d783e" containerID="5ce388d22f122f96271a834fea4f2f745997e7787350db009cbfa8d761d649b7" exitCode=0 Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.696091 4903 generic.go:334] "Generic (PLEG): container finished" podID="7fbb83df-23bf-40a5-a3a7-ceafac9d783e" containerID="798b731959d138fbe3024526ec7e7e2d5c70540c2ce19e5d886f8be95441cfe1" exitCode=143 Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.696247 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="7a5779cf-3693-4b42-8726-363b548c4071" containerName="cinder-api-log" containerID="cri-o://6ea7b5cebcb60e59811afcaf169213334e5eb7a1a660ce946ddbef4b2b375bd1" gracePeriod=30 Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.697239 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="7a5779cf-3693-4b42-8726-363b548c4071" containerName="cinder-api" containerID="cri-o://b5f68f7307471159e36d94d4a66f49537890f021cbae8438b8573b5666321df3" gracePeriod=30 Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.716619 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.734950 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a0cf0be-7840-4684-ada4-16ea2d6351f3","Type":"ContainerStarted","Data":"474802f5bfe15bdd3bcdee72266b8056787e6dcd0042418f3cc91411370f7e1a"} Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.734998 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cf18444a-84f7-4d6a-85ac-7b0a75776ebc","Type":"ContainerDied","Data":"71769cbebd3e4e54201fd9fecc6bf20aee89bf6756971496d1a112d213d4291e"} Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.735012 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cf18444a-84f7-4d6a-85ac-7b0a75776ebc","Type":"ContainerDied","Data":"8b12b12b802caa2d79cb3d0053c5ca1d25e59074f9ceff0c4035111c88e1cb1b"} Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.735021 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7fbb83df-23bf-40a5-a3a7-ceafac9d783e","Type":"ContainerDied","Data":"5ce388d22f122f96271a834fea4f2f745997e7787350db009cbfa8d761d649b7"} Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.735033 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7fbb83df-23bf-40a5-a3a7-ceafac9d783e","Type":"ContainerDied","Data":"798b731959d138fbe3024526ec7e7e2d5c70540c2ce19e5d886f8be95441cfe1"} Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.735050 4903 scope.go:117] "RemoveContainer" containerID="5ce388d22f122f96271a834fea4f2f745997e7787350db009cbfa8d761d649b7" Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.779466 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.437464515 podStartE2EDuration="10.77944755s" podCreationTimestamp="2025-12-02 23:17:23 +0000 UTC" firstStartedPulling="2025-12-02 23:17:26.09605115 +0000 UTC m=+1184.804605433" lastFinishedPulling="2025-12-02 23:17:32.438034185 +0000 UTC m=+1191.146588468" observedRunningTime="2025-12-02 23:17:33.695532317 +0000 UTC m=+1192.404086590" watchObservedRunningTime="2025-12-02 23:17:33.77944755 +0000 UTC m=+1192.488001833" Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.802288 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fbb83df-23bf-40a5-a3a7-ceafac9d783e-config-data\") pod \"7fbb83df-23bf-40a5-a3a7-ceafac9d783e\" (UID: \"7fbb83df-23bf-40a5-a3a7-ceafac9d783e\") " Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.802350 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fbb83df-23bf-40a5-a3a7-ceafac9d783e-logs\") pod \"7fbb83df-23bf-40a5-a3a7-ceafac9d783e\" (UID: \"7fbb83df-23bf-40a5-a3a7-ceafac9d783e\") " Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.802375 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"7fbb83df-23bf-40a5-a3a7-ceafac9d783e\" (UID: \"7fbb83df-23bf-40a5-a3a7-ceafac9d783e\") " Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.802425 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-shngg\" (UniqueName: \"kubernetes.io/projected/7fbb83df-23bf-40a5-a3a7-ceafac9d783e-kube-api-access-shngg\") pod \"7fbb83df-23bf-40a5-a3a7-ceafac9d783e\" (UID: \"7fbb83df-23bf-40a5-a3a7-ceafac9d783e\") " Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.802559 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fbb83df-23bf-40a5-a3a7-ceafac9d783e-scripts\") pod \"7fbb83df-23bf-40a5-a3a7-ceafac9d783e\" (UID: \"7fbb83df-23bf-40a5-a3a7-ceafac9d783e\") " Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.802586 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fbb83df-23bf-40a5-a3a7-ceafac9d783e-combined-ca-bundle\") pod \"7fbb83df-23bf-40a5-a3a7-ceafac9d783e\" (UID: \"7fbb83df-23bf-40a5-a3a7-ceafac9d783e\") " Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.802669 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7fbb83df-23bf-40a5-a3a7-ceafac9d783e-httpd-run\") pod \"7fbb83df-23bf-40a5-a3a7-ceafac9d783e\" (UID: \"7fbb83df-23bf-40a5-a3a7-ceafac9d783e\") " Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.802815 4903 scope.go:117] "RemoveContainer" containerID="798b731959d138fbe3024526ec7e7e2d5c70540c2ce19e5d886f8be95441cfe1" Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.818503 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fbb83df-23bf-40a5-a3a7-ceafac9d783e-logs" (OuterVolumeSpecName: "logs") pod "7fbb83df-23bf-40a5-a3a7-ceafac9d783e" (UID: "7fbb83df-23bf-40a5-a3a7-ceafac9d783e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.820476 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fbb83df-23bf-40a5-a3a7-ceafac9d783e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7fbb83df-23bf-40a5-a3a7-ceafac9d783e" (UID: "7fbb83df-23bf-40a5-a3a7-ceafac9d783e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.860857 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "7fbb83df-23bf-40a5-a3a7-ceafac9d783e" (UID: "7fbb83df-23bf-40a5-a3a7-ceafac9d783e"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.882823 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fbb83df-23bf-40a5-a3a7-ceafac9d783e-scripts" (OuterVolumeSpecName: "scripts") pod "7fbb83df-23bf-40a5-a3a7-ceafac9d783e" (UID: "7fbb83df-23bf-40a5-a3a7-ceafac9d783e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.882931 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fbb83df-23bf-40a5-a3a7-ceafac9d783e-kube-api-access-shngg" (OuterVolumeSpecName: "kube-api-access-shngg") pod "7fbb83df-23bf-40a5-a3a7-ceafac9d783e" (UID: "7fbb83df-23bf-40a5-a3a7-ceafac9d783e"). InnerVolumeSpecName "kube-api-access-shngg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.904534 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shngg\" (UniqueName: \"kubernetes.io/projected/7fbb83df-23bf-40a5-a3a7-ceafac9d783e-kube-api-access-shngg\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.912927 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fbb83df-23bf-40a5-a3a7-ceafac9d783e-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.913070 4903 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7fbb83df-23bf-40a5-a3a7-ceafac9d783e-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.913129 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fbb83df-23bf-40a5-a3a7-ceafac9d783e-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.913209 4903 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.971429 4903 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.976842 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-78cfc5fdf8-p9576"] Dec 02 23:17:33 crc kubenswrapper[4903]: I1202 23:17:33.981261 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fbb83df-23bf-40a5-a3a7-ceafac9d783e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7fbb83df-23bf-40a5-a3a7-ceafac9d783e" (UID: "7fbb83df-23bf-40a5-a3a7-ceafac9d783e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.015239 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fbb83df-23bf-40a5-a3a7-ceafac9d783e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.015267 4903 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.027865 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fbb83df-23bf-40a5-a3a7-ceafac9d783e-config-data" (OuterVolumeSpecName: "config-data") pod "7fbb83df-23bf-40a5-a3a7-ceafac9d783e" (UID: "7fbb83df-23bf-40a5-a3a7-ceafac9d783e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.046208 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.049921 4903 scope.go:117] "RemoveContainer" containerID="5ce388d22f122f96271a834fea4f2f745997e7787350db009cbfa8d761d649b7" Dec 02 23:17:34 crc kubenswrapper[4903]: E1202 23:17:34.050939 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ce388d22f122f96271a834fea4f2f745997e7787350db009cbfa8d761d649b7\": container with ID starting with 5ce388d22f122f96271a834fea4f2f745997e7787350db009cbfa8d761d649b7 not found: ID does not exist" containerID="5ce388d22f122f96271a834fea4f2f745997e7787350db009cbfa8d761d649b7" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.050984 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ce388d22f122f96271a834fea4f2f745997e7787350db009cbfa8d761d649b7"} err="failed to get container status \"5ce388d22f122f96271a834fea4f2f745997e7787350db009cbfa8d761d649b7\": rpc error: code = NotFound desc = could not find container \"5ce388d22f122f96271a834fea4f2f745997e7787350db009cbfa8d761d649b7\": container with ID starting with 5ce388d22f122f96271a834fea4f2f745997e7787350db009cbfa8d761d649b7 not found: ID does not exist" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.051013 4903 scope.go:117] "RemoveContainer" containerID="798b731959d138fbe3024526ec7e7e2d5c70540c2ce19e5d886f8be95441cfe1" Dec 02 23:17:34 crc kubenswrapper[4903]: E1202 23:17:34.056837 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"798b731959d138fbe3024526ec7e7e2d5c70540c2ce19e5d886f8be95441cfe1\": container with ID starting with 798b731959d138fbe3024526ec7e7e2d5c70540c2ce19e5d886f8be95441cfe1 not found: ID does not exist" containerID="798b731959d138fbe3024526ec7e7e2d5c70540c2ce19e5d886f8be95441cfe1" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.056904 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"798b731959d138fbe3024526ec7e7e2d5c70540c2ce19e5d886f8be95441cfe1"} err="failed to get container status \"798b731959d138fbe3024526ec7e7e2d5c70540c2ce19e5d886f8be95441cfe1\": rpc error: code = NotFound desc = could not find container \"798b731959d138fbe3024526ec7e7e2d5c70540c2ce19e5d886f8be95441cfe1\": container with ID starting with 798b731959d138fbe3024526ec7e7e2d5c70540c2ce19e5d886f8be95441cfe1 not found: ID does not exist" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.116050 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf18444a-84f7-4d6a-85ac-7b0a75776ebc-logs\") pod \"cf18444a-84f7-4d6a-85ac-7b0a75776ebc\" (UID: \"cf18444a-84f7-4d6a-85ac-7b0a75776ebc\") " Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.116114 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf18444a-84f7-4d6a-85ac-7b0a75776ebc-scripts\") pod \"cf18444a-84f7-4d6a-85ac-7b0a75776ebc\" (UID: \"cf18444a-84f7-4d6a-85ac-7b0a75776ebc\") " Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.116139 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhzrs\" (UniqueName: \"kubernetes.io/projected/cf18444a-84f7-4d6a-85ac-7b0a75776ebc-kube-api-access-jhzrs\") pod \"cf18444a-84f7-4d6a-85ac-7b0a75776ebc\" (UID: 
\"cf18444a-84f7-4d6a-85ac-7b0a75776ebc\") " Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.116184 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf18444a-84f7-4d6a-85ac-7b0a75776ebc-config-data\") pod \"cf18444a-84f7-4d6a-85ac-7b0a75776ebc\" (UID: \"cf18444a-84f7-4d6a-85ac-7b0a75776ebc\") " Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.116209 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf18444a-84f7-4d6a-85ac-7b0a75776ebc-combined-ca-bundle\") pod \"cf18444a-84f7-4d6a-85ac-7b0a75776ebc\" (UID: \"cf18444a-84f7-4d6a-85ac-7b0a75776ebc\") " Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.116299 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf18444a-84f7-4d6a-85ac-7b0a75776ebc-httpd-run\") pod \"cf18444a-84f7-4d6a-85ac-7b0a75776ebc\" (UID: \"cf18444a-84f7-4d6a-85ac-7b0a75776ebc\") " Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.116342 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cf18444a-84f7-4d6a-85ac-7b0a75776ebc\" (UID: \"cf18444a-84f7-4d6a-85ac-7b0a75776ebc\") " Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.116796 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fbb83df-23bf-40a5-a3a7-ceafac9d783e-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.117201 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf18444a-84f7-4d6a-85ac-7b0a75776ebc-logs" (OuterVolumeSpecName: "logs") pod "cf18444a-84f7-4d6a-85ac-7b0a75776ebc" (UID: "cf18444a-84f7-4d6a-85ac-7b0a75776ebc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.117395 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf18444a-84f7-4d6a-85ac-7b0a75776ebc-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cf18444a-84f7-4d6a-85ac-7b0a75776ebc" (UID: "cf18444a-84f7-4d6a-85ac-7b0a75776ebc"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.121027 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf18444a-84f7-4d6a-85ac-7b0a75776ebc-kube-api-access-jhzrs" (OuterVolumeSpecName: "kube-api-access-jhzrs") pod "cf18444a-84f7-4d6a-85ac-7b0a75776ebc" (UID: "cf18444a-84f7-4d6a-85ac-7b0a75776ebc"). InnerVolumeSpecName "kube-api-access-jhzrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.134923 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "cf18444a-84f7-4d6a-85ac-7b0a75776ebc" (UID: "cf18444a-84f7-4d6a-85ac-7b0a75776ebc"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.138141 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf18444a-84f7-4d6a-85ac-7b0a75776ebc-scripts" (OuterVolumeSpecName: "scripts") pod "cf18444a-84f7-4d6a-85ac-7b0a75776ebc" (UID: "cf18444a-84f7-4d6a-85ac-7b0a75776ebc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.151431 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf18444a-84f7-4d6a-85ac-7b0a75776ebc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf18444a-84f7-4d6a-85ac-7b0a75776ebc" (UID: "cf18444a-84f7-4d6a-85ac-7b0a75776ebc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.201276 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf18444a-84f7-4d6a-85ac-7b0a75776ebc-config-data" (OuterVolumeSpecName: "config-data") pod "cf18444a-84f7-4d6a-85ac-7b0a75776ebc" (UID: "cf18444a-84f7-4d6a-85ac-7b0a75776ebc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.218919 4903 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.218942 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf18444a-84f7-4d6a-85ac-7b0a75776ebc-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.218950 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf18444a-84f7-4d6a-85ac-7b0a75776ebc-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.218959 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhzrs\" (UniqueName: \"kubernetes.io/projected/cf18444a-84f7-4d6a-85ac-7b0a75776ebc-kube-api-access-jhzrs\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.218970 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf18444a-84f7-4d6a-85ac-7b0a75776ebc-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.218978 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf18444a-84f7-4d6a-85ac-7b0a75776ebc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.218988 4903 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf18444a-84f7-4d6a-85ac-7b0a75776ebc-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.251582 4903 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.323951 4903 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.443609 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.525990 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzm6g\" (UniqueName: \"kubernetes.io/projected/7a5779cf-3693-4b42-8726-363b548c4071-kube-api-access-tzm6g\") pod \"7a5779cf-3693-4b42-8726-363b548c4071\" (UID: \"7a5779cf-3693-4b42-8726-363b548c4071\") " Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.526045 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a5779cf-3693-4b42-8726-363b548c4071-logs\") pod \"7a5779cf-3693-4b42-8726-363b548c4071\" (UID: \"7a5779cf-3693-4b42-8726-363b548c4071\") " Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.526127 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a5779cf-3693-4b42-8726-363b548c4071-config-data\") pod \"7a5779cf-3693-4b42-8726-363b548c4071\" (UID: \"7a5779cf-3693-4b42-8726-363b548c4071\") " Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.526188 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a5779cf-3693-4b42-8726-363b548c4071-config-data-custom\") pod \"7a5779cf-3693-4b42-8726-363b548c4071\" (UID: \"7a5779cf-3693-4b42-8726-363b548c4071\") " Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.526225 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a5779cf-3693-4b42-8726-363b548c4071-scripts\") pod \"7a5779cf-3693-4b42-8726-363b548c4071\" (UID: \"7a5779cf-3693-4b42-8726-363b548c4071\") " Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.526281 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a5779cf-3693-4b42-8726-363b548c4071-etc-machine-id\") pod \"7a5779cf-3693-4b42-8726-363b548c4071\" (UID: \"7a5779cf-3693-4b42-8726-363b548c4071\") " Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.526326 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a5779cf-3693-4b42-8726-363b548c4071-combined-ca-bundle\") pod \"7a5779cf-3693-4b42-8726-363b548c4071\" (UID: \"7a5779cf-3693-4b42-8726-363b548c4071\") " Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.526382 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a5779cf-3693-4b42-8726-363b548c4071-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7a5779cf-3693-4b42-8726-363b548c4071" (UID: "7a5779cf-3693-4b42-8726-363b548c4071"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.526523 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a5779cf-3693-4b42-8726-363b548c4071-logs" (OuterVolumeSpecName: "logs") pod "7a5779cf-3693-4b42-8726-363b548c4071" (UID: "7a5779cf-3693-4b42-8726-363b548c4071"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.526709 4903 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a5779cf-3693-4b42-8726-363b548c4071-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.526722 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a5779cf-3693-4b42-8726-363b548c4071-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.530314 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a5779cf-3693-4b42-8726-363b548c4071-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7a5779cf-3693-4b42-8726-363b548c4071" (UID: "7a5779cf-3693-4b42-8726-363b548c4071"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.534746 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a5779cf-3693-4b42-8726-363b548c4071-scripts" (OuterVolumeSpecName: "scripts") pod "7a5779cf-3693-4b42-8726-363b548c4071" (UID: "7a5779cf-3693-4b42-8726-363b548c4071"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.543567 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a5779cf-3693-4b42-8726-363b548c4071-kube-api-access-tzm6g" (OuterVolumeSpecName: "kube-api-access-tzm6g") pod "7a5779cf-3693-4b42-8726-363b548c4071" (UID: "7a5779cf-3693-4b42-8726-363b548c4071"). InnerVolumeSpecName "kube-api-access-tzm6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.564756 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a5779cf-3693-4b42-8726-363b548c4071-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a5779cf-3693-4b42-8726-363b548c4071" (UID: "7a5779cf-3693-4b42-8726-363b548c4071"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.583775 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a5779cf-3693-4b42-8726-363b548c4071-config-data" (OuterVolumeSpecName: "config-data") pod "7a5779cf-3693-4b42-8726-363b548c4071" (UID: "7a5779cf-3693-4b42-8726-363b548c4071"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.628913 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a5779cf-3693-4b42-8726-363b548c4071-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.628941 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzm6g\" (UniqueName: \"kubernetes.io/projected/7a5779cf-3693-4b42-8726-363b548c4071-kube-api-access-tzm6g\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.628952 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a5779cf-3693-4b42-8726-363b548c4071-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.628960 4903 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a5779cf-3693-4b42-8726-363b548c4071-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.628968 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a5779cf-3693-4b42-8726-363b548c4071-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.705843 4903 generic.go:334] "Generic (PLEG): container finished" podID="7a5779cf-3693-4b42-8726-363b548c4071" containerID="b5f68f7307471159e36d94d4a66f49537890f021cbae8438b8573b5666321df3" exitCode=0 Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.705889 4903 generic.go:334] "Generic (PLEG): container finished" podID="7a5779cf-3693-4b42-8726-363b548c4071" containerID="6ea7b5cebcb60e59811afcaf169213334e5eb7a1a660ce946ddbef4b2b375bd1" exitCode=143 Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.705896 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.705946 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7a5779cf-3693-4b42-8726-363b548c4071","Type":"ContainerDied","Data":"b5f68f7307471159e36d94d4a66f49537890f021cbae8438b8573b5666321df3"} Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.706007 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7a5779cf-3693-4b42-8726-363b548c4071","Type":"ContainerDied","Data":"6ea7b5cebcb60e59811afcaf169213334e5eb7a1a660ce946ddbef4b2b375bd1"} Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.706020 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7a5779cf-3693-4b42-8726-363b548c4071","Type":"ContainerDied","Data":"bb1e9b0b5dc665fd127a4253fbf40e3901c1b7ba5e2fb1fc09297cebbb951d47"} Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.706024 4903 scope.go:117] "RemoveContainer" containerID="b5f68f7307471159e36d94d4a66f49537890f021cbae8438b8573b5666321df3" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.709471 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cf18444a-84f7-4d6a-85ac-7b0a75776ebc","Type":"ContainerDied","Data":"781986d9e1b57fac36ed65f5e48411a57596f59d79920da9750ddaffb2dea1a9"} Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.709593 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.721137 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-78cfc5fdf8-p9576" event={"ID":"1a80c66a-4cfd-44a2-a5e4-5a9297e63f29","Type":"ContainerStarted","Data":"6f8aa89dbd72dfcca9ca04417bfeacd0b7087b7c2b337959cd5e2688d7623368"} Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.721292 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-78cfc5fdf8-p9576" event={"ID":"1a80c66a-4cfd-44a2-a5e4-5a9297e63f29","Type":"ContainerStarted","Data":"abca8b00b0ad67a5fd3c06e21c94cb3d3ad53b32f1f5fb17043cfb060b8720c3"} Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.721357 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-78cfc5fdf8-p9576" event={"ID":"1a80c66a-4cfd-44a2-a5e4-5a9297e63f29","Type":"ContainerStarted","Data":"ec26fa1b527bfdb7d0a933e93cce0c193e5c06031b5b263ab73e7c67ee3fa8b2"} Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.723566 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.725494 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.725530 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7fbb83df-23bf-40a5-a3a7-ceafac9d783e","Type":"ContainerDied","Data":"498d547e8f4f7c23385adae389a1612dbbcf63f2702330f41cab110b10b282d5"} Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.743309 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.745690 4903 scope.go:117] "RemoveContainer" containerID="6ea7b5cebcb60e59811afcaf169213334e5eb7a1a660ce946ddbef4b2b375bd1" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.759935 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.772168 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 02 23:17:34 crc kubenswrapper[4903]: E1202 23:17:34.772602 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a5779cf-3693-4b42-8726-363b548c4071" containerName="cinder-api" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.772619 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a5779cf-3693-4b42-8726-363b548c4071" containerName="cinder-api" Dec 02 23:17:34 crc kubenswrapper[4903]: E1202 23:17:34.772641 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf18444a-84f7-4d6a-85ac-7b0a75776ebc" containerName="glance-httpd" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.772660 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf18444a-84f7-4d6a-85ac-7b0a75776ebc" containerName="glance-httpd" Dec 02 23:17:34 crc kubenswrapper[4903]: E1202 23:17:34.772683 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fbb83df-23bf-40a5-a3a7-ceafac9d783e" containerName="glance-httpd" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.772690 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fbb83df-23bf-40a5-a3a7-ceafac9d783e" containerName="glance-httpd" Dec 02 23:17:34 crc kubenswrapper[4903]: E1202 23:17:34.772711 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fbb83df-23bf-40a5-a3a7-ceafac9d783e" containerName="glance-log" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.772717 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fbb83df-23bf-40a5-a3a7-ceafac9d783e" containerName="glance-log" Dec 02 23:17:34 crc kubenswrapper[4903]: E1202 23:17:34.772732 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a5779cf-3693-4b42-8726-363b548c4071" containerName="cinder-api-log" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.772738 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a5779cf-3693-4b42-8726-363b548c4071" containerName="cinder-api-log" Dec 02 23:17:34 crc kubenswrapper[4903]: E1202 23:17:34.772753 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf18444a-84f7-4d6a-85ac-7b0a75776ebc" containerName="glance-log" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.772760 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf18444a-84f7-4d6a-85ac-7b0a75776ebc" containerName="glance-log" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 
23:17:34.772918 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a5779cf-3693-4b42-8726-363b548c4071" containerName="cinder-api" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.772934 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf18444a-84f7-4d6a-85ac-7b0a75776ebc" containerName="glance-httpd" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.772949 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fbb83df-23bf-40a5-a3a7-ceafac9d783e" containerName="glance-httpd" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.772961 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fbb83df-23bf-40a5-a3a7-ceafac9d783e" containerName="glance-log" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.772971 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a5779cf-3693-4b42-8726-363b548c4071" containerName="cinder-api-log" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.772979 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf18444a-84f7-4d6a-85ac-7b0a75776ebc" containerName="glance-log" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.774011 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.774347 4903 scope.go:117] "RemoveContainer" containerID="b5f68f7307471159e36d94d4a66f49537890f021cbae8438b8573b5666321df3" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.778122 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.778347 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 02 23:17:34 crc kubenswrapper[4903]: E1202 23:17:34.778463 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5f68f7307471159e36d94d4a66f49537890f021cbae8438b8573b5666321df3\": container with ID starting with b5f68f7307471159e36d94d4a66f49537890f021cbae8438b8573b5666321df3 not found: ID does not exist" containerID="b5f68f7307471159e36d94d4a66f49537890f021cbae8438b8573b5666321df3" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.778499 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5f68f7307471159e36d94d4a66f49537890f021cbae8438b8573b5666321df3"} err="failed to get container status \"b5f68f7307471159e36d94d4a66f49537890f021cbae8438b8573b5666321df3\": rpc error: code = NotFound desc = could not find container \"b5f68f7307471159e36d94d4a66f49537890f021cbae8438b8573b5666321df3\": container with ID starting with b5f68f7307471159e36d94d4a66f49537890f021cbae8438b8573b5666321df3 not found: ID does not exist" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.778526 4903 scope.go:117] "RemoveContainer" containerID="6ea7b5cebcb60e59811afcaf169213334e5eb7a1a660ce946ddbef4b2b375bd1" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.778567 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 02 23:17:34 crc kubenswrapper[4903]: E1202 23:17:34.778889 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ea7b5cebcb60e59811afcaf169213334e5eb7a1a660ce946ddbef4b2b375bd1\": container with ID starting with 
6ea7b5cebcb60e59811afcaf169213334e5eb7a1a660ce946ddbef4b2b375bd1 not found: ID does not exist" containerID="6ea7b5cebcb60e59811afcaf169213334e5eb7a1a660ce946ddbef4b2b375bd1" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.778910 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ea7b5cebcb60e59811afcaf169213334e5eb7a1a660ce946ddbef4b2b375bd1"} err="failed to get container status \"6ea7b5cebcb60e59811afcaf169213334e5eb7a1a660ce946ddbef4b2b375bd1\": rpc error: code = NotFound desc = could not find container \"6ea7b5cebcb60e59811afcaf169213334e5eb7a1a660ce946ddbef4b2b375bd1\": container with ID starting with 6ea7b5cebcb60e59811afcaf169213334e5eb7a1a660ce946ddbef4b2b375bd1 not found: ID does not exist" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.778923 4903 scope.go:117] "RemoveContainer" containerID="b5f68f7307471159e36d94d4a66f49537890f021cbae8438b8573b5666321df3" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.781049 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5f68f7307471159e36d94d4a66f49537890f021cbae8438b8573b5666321df3"} err="failed to get container status \"b5f68f7307471159e36d94d4a66f49537890f021cbae8438b8573b5666321df3\": rpc error: code = NotFound desc = could not find container \"b5f68f7307471159e36d94d4a66f49537890f021cbae8438b8573b5666321df3\": container with ID starting with b5f68f7307471159e36d94d4a66f49537890f021cbae8438b8573b5666321df3 not found: ID does not exist" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.781085 4903 scope.go:117] "RemoveContainer" containerID="6ea7b5cebcb60e59811afcaf169213334e5eb7a1a660ce946ddbef4b2b375bd1" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.781109 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-78cfc5fdf8-p9576" podStartSLOduration=2.781092329 podStartE2EDuration="2.781092329s" podCreationTimestamp="2025-12-02 23:17:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:17:34.755034687 +0000 UTC m=+1193.463588970" watchObservedRunningTime="2025-12-02 23:17:34.781092329 +0000 UTC m=+1193.489646612" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.782444 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ea7b5cebcb60e59811afcaf169213334e5eb7a1a660ce946ddbef4b2b375bd1"} err="failed to get container status \"6ea7b5cebcb60e59811afcaf169213334e5eb7a1a660ce946ddbef4b2b375bd1\": rpc error: code = NotFound desc = could not find container \"6ea7b5cebcb60e59811afcaf169213334e5eb7a1a660ce946ddbef4b2b375bd1\": container with ID starting with 6ea7b5cebcb60e59811afcaf169213334e5eb7a1a660ce946ddbef4b2b375bd1 not found: ID does not exist" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.782482 4903 scope.go:117] "RemoveContainer" containerID="71769cbebd3e4e54201fd9fecc6bf20aee89bf6756971496d1a112d213d4291e" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.800076 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.812857 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.836225 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/f98dfcd8-1365-42c3-b939-c34ad3325a09-scripts\") pod \"cinder-api-0\" (UID: \"f98dfcd8-1365-42c3-b939-c34ad3325a09\") " pod="openstack/cinder-api-0" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.836269 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v74lw\" (UniqueName: \"kubernetes.io/projected/f98dfcd8-1365-42c3-b939-c34ad3325a09-kube-api-access-v74lw\") pod \"cinder-api-0\" (UID: \"f98dfcd8-1365-42c3-b939-c34ad3325a09\") " pod="openstack/cinder-api-0" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.836354 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f98dfcd8-1365-42c3-b939-c34ad3325a09-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f98dfcd8-1365-42c3-b939-c34ad3325a09\") " pod="openstack/cinder-api-0" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.836390 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f98dfcd8-1365-42c3-b939-c34ad3325a09-logs\") pod \"cinder-api-0\" (UID: \"f98dfcd8-1365-42c3-b939-c34ad3325a09\") " pod="openstack/cinder-api-0" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.836435 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f98dfcd8-1365-42c3-b939-c34ad3325a09-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f98dfcd8-1365-42c3-b939-c34ad3325a09\") " pod="openstack/cinder-api-0" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.836506 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f98dfcd8-1365-42c3-b939-c34ad3325a09-config-data-custom\") pod \"cinder-api-0\" (UID: \"f98dfcd8-1365-42c3-b939-c34ad3325a09\") " pod="openstack/cinder-api-0" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.836531 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f98dfcd8-1365-42c3-b939-c34ad3325a09-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f98dfcd8-1365-42c3-b939-c34ad3325a09\") " pod="openstack/cinder-api-0" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.836568 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f98dfcd8-1365-42c3-b939-c34ad3325a09-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f98dfcd8-1365-42c3-b939-c34ad3325a09\") " pod="openstack/cinder-api-0" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.836614 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f98dfcd8-1365-42c3-b939-c34ad3325a09-config-data\") pod \"cinder-api-0\" (UID: \"f98dfcd8-1365-42c3-b939-c34ad3325a09\") " pod="openstack/cinder-api-0" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.866964 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.876510 4903 scope.go:117] "RemoveContainer" containerID="8b12b12b802caa2d79cb3d0053c5ca1d25e59074f9ceff0c4035111c88e1cb1b" Dec 02 23:17:34 
crc kubenswrapper[4903]: I1202 23:17:34.874323 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.883111 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-86bbdbcfd-94wnf" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.883974 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.885293 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.887035 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.887202 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.887967 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.888197 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-h4s5l" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.897304 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.930228 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.936067 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.938194 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f98dfcd8-1365-42c3-b939-c34ad3325a09-logs\") pod \"cinder-api-0\" (UID: \"f98dfcd8-1365-42c3-b939-c34ad3325a09\") " pod="openstack/cinder-api-0" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.938235 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f98dfcd8-1365-42c3-b939-c34ad3325a09-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f98dfcd8-1365-42c3-b939-c34ad3325a09\") " pod="openstack/cinder-api-0" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.938292 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f98dfcd8-1365-42c3-b939-c34ad3325a09-config-data-custom\") pod \"cinder-api-0\" (UID: \"f98dfcd8-1365-42c3-b939-c34ad3325a09\") " pod="openstack/cinder-api-0" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.938310 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f98dfcd8-1365-42c3-b939-c34ad3325a09-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f98dfcd8-1365-42c3-b939-c34ad3325a09\") " pod="openstack/cinder-api-0" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.938330 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f98dfcd8-1365-42c3-b939-c34ad3325a09-combined-ca-bundle\") 
pod \"cinder-api-0\" (UID: \"f98dfcd8-1365-42c3-b939-c34ad3325a09\") " pod="openstack/cinder-api-0" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.938372 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f98dfcd8-1365-42c3-b939-c34ad3325a09-config-data\") pod \"cinder-api-0\" (UID: \"f98dfcd8-1365-42c3-b939-c34ad3325a09\") " pod="openstack/cinder-api-0" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.938390 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f98dfcd8-1365-42c3-b939-c34ad3325a09-scripts\") pod \"cinder-api-0\" (UID: \"f98dfcd8-1365-42c3-b939-c34ad3325a09\") " pod="openstack/cinder-api-0" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.938412 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v74lw\" (UniqueName: \"kubernetes.io/projected/f98dfcd8-1365-42c3-b939-c34ad3325a09-kube-api-access-v74lw\") pod \"cinder-api-0\" (UID: \"f98dfcd8-1365-42c3-b939-c34ad3325a09\") " pod="openstack/cinder-api-0" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.938488 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f98dfcd8-1365-42c3-b939-c34ad3325a09-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f98dfcd8-1365-42c3-b939-c34ad3325a09\") " pod="openstack/cinder-api-0" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.939081 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f98dfcd8-1365-42c3-b939-c34ad3325a09-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f98dfcd8-1365-42c3-b939-c34ad3325a09\") " pod="openstack/cinder-api-0" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.939384 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f98dfcd8-1365-42c3-b939-c34ad3325a09-logs\") pod \"cinder-api-0\" (UID: \"f98dfcd8-1365-42c3-b939-c34ad3325a09\") " pod="openstack/cinder-api-0" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.944756 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f98dfcd8-1365-42c3-b939-c34ad3325a09-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f98dfcd8-1365-42c3-b939-c34ad3325a09\") " pod="openstack/cinder-api-0" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.947931 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f98dfcd8-1365-42c3-b939-c34ad3325a09-config-data\") pod \"cinder-api-0\" (UID: \"f98dfcd8-1365-42c3-b939-c34ad3325a09\") " pod="openstack/cinder-api-0" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.949198 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f98dfcd8-1365-42c3-b939-c34ad3325a09-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f98dfcd8-1365-42c3-b939-c34ad3325a09\") " pod="openstack/cinder-api-0" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.953876 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f98dfcd8-1365-42c3-b939-c34ad3325a09-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"f98dfcd8-1365-42c3-b939-c34ad3325a09\") " pod="openstack/cinder-api-0" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.954338 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f98dfcd8-1365-42c3-b939-c34ad3325a09-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f98dfcd8-1365-42c3-b939-c34ad3325a09\") " pod="openstack/cinder-api-0" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.962038 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f98dfcd8-1365-42c3-b939-c34ad3325a09-scripts\") pod \"cinder-api-0\" (UID: \"f98dfcd8-1365-42c3-b939-c34ad3325a09\") " pod="openstack/cinder-api-0" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.978347 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.985677 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.987351 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5fd47c645b-9wf6m" podUID="5b5f4367-359b-4633-80f9-0af5ac406aa4" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.157:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.157:8443: connect: connection refused" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.987472 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5fd47c645b-9wf6m" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.989128 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 02 23:17:34 crc kubenswrapper[4903]: I1202 23:17:34.993375 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.044885 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v74lw\" (UniqueName: \"kubernetes.io/projected/f98dfcd8-1365-42c3-b939-c34ad3325a09-kube-api-access-v74lw\") pod \"cinder-api-0\" (UID: \"f98dfcd8-1365-42c3-b939-c34ad3325a09\") " pod="openstack/cinder-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.048822 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.054506 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td4qw\" (UniqueName: \"kubernetes.io/projected/67d56d30-2369-440d-8e83-2c424e0a79af-kube-api-access-td4qw\") pod \"glance-default-internal-api-0\" (UID: \"67d56d30-2369-440d-8e83-2c424e0a79af\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.054578 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7635ad7a-2665-4edb-a6d7-ae48268f599e-config-data\") pod \"glance-default-external-api-0\" (UID: \"7635ad7a-2665-4edb-a6d7-ae48268f599e\") " pod="openstack/glance-default-external-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.054602 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/67d56d30-2369-440d-8e83-2c424e0a79af-config-data\") pod \"glance-default-internal-api-0\" (UID: \"67d56d30-2369-440d-8e83-2c424e0a79af\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.054641 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9st8\" (UniqueName: \"kubernetes.io/projected/7635ad7a-2665-4edb-a6d7-ae48268f599e-kube-api-access-k9st8\") pod \"glance-default-external-api-0\" (UID: \"7635ad7a-2665-4edb-a6d7-ae48268f599e\") " pod="openstack/glance-default-external-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.054796 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"67d56d30-2369-440d-8e83-2c424e0a79af\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.056581 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7635ad7a-2665-4edb-a6d7-ae48268f599e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7635ad7a-2665-4edb-a6d7-ae48268f599e\") " pod="openstack/glance-default-external-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.056628 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7635ad7a-2665-4edb-a6d7-ae48268f599e-logs\") pod \"glance-default-external-api-0\" (UID: \"7635ad7a-2665-4edb-a6d7-ae48268f599e\") " pod="openstack/glance-default-external-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.056682 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67d56d30-2369-440d-8e83-2c424e0a79af-logs\") pod \"glance-default-internal-api-0\" (UID: \"67d56d30-2369-440d-8e83-2c424e0a79af\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.056704 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7635ad7a-2665-4edb-a6d7-ae48268f599e-scripts\") pod \"glance-default-external-api-0\" (UID: \"7635ad7a-2665-4edb-a6d7-ae48268f599e\") " pod="openstack/glance-default-external-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.056868 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7635ad7a-2665-4edb-a6d7-ae48268f599e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7635ad7a-2665-4edb-a6d7-ae48268f599e\") " pod="openstack/glance-default-external-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.056913 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67d56d30-2369-440d-8e83-2c424e0a79af-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"67d56d30-2369-440d-8e83-2c424e0a79af\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.057030 4903 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67d56d30-2369-440d-8e83-2c424e0a79af-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"67d56d30-2369-440d-8e83-2c424e0a79af\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.057048 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"7635ad7a-2665-4edb-a6d7-ae48268f599e\") " pod="openstack/glance-default-external-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.057086 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67d56d30-2369-440d-8e83-2c424e0a79af-scripts\") pod \"glance-default-internal-api-0\" (UID: \"67d56d30-2369-440d-8e83-2c424e0a79af\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.057119 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7635ad7a-2665-4edb-a6d7-ae48268f599e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7635ad7a-2665-4edb-a6d7-ae48268f599e\") " pod="openstack/glance-default-external-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.057136 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/67d56d30-2369-440d-8e83-2c424e0a79af-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"67d56d30-2369-440d-8e83-2c424e0a79af\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.106146 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.160216 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7635ad7a-2665-4edb-a6d7-ae48268f599e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7635ad7a-2665-4edb-a6d7-ae48268f599e\") " pod="openstack/glance-default-external-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.160272 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67d56d30-2369-440d-8e83-2c424e0a79af-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"67d56d30-2369-440d-8e83-2c424e0a79af\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.160323 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67d56d30-2369-440d-8e83-2c424e0a79af-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"67d56d30-2369-440d-8e83-2c424e0a79af\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.160342 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"7635ad7a-2665-4edb-a6d7-ae48268f599e\") " pod="openstack/glance-default-external-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.160366 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67d56d30-2369-440d-8e83-2c424e0a79af-scripts\") pod \"glance-default-internal-api-0\" (UID: \"67d56d30-2369-440d-8e83-2c424e0a79af\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.160385 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7635ad7a-2665-4edb-a6d7-ae48268f599e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7635ad7a-2665-4edb-a6d7-ae48268f599e\") " pod="openstack/glance-default-external-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.160400 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/67d56d30-2369-440d-8e83-2c424e0a79af-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"67d56d30-2369-440d-8e83-2c424e0a79af\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.160425 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td4qw\" (UniqueName: \"kubernetes.io/projected/67d56d30-2369-440d-8e83-2c424e0a79af-kube-api-access-td4qw\") pod \"glance-default-internal-api-0\" (UID: \"67d56d30-2369-440d-8e83-2c424e0a79af\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.160443 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7635ad7a-2665-4edb-a6d7-ae48268f599e-config-data\") pod \"glance-default-external-api-0\" (UID: \"7635ad7a-2665-4edb-a6d7-ae48268f599e\") " pod="openstack/glance-default-external-api-0" Dec 02 
23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.160457 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67d56d30-2369-440d-8e83-2c424e0a79af-config-data\") pod \"glance-default-internal-api-0\" (UID: \"67d56d30-2369-440d-8e83-2c424e0a79af\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.160476 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9st8\" (UniqueName: \"kubernetes.io/projected/7635ad7a-2665-4edb-a6d7-ae48268f599e-kube-api-access-k9st8\") pod \"glance-default-external-api-0\" (UID: \"7635ad7a-2665-4edb-a6d7-ae48268f599e\") " pod="openstack/glance-default-external-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.160501 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"67d56d30-2369-440d-8e83-2c424e0a79af\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.160523 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7635ad7a-2665-4edb-a6d7-ae48268f599e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7635ad7a-2665-4edb-a6d7-ae48268f599e\") " pod="openstack/glance-default-external-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.160540 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7635ad7a-2665-4edb-a6d7-ae48268f599e-logs\") pod \"glance-default-external-api-0\" (UID: \"7635ad7a-2665-4edb-a6d7-ae48268f599e\") " pod="openstack/glance-default-external-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.160559 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67d56d30-2369-440d-8e83-2c424e0a79af-logs\") pod \"glance-default-internal-api-0\" (UID: \"67d56d30-2369-440d-8e83-2c424e0a79af\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.160577 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7635ad7a-2665-4edb-a6d7-ae48268f599e-scripts\") pod \"glance-default-external-api-0\" (UID: \"7635ad7a-2665-4edb-a6d7-ae48268f599e\") " pod="openstack/glance-default-external-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.163494 4903 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"7635ad7a-2665-4edb-a6d7-ae48268f599e\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.165790 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/67d56d30-2369-440d-8e83-2c424e0a79af-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"67d56d30-2369-440d-8e83-2c424e0a79af\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.166061 4903 operation_generator.go:580] "MountVolume.MountDevice succeeded 
for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"67d56d30-2369-440d-8e83-2c424e0a79af\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.167111 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7635ad7a-2665-4edb-a6d7-ae48268f599e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7635ad7a-2665-4edb-a6d7-ae48268f599e\") " pod="openstack/glance-default-external-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.167396 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7635ad7a-2665-4edb-a6d7-ae48268f599e-logs\") pod \"glance-default-external-api-0\" (UID: \"7635ad7a-2665-4edb-a6d7-ae48268f599e\") " pod="openstack/glance-default-external-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.167694 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67d56d30-2369-440d-8e83-2c424e0a79af-logs\") pod \"glance-default-internal-api-0\" (UID: \"67d56d30-2369-440d-8e83-2c424e0a79af\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.168251 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7635ad7a-2665-4edb-a6d7-ae48268f599e-scripts\") pod \"glance-default-external-api-0\" (UID: \"7635ad7a-2665-4edb-a6d7-ae48268f599e\") " pod="openstack/glance-default-external-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.168444 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6ffc74bd45-2xt7m" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.168774 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7635ad7a-2665-4edb-a6d7-ae48268f599e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7635ad7a-2665-4edb-a6d7-ae48268f599e\") " pod="openstack/glance-default-external-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.174464 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67d56d30-2369-440d-8e83-2c424e0a79af-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"67d56d30-2369-440d-8e83-2c424e0a79af\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.179718 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67d56d30-2369-440d-8e83-2c424e0a79af-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"67d56d30-2369-440d-8e83-2c424e0a79af\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.180183 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7635ad7a-2665-4edb-a6d7-ae48268f599e-config-data\") pod \"glance-default-external-api-0\" (UID: \"7635ad7a-2665-4edb-a6d7-ae48268f599e\") " pod="openstack/glance-default-external-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.180393 4903 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7635ad7a-2665-4edb-a6d7-ae48268f599e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7635ad7a-2665-4edb-a6d7-ae48268f599e\") " pod="openstack/glance-default-external-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.181000 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67d56d30-2369-440d-8e83-2c424e0a79af-config-data\") pod \"glance-default-internal-api-0\" (UID: \"67d56d30-2369-440d-8e83-2c424e0a79af\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.183239 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td4qw\" (UniqueName: \"kubernetes.io/projected/67d56d30-2369-440d-8e83-2c424e0a79af-kube-api-access-td4qw\") pod \"glance-default-internal-api-0\" (UID: \"67d56d30-2369-440d-8e83-2c424e0a79af\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.192775 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67d56d30-2369-440d-8e83-2c424e0a79af-scripts\") pod \"glance-default-internal-api-0\" (UID: \"67d56d30-2369-440d-8e83-2c424e0a79af\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.196580 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9st8\" (UniqueName: \"kubernetes.io/projected/7635ad7a-2665-4edb-a6d7-ae48268f599e-kube-api-access-k9st8\") pod \"glance-default-external-api-0\" (UID: \"7635ad7a-2665-4edb-a6d7-ae48268f599e\") " pod="openstack/glance-default-external-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.227751 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"7635ad7a-2665-4edb-a6d7-ae48268f599e\") " pod="openstack/glance-default-external-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.229869 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"67d56d30-2369-440d-8e83-2c424e0a79af\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.243135 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76ff5f5497-x5nsj"] Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.243397 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76ff5f5497-x5nsj" podUID="8af477b6-59d4-4909-9cb6-b9e61f75bd96" containerName="dnsmasq-dns" containerID="cri-o://920bbcefa89e3ce771b253c8a556db24084880f816bbb44221a4d9a556596686" gracePeriod=10 Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.339996 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.534001 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.625427 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a5779cf-3693-4b42-8726-363b548c4071" path="/var/lib/kubelet/pods/7a5779cf-3693-4b42-8726-363b548c4071/volumes" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.626619 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fbb83df-23bf-40a5-a3a7-ceafac9d783e" path="/var/lib/kubelet/pods/7fbb83df-23bf-40a5-a3a7-ceafac9d783e/volumes" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.627782 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf18444a-84f7-4d6a-85ac-7b0a75776ebc" path="/var/lib/kubelet/pods/cf18444a-84f7-4d6a-85ac-7b0a75776ebc/volumes" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.661022 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.741318 4903 generic.go:334] "Generic (PLEG): container finished" podID="b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8" containerID="a436f02befb85b1592a3221ab644954bf4f4e6026430927df464c7bfa7562de5" exitCode=0 Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.741419 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77d59fbfc6-p8t8m" event={"ID":"b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8","Type":"ContainerDied","Data":"a436f02befb85b1592a3221ab644954bf4f4e6026430927df464c7bfa7562de5"} Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.754224 4903 generic.go:334] "Generic (PLEG): container finished" podID="8af477b6-59d4-4909-9cb6-b9e61f75bd96" containerID="920bbcefa89e3ce771b253c8a556db24084880f816bbb44221a4d9a556596686" exitCode=0 Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.754319 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76ff5f5497-x5nsj" event={"ID":"8af477b6-59d4-4909-9cb6-b9e61f75bd96","Type":"ContainerDied","Data":"920bbcefa89e3ce771b253c8a556db24084880f816bbb44221a4d9a556596686"} Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.790057 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f98dfcd8-1365-42c3-b939-c34ad3325a09","Type":"ContainerStarted","Data":"32d752b48288a15979d7a61cc57940c3282d5dc6cdfa6c65c24e433274e029ff"} Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.804360 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-78cfc5fdf8-p9576" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.804393 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-78cfc5fdf8-p9576" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.868501 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76ff5f5497-x5nsj" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.962419 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-77d59fbfc6-p8t8m" Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.988630 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8af477b6-59d4-4909-9cb6-b9e61f75bd96-ovsdbserver-nb\") pod \"8af477b6-59d4-4909-9cb6-b9e61f75bd96\" (UID: \"8af477b6-59d4-4909-9cb6-b9e61f75bd96\") " Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.988702 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8af477b6-59d4-4909-9cb6-b9e61f75bd96-config\") pod \"8af477b6-59d4-4909-9cb6-b9e61f75bd96\" (UID: \"8af477b6-59d4-4909-9cb6-b9e61f75bd96\") " Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.988729 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8af477b6-59d4-4909-9cb6-b9e61f75bd96-dns-svc\") pod \"8af477b6-59d4-4909-9cb6-b9e61f75bd96\" (UID: \"8af477b6-59d4-4909-9cb6-b9e61f75bd96\") " Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.988748 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8af477b6-59d4-4909-9cb6-b9e61f75bd96-dns-swift-storage-0\") pod \"8af477b6-59d4-4909-9cb6-b9e61f75bd96\" (UID: \"8af477b6-59d4-4909-9cb6-b9e61f75bd96\") " Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.988910 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8af477b6-59d4-4909-9cb6-b9e61f75bd96-ovsdbserver-sb\") pod \"8af477b6-59d4-4909-9cb6-b9e61f75bd96\" (UID: \"8af477b6-59d4-4909-9cb6-b9e61f75bd96\") " Dec 02 23:17:35 crc kubenswrapper[4903]: I1202 23:17:35.988927 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c97v5\" (UniqueName: \"kubernetes.io/projected/8af477b6-59d4-4909-9cb6-b9e61f75bd96-kube-api-access-c97v5\") pod \"8af477b6-59d4-4909-9cb6-b9e61f75bd96\" (UID: \"8af477b6-59d4-4909-9cb6-b9e61f75bd96\") " Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.005746 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8af477b6-59d4-4909-9cb6-b9e61f75bd96-kube-api-access-c97v5" (OuterVolumeSpecName: "kube-api-access-c97v5") pod "8af477b6-59d4-4909-9cb6-b9e61f75bd96" (UID: "8af477b6-59d4-4909-9cb6-b9e61f75bd96"). InnerVolumeSpecName "kube-api-access-c97v5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.047094 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8af477b6-59d4-4909-9cb6-b9e61f75bd96-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8af477b6-59d4-4909-9cb6-b9e61f75bd96" (UID: "8af477b6-59d4-4909-9cb6-b9e61f75bd96"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.049887 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8af477b6-59d4-4909-9cb6-b9e61f75bd96-config" (OuterVolumeSpecName: "config") pod "8af477b6-59d4-4909-9cb6-b9e61f75bd96" (UID: "8af477b6-59d4-4909-9cb6-b9e61f75bd96"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.078942 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8af477b6-59d4-4909-9cb6-b9e61f75bd96-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8af477b6-59d4-4909-9cb6-b9e61f75bd96" (UID: "8af477b6-59d4-4909-9cb6-b9e61f75bd96"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.090039 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8af477b6-59d4-4909-9cb6-b9e61f75bd96-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8af477b6-59d4-4909-9cb6-b9e61f75bd96" (UID: "8af477b6-59d4-4909-9cb6-b9e61f75bd96"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.090081 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8-combined-ca-bundle\") pod \"b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8\" (UID: \"b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8\") " Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.090272 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkfwt\" (UniqueName: \"kubernetes.io/projected/b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8-kube-api-access-xkfwt\") pod \"b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8\" (UID: \"b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8\") " Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.090382 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8-httpd-config\") pod \"b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8\" (UID: \"b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8\") " Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.090562 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8-ovndb-tls-certs\") pod \"b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8\" (UID: \"b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8\") " Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.090588 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8-config\") pod \"b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8\" (UID: \"b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8\") " Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.090670 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8af477b6-59d4-4909-9cb6-b9e61f75bd96-dns-svc\") pod \"8af477b6-59d4-4909-9cb6-b9e61f75bd96\" (UID: \"8af477b6-59d4-4909-9cb6-b9e61f75bd96\") " Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.091059 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8af477b6-59d4-4909-9cb6-b9e61f75bd96-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.091076 4903 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8af477b6-59d4-4909-9cb6-b9e61f75bd96-dns-swift-storage-0\") on node \"crc\" 
DevicePath \"\"" Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.091086 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8af477b6-59d4-4909-9cb6-b9e61f75bd96-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.091095 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c97v5\" (UniqueName: \"kubernetes.io/projected/8af477b6-59d4-4909-9cb6-b9e61f75bd96-kube-api-access-c97v5\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:36 crc kubenswrapper[4903]: W1202 23:17:36.091163 4903 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/8af477b6-59d4-4909-9cb6-b9e61f75bd96/volumes/kubernetes.io~configmap/dns-svc Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.091176 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8af477b6-59d4-4909-9cb6-b9e61f75bd96-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8af477b6-59d4-4909-9cb6-b9e61f75bd96" (UID: "8af477b6-59d4-4909-9cb6-b9e61f75bd96"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.098150 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8" (UID: "b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.098266 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8-kube-api-access-xkfwt" (OuterVolumeSpecName: "kube-api-access-xkfwt") pod "b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8" (UID: "b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8"). InnerVolumeSpecName "kube-api-access-xkfwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.103806 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8af477b6-59d4-4909-9cb6-b9e61f75bd96-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8af477b6-59d4-4909-9cb6-b9e61f75bd96" (UID: "8af477b6-59d4-4909-9cb6-b9e61f75bd96"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.173425 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8" (UID: "b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.193414 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8af477b6-59d4-4909-9cb6-b9e61f75bd96-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.193444 4903 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8af477b6-59d4-4909-9cb6-b9e61f75bd96-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.193454 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.193462 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkfwt\" (UniqueName: \"kubernetes.io/projected/b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8-kube-api-access-xkfwt\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.193472 4903 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.203431 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.214981 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8-config" (OuterVolumeSpecName: "config") pod "b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8" (UID: "b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.220737 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8" (UID: "b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.296480 4903 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.296749 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.342133 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 23:17:36 crc kubenswrapper[4903]: W1202 23:17:36.351957 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67d56d30_2369_440d_8e83_2c424e0a79af.slice/crio-7608660724b5677ddfe1e7930d9e7fcc911c197c2d4f807be6b36834d7da1bc1 WatchSource:0}: Error finding container 7608660724b5677ddfe1e7930d9e7fcc911c197c2d4f807be6b36834d7da1bc1: Status 404 returned error can't find the container with id 7608660724b5677ddfe1e7930d9e7fcc911c197c2d4f807be6b36834d7da1bc1 Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.812837 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f98dfcd8-1365-42c3-b939-c34ad3325a09","Type":"ContainerStarted","Data":"55a91074e8f1ed2e4f2242069cdb8dd088da30af6f18546c77b2735b3e8ee3f4"} Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.814089 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7635ad7a-2665-4edb-a6d7-ae48268f599e","Type":"ContainerStarted","Data":"c03c1703fe5fef3d372f4aac7b2a9862991178e92f0ad968573dc0e36ad313c6"} Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.814853 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"67d56d30-2369-440d-8e83-2c424e0a79af","Type":"ContainerStarted","Data":"7608660724b5677ddfe1e7930d9e7fcc911c197c2d4f807be6b36834d7da1bc1"} Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.816291 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77d59fbfc6-p8t8m" event={"ID":"b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8","Type":"ContainerDied","Data":"0b81c9432e70708289bdb623e40b54cea53b982e059cd522ef6eac0133e68111"} Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.816338 4903 scope.go:117] "RemoveContainer" containerID="759eb993b2e2888d698736bc62374d3e38fede53838e4781051f8547ae55ceca" Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.816467 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-77d59fbfc6-p8t8m" Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.821793 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76ff5f5497-x5nsj" Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.821831 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76ff5f5497-x5nsj" event={"ID":"8af477b6-59d4-4909-9cb6-b9e61f75bd96","Type":"ContainerDied","Data":"d8837153c3ae62ed7623c378948ba876860e8d71e1f9a55adb4a2067ddf58cc1"} Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.845904 4903 scope.go:117] "RemoveContainer" containerID="a436f02befb85b1592a3221ab644954bf4f4e6026430927df464c7bfa7562de5" Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.876459 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76ff5f5497-x5nsj"] Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.887811 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76ff5f5497-x5nsj"] Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.902574 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-77d59fbfc6-p8t8m"] Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.912307 4903 scope.go:117] "RemoveContainer" containerID="920bbcefa89e3ce771b253c8a556db24084880f816bbb44221a4d9a556596686" Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.920951 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-77d59fbfc6-p8t8m"] Dec 02 23:17:36 crc kubenswrapper[4903]: I1202 23:17:36.946410 4903 scope.go:117] "RemoveContainer" containerID="6ef85280faeb898a19ebaf725c3f47ef89cc2ff15b7a801d95fdd8542b49a49b" Dec 02 23:17:37 crc kubenswrapper[4903]: I1202 23:17:37.622295 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8af477b6-59d4-4909-9cb6-b9e61f75bd96" path="/var/lib/kubelet/pods/8af477b6-59d4-4909-9cb6-b9e61f75bd96/volumes" Dec 02 23:17:37 crc kubenswrapper[4903]: I1202 23:17:37.623320 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8" path="/var/lib/kubelet/pods/b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8/volumes" Dec 02 23:17:37 crc kubenswrapper[4903]: I1202 23:17:37.834813 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"67d56d30-2369-440d-8e83-2c424e0a79af","Type":"ContainerStarted","Data":"76db4ca849eca8c661ab292eedfd14008171a54a0747348725127fd0cb9f6171"} Dec 02 23:17:37 crc kubenswrapper[4903]: I1202 23:17:37.835085 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"67d56d30-2369-440d-8e83-2c424e0a79af","Type":"ContainerStarted","Data":"0a295593ab5c744e75bf7437de07c7f3e8c3fd2d4c8c0c48d9b8748df5760742"} Dec 02 23:17:37 crc kubenswrapper[4903]: I1202 23:17:37.850743 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f98dfcd8-1365-42c3-b939-c34ad3325a09","Type":"ContainerStarted","Data":"810a3c728598566f6ff78847207169676c0be0bfc08f47d2db0f139684843cb9"} Dec 02 23:17:37 crc kubenswrapper[4903]: I1202 23:17:37.850914 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 02 23:17:37 crc kubenswrapper[4903]: I1202 23:17:37.860840 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7635ad7a-2665-4edb-a6d7-ae48268f599e","Type":"ContainerStarted","Data":"4c664c2e09d3bbc1116634be3a2691daf7344e59f045ed5067eb94f4d948b8c9"} Dec 02 23:17:37 crc kubenswrapper[4903]: I1202 23:17:37.860886 4903 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7635ad7a-2665-4edb-a6d7-ae48268f599e","Type":"ContainerStarted","Data":"ff6d3b7de6883b7315e655d1a66c8e2d925f7779b487fdbd753b31e384cd8315"} Dec 02 23:17:37 crc kubenswrapper[4903]: I1202 23:17:37.865004 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.864988575 podStartE2EDuration="3.864988575s" podCreationTimestamp="2025-12-02 23:17:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:17:37.861840881 +0000 UTC m=+1196.570395174" watchObservedRunningTime="2025-12-02 23:17:37.864988575 +0000 UTC m=+1196.573542858" Dec 02 23:17:37 crc kubenswrapper[4903]: I1202 23:17:37.903847 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.903829178 podStartE2EDuration="3.903829178s" podCreationTimestamp="2025-12-02 23:17:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:17:37.893118986 +0000 UTC m=+1196.601673269" watchObservedRunningTime="2025-12-02 23:17:37.903829178 +0000 UTC m=+1196.612383461" Dec 02 23:17:37 crc kubenswrapper[4903]: I1202 23:17:37.922027 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.922006925 podStartE2EDuration="3.922006925s" podCreationTimestamp="2025-12-02 23:17:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:17:37.916920435 +0000 UTC m=+1196.625474718" watchObservedRunningTime="2025-12-02 23:17:37.922006925 +0000 UTC m=+1196.630561198" Dec 02 23:17:38 crc kubenswrapper[4903]: I1202 23:17:38.966607 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 02 23:17:38 crc kubenswrapper[4903]: I1202 23:17:38.966707 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Dec 02 23:17:38 crc kubenswrapper[4903]: I1202 23:17:38.967789 4903 scope.go:117] "RemoveContainer" containerID="df15c3c36b70884fde2426f34900864d205c0624b857ec4d7f83ed6fd31393ec" Dec 02 23:17:39 crc kubenswrapper[4903]: I1202 23:17:39.893637 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"6de4a117-0c91-47f4-a80d-278debb3ea60","Type":"ContainerStarted","Data":"ba4ae74d9752d870e35c06dc4ab2dd97ed3bdb486401b2bdaed152da3cf06005"} Dec 02 23:17:40 crc kubenswrapper[4903]: I1202 23:17:40.061972 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 02 23:17:40 crc kubenswrapper[4903]: I1202 23:17:40.099581 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 23:17:40 crc kubenswrapper[4903]: I1202 23:17:40.901080 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="c35ad387-fed0-4cbc-9912-c17aab93860a" containerName="cinder-scheduler" containerID="cri-o://409181bfb5f67d8323510af2392b3d2db0db6d349695fbf7b79a36e6587af2c5" gracePeriod=30 Dec 02 23:17:40 crc kubenswrapper[4903]: I1202 23:17:40.901758 4903 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="c35ad387-fed0-4cbc-9912-c17aab93860a" containerName="probe" containerID="cri-o://b9b50eee02969239c4bea241d7e0e370e59d7dc985a050a93d269a3ea3f29d20" gracePeriod=30 Dec 02 23:17:40 crc kubenswrapper[4903]: I1202 23:17:40.926995 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-665fcbdbd4-lvt55" Dec 02 23:17:40 crc kubenswrapper[4903]: I1202 23:17:40.989280 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-665fcbdbd4-lvt55" Dec 02 23:17:41 crc kubenswrapper[4903]: I1202 23:17:41.915457 4903 generic.go:334] "Generic (PLEG): container finished" podID="c35ad387-fed0-4cbc-9912-c17aab93860a" containerID="b9b50eee02969239c4bea241d7e0e370e59d7dc985a050a93d269a3ea3f29d20" exitCode=0 Dec 02 23:17:41 crc kubenswrapper[4903]: I1202 23:17:41.915558 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c35ad387-fed0-4cbc-9912-c17aab93860a","Type":"ContainerDied","Data":"b9b50eee02969239c4bea241d7e0e370e59d7dc985a050a93d269a3ea3f29d20"} Dec 02 23:17:42 crc kubenswrapper[4903]: I1202 23:17:42.929615 4903 generic.go:334] "Generic (PLEG): container finished" podID="6de4a117-0c91-47f4-a80d-278debb3ea60" containerID="ba4ae74d9752d870e35c06dc4ab2dd97ed3bdb486401b2bdaed152da3cf06005" exitCode=1 Dec 02 23:17:42 crc kubenswrapper[4903]: I1202 23:17:42.930297 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"6de4a117-0c91-47f4-a80d-278debb3ea60","Type":"ContainerDied","Data":"ba4ae74d9752d870e35c06dc4ab2dd97ed3bdb486401b2bdaed152da3cf06005"} Dec 02 23:17:42 crc kubenswrapper[4903]: I1202 23:17:42.930421 4903 scope.go:117] "RemoveContainer" containerID="df15c3c36b70884fde2426f34900864d205c0624b857ec4d7f83ed6fd31393ec" Dec 02 23:17:42 crc kubenswrapper[4903]: I1202 23:17:42.931266 4903 scope.go:117] "RemoveContainer" containerID="ba4ae74d9752d870e35c06dc4ab2dd97ed3bdb486401b2bdaed152da3cf06005" Dec 02 23:17:42 crc kubenswrapper[4903]: E1202 23:17:42.931698 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(6de4a117-0c91-47f4-a80d-278debb3ea60)\"" pod="openstack/watcher-decision-engine-0" podUID="6de4a117-0c91-47f4-a80d-278debb3ea60" Dec 02 23:17:43 crc kubenswrapper[4903]: I1202 23:17:43.015251 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-74c5f59c6f-5gx9d" Dec 02 23:17:44 crc kubenswrapper[4903]: I1202 23:17:44.610672 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-78cfc5fdf8-p9576" Dec 02 23:17:44 crc kubenswrapper[4903]: I1202 23:17:44.680312 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-78cfc5fdf8-p9576" Dec 02 23:17:44 crc kubenswrapper[4903]: I1202 23:17:44.744163 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-86bbdbcfd-94wnf"] Dec 02 23:17:44 crc kubenswrapper[4903]: I1202 23:17:44.744414 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-86bbdbcfd-94wnf" podUID="c9f65286-f806-4579-b98a-6a88a6dc8839" containerName="barbican-api-log" 
containerID="cri-o://06c700a4fc42d3fa44fbb7ca68913f255e39c5293e1806aeb8effea15d97b173" gracePeriod=30 Dec 02 23:17:44 crc kubenswrapper[4903]: I1202 23:17:44.744547 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-86bbdbcfd-94wnf" podUID="c9f65286-f806-4579-b98a-6a88a6dc8839" containerName="barbican-api" containerID="cri-o://79c834c82ba85613ce9c953f0e3876400f0d14d0c1a29b707525256cb9663d68" gracePeriod=30 Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.005193 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5fd47c645b-9wf6m" podUID="5b5f4367-359b-4633-80f9-0af5ac406aa4" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.157:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.157:8443: connect: connection refused" Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.006817 4903 generic.go:334] "Generic (PLEG): container finished" podID="c9f65286-f806-4579-b98a-6a88a6dc8839" containerID="06c700a4fc42d3fa44fbb7ca68913f255e39c5293e1806aeb8effea15d97b173" exitCode=143 Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.006877 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86bbdbcfd-94wnf" event={"ID":"c9f65286-f806-4579-b98a-6a88a6dc8839","Type":"ContainerDied","Data":"06c700a4fc42d3fa44fbb7ca68913f255e39c5293e1806aeb8effea15d97b173"} Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.017108 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.020567 4903 generic.go:334] "Generic (PLEG): container finished" podID="c35ad387-fed0-4cbc-9912-c17aab93860a" containerID="409181bfb5f67d8323510af2392b3d2db0db6d349695fbf7b79a36e6587af2c5" exitCode=0 Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.021284 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c35ad387-fed0-4cbc-9912-c17aab93860a","Type":"ContainerDied","Data":"409181bfb5f67d8323510af2392b3d2db0db6d349695fbf7b79a36e6587af2c5"} Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.021357 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c35ad387-fed0-4cbc-9912-c17aab93860a","Type":"ContainerDied","Data":"9719688f6d232c8c0923e688ee0723c6eec00b5c07cd9a49faf0dbd99a9564b2"} Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.021374 4903 scope.go:117] "RemoveContainer" containerID="b9b50eee02969239c4bea241d7e0e370e59d7dc985a050a93d269a3ea3f29d20" Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.062807 4903 scope.go:117] "RemoveContainer" containerID="409181bfb5f67d8323510af2392b3d2db0db6d349695fbf7b79a36e6587af2c5" Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.108855 4903 scope.go:117] "RemoveContainer" containerID="b9b50eee02969239c4bea241d7e0e370e59d7dc985a050a93d269a3ea3f29d20" Dec 02 23:17:45 crc kubenswrapper[4903]: E1202 23:17:45.109248 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9b50eee02969239c4bea241d7e0e370e59d7dc985a050a93d269a3ea3f29d20\": container with ID starting with b9b50eee02969239c4bea241d7e0e370e59d7dc985a050a93d269a3ea3f29d20 not found: ID does not exist" containerID="b9b50eee02969239c4bea241d7e0e370e59d7dc985a050a93d269a3ea3f29d20" Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.109279 4903 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9b50eee02969239c4bea241d7e0e370e59d7dc985a050a93d269a3ea3f29d20"} err="failed to get container status \"b9b50eee02969239c4bea241d7e0e370e59d7dc985a050a93d269a3ea3f29d20\": rpc error: code = NotFound desc = could not find container \"b9b50eee02969239c4bea241d7e0e370e59d7dc985a050a93d269a3ea3f29d20\": container with ID starting with b9b50eee02969239c4bea241d7e0e370e59d7dc985a050a93d269a3ea3f29d20 not found: ID does not exist" Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.109299 4903 scope.go:117] "RemoveContainer" containerID="409181bfb5f67d8323510af2392b3d2db0db6d349695fbf7b79a36e6587af2c5" Dec 02 23:17:45 crc kubenswrapper[4903]: E1202 23:17:45.109675 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"409181bfb5f67d8323510af2392b3d2db0db6d349695fbf7b79a36e6587af2c5\": container with ID starting with 409181bfb5f67d8323510af2392b3d2db0db6d349695fbf7b79a36e6587af2c5 not found: ID does not exist" containerID="409181bfb5f67d8323510af2392b3d2db0db6d349695fbf7b79a36e6587af2c5" Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.109696 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"409181bfb5f67d8323510af2392b3d2db0db6d349695fbf7b79a36e6587af2c5"} err="failed to get container status \"409181bfb5f67d8323510af2392b3d2db0db6d349695fbf7b79a36e6587af2c5\": rpc error: code = NotFound desc = could not find container \"409181bfb5f67d8323510af2392b3d2db0db6d349695fbf7b79a36e6587af2c5\": container with ID starting with 409181bfb5f67d8323510af2392b3d2db0db6d349695fbf7b79a36e6587af2c5 not found: ID does not exist" Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.209919 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c35ad387-fed0-4cbc-9912-c17aab93860a-etc-machine-id\") pod \"c35ad387-fed0-4cbc-9912-c17aab93860a\" (UID: \"c35ad387-fed0-4cbc-9912-c17aab93860a\") " Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.210014 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c35ad387-fed0-4cbc-9912-c17aab93860a-config-data\") pod \"c35ad387-fed0-4cbc-9912-c17aab93860a\" (UID: \"c35ad387-fed0-4cbc-9912-c17aab93860a\") " Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.210100 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c35ad387-fed0-4cbc-9912-c17aab93860a-config-data-custom\") pod \"c35ad387-fed0-4cbc-9912-c17aab93860a\" (UID: \"c35ad387-fed0-4cbc-9912-c17aab93860a\") " Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.210146 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrns8\" (UniqueName: \"kubernetes.io/projected/c35ad387-fed0-4cbc-9912-c17aab93860a-kube-api-access-mrns8\") pod \"c35ad387-fed0-4cbc-9912-c17aab93860a\" (UID: \"c35ad387-fed0-4cbc-9912-c17aab93860a\") " Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.210178 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c35ad387-fed0-4cbc-9912-c17aab93860a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c35ad387-fed0-4cbc-9912-c17aab93860a" (UID: "c35ad387-fed0-4cbc-9912-c17aab93860a"). 
InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.210230 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c35ad387-fed0-4cbc-9912-c17aab93860a-scripts\") pod \"c35ad387-fed0-4cbc-9912-c17aab93860a\" (UID: \"c35ad387-fed0-4cbc-9912-c17aab93860a\") " Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.210305 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c35ad387-fed0-4cbc-9912-c17aab93860a-combined-ca-bundle\") pod \"c35ad387-fed0-4cbc-9912-c17aab93860a\" (UID: \"c35ad387-fed0-4cbc-9912-c17aab93860a\") " Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.210773 4903 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c35ad387-fed0-4cbc-9912-c17aab93860a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.218799 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c35ad387-fed0-4cbc-9912-c17aab93860a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c35ad387-fed0-4cbc-9912-c17aab93860a" (UID: "c35ad387-fed0-4cbc-9912-c17aab93860a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.224300 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c35ad387-fed0-4cbc-9912-c17aab93860a-scripts" (OuterVolumeSpecName: "scripts") pod "c35ad387-fed0-4cbc-9912-c17aab93860a" (UID: "c35ad387-fed0-4cbc-9912-c17aab93860a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.243107 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c35ad387-fed0-4cbc-9912-c17aab93860a-kube-api-access-mrns8" (OuterVolumeSpecName: "kube-api-access-mrns8") pod "c35ad387-fed0-4cbc-9912-c17aab93860a" (UID: "c35ad387-fed0-4cbc-9912-c17aab93860a"). InnerVolumeSpecName "kube-api-access-mrns8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.313203 4903 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c35ad387-fed0-4cbc-9912-c17aab93860a-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.313244 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrns8\" (UniqueName: \"kubernetes.io/projected/c35ad387-fed0-4cbc-9912-c17aab93860a-kube-api-access-mrns8\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.313254 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c35ad387-fed0-4cbc-9912-c17aab93860a-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.338806 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c35ad387-fed0-4cbc-9912-c17aab93860a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c35ad387-fed0-4cbc-9912-c17aab93860a" (UID: "c35ad387-fed0-4cbc-9912-c17aab93860a"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.342820 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.343826 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.383326 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.387628 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.415609 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c35ad387-fed0-4cbc-9912-c17aab93860a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.473231 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c35ad387-fed0-4cbc-9912-c17aab93860a-config-data" (OuterVolumeSpecName: "config-data") pod "c35ad387-fed0-4cbc-9912-c17aab93860a" (UID: "c35ad387-fed0-4cbc-9912-c17aab93860a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.517397 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c35ad387-fed0-4cbc-9912-c17aab93860a-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.534810 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.534874 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.578290 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 02 23:17:45 crc kubenswrapper[4903]: I1202 23:17:45.594604 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.036003 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.036987 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.038590 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.038614 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.038625 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.063303 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.072631 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.090794 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 23:17:46 crc kubenswrapper[4903]: E1202 23:17:46.091168 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8af477b6-59d4-4909-9cb6-b9e61f75bd96" containerName="init" Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.091184 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="8af477b6-59d4-4909-9cb6-b9e61f75bd96" containerName="init" Dec 02 23:17:46 crc kubenswrapper[4903]: E1202 23:17:46.091196 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8" containerName="neutron-api" Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.091202 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8" containerName="neutron-api" Dec 02 23:17:46 crc kubenswrapper[4903]: E1202 23:17:46.091214 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c35ad387-fed0-4cbc-9912-c17aab93860a" containerName="cinder-scheduler" Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.091240 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="c35ad387-fed0-4cbc-9912-c17aab93860a" containerName="cinder-scheduler" Dec 02 23:17:46 crc kubenswrapper[4903]: E1202 23:17:46.091254 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8af477b6-59d4-4909-9cb6-b9e61f75bd96" containerName="dnsmasq-dns" Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.091260 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="8af477b6-59d4-4909-9cb6-b9e61f75bd96" containerName="dnsmasq-dns" Dec 02 23:17:46 crc kubenswrapper[4903]: E1202 23:17:46.091286 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8" containerName="neutron-httpd" Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.091294 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8" containerName="neutron-httpd" Dec 02 23:17:46 crc kubenswrapper[4903]: E1202 23:17:46.091313 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c35ad387-fed0-4cbc-9912-c17aab93860a" containerName="probe" Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.091318 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="c35ad387-fed0-4cbc-9912-c17aab93860a" containerName="probe" 
Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.091487 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="c35ad387-fed0-4cbc-9912-c17aab93860a" containerName="probe" Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.091500 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="c35ad387-fed0-4cbc-9912-c17aab93860a" containerName="cinder-scheduler" Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.091511 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="8af477b6-59d4-4909-9cb6-b9e61f75bd96" containerName="dnsmasq-dns" Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.091529 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8" containerName="neutron-httpd" Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.091543 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="b01b64d4-afb8-4cf8-b0e3-1ef2105e53d8" containerName="neutron-api" Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.092502 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.094717 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.105366 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.230707 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abeb15a2-9a82-49c1-bfdc-bc65cd1920f0-config-data\") pod \"cinder-scheduler-0\" (UID: \"abeb15a2-9a82-49c1-bfdc-bc65cd1920f0\") " pod="openstack/cinder-scheduler-0" Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.230959 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/abeb15a2-9a82-49c1-bfdc-bc65cd1920f0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"abeb15a2-9a82-49c1-bfdc-bc65cd1920f0\") " pod="openstack/cinder-scheduler-0" Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.230988 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abeb15a2-9a82-49c1-bfdc-bc65cd1920f0-scripts\") pod \"cinder-scheduler-0\" (UID: \"abeb15a2-9a82-49c1-bfdc-bc65cd1920f0\") " pod="openstack/cinder-scheduler-0" Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.231006 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmcsg\" (UniqueName: \"kubernetes.io/projected/abeb15a2-9a82-49c1-bfdc-bc65cd1920f0-kube-api-access-pmcsg\") pod \"cinder-scheduler-0\" (UID: \"abeb15a2-9a82-49c1-bfdc-bc65cd1920f0\") " pod="openstack/cinder-scheduler-0" Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.231076 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abeb15a2-9a82-49c1-bfdc-bc65cd1920f0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"abeb15a2-9a82-49c1-bfdc-bc65cd1920f0\") " pod="openstack/cinder-scheduler-0" Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.231106 4903 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/abeb15a2-9a82-49c1-bfdc-bc65cd1920f0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"abeb15a2-9a82-49c1-bfdc-bc65cd1920f0\") " pod="openstack/cinder-scheduler-0" Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.333342 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abeb15a2-9a82-49c1-bfdc-bc65cd1920f0-config-data\") pod \"cinder-scheduler-0\" (UID: \"abeb15a2-9a82-49c1-bfdc-bc65cd1920f0\") " pod="openstack/cinder-scheduler-0" Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.333407 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/abeb15a2-9a82-49c1-bfdc-bc65cd1920f0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"abeb15a2-9a82-49c1-bfdc-bc65cd1920f0\") " pod="openstack/cinder-scheduler-0" Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.333440 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abeb15a2-9a82-49c1-bfdc-bc65cd1920f0-scripts\") pod \"cinder-scheduler-0\" (UID: \"abeb15a2-9a82-49c1-bfdc-bc65cd1920f0\") " pod="openstack/cinder-scheduler-0" Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.333465 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmcsg\" (UniqueName: \"kubernetes.io/projected/abeb15a2-9a82-49c1-bfdc-bc65cd1920f0-kube-api-access-pmcsg\") pod \"cinder-scheduler-0\" (UID: \"abeb15a2-9a82-49c1-bfdc-bc65cd1920f0\") " pod="openstack/cinder-scheduler-0" Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.333565 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abeb15a2-9a82-49c1-bfdc-bc65cd1920f0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"abeb15a2-9a82-49c1-bfdc-bc65cd1920f0\") " pod="openstack/cinder-scheduler-0" Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.333607 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/abeb15a2-9a82-49c1-bfdc-bc65cd1920f0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"abeb15a2-9a82-49c1-bfdc-bc65cd1920f0\") " pod="openstack/cinder-scheduler-0" Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.333763 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/abeb15a2-9a82-49c1-bfdc-bc65cd1920f0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"abeb15a2-9a82-49c1-bfdc-bc65cd1920f0\") " pod="openstack/cinder-scheduler-0" Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.338140 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abeb15a2-9a82-49c1-bfdc-bc65cd1920f0-config-data\") pod \"cinder-scheduler-0\" (UID: \"abeb15a2-9a82-49c1-bfdc-bc65cd1920f0\") " pod="openstack/cinder-scheduler-0" Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.338929 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abeb15a2-9a82-49c1-bfdc-bc65cd1920f0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"abeb15a2-9a82-49c1-bfdc-bc65cd1920f0\") " 
pod="openstack/cinder-scheduler-0" Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.339112 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/abeb15a2-9a82-49c1-bfdc-bc65cd1920f0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"abeb15a2-9a82-49c1-bfdc-bc65cd1920f0\") " pod="openstack/cinder-scheduler-0" Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.342016 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abeb15a2-9a82-49c1-bfdc-bc65cd1920f0-scripts\") pod \"cinder-scheduler-0\" (UID: \"abeb15a2-9a82-49c1-bfdc-bc65cd1920f0\") " pod="openstack/cinder-scheduler-0" Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.353210 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmcsg\" (UniqueName: \"kubernetes.io/projected/abeb15a2-9a82-49c1-bfdc-bc65cd1920f0-kube-api-access-pmcsg\") pod \"cinder-scheduler-0\" (UID: \"abeb15a2-9a82-49c1-bfdc-bc65cd1920f0\") " pod="openstack/cinder-scheduler-0" Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.415808 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 23:17:46 crc kubenswrapper[4903]: I1202 23:17:46.936055 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-86bbdbcfd-94wnf" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.010434 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.049401 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c9f65286-f806-4579-b98a-6a88a6dc8839-config-data-custom\") pod \"c9f65286-f806-4579-b98a-6a88a6dc8839\" (UID: \"c9f65286-f806-4579-b98a-6a88a6dc8839\") " Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.050493 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9f65286-f806-4579-b98a-6a88a6dc8839-config-data\") pod \"c9f65286-f806-4579-b98a-6a88a6dc8839\" (UID: \"c9f65286-f806-4579-b98a-6a88a6dc8839\") " Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.050890 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9f65286-f806-4579-b98a-6a88a6dc8839-combined-ca-bundle\") pod \"c9f65286-f806-4579-b98a-6a88a6dc8839\" (UID: \"c9f65286-f806-4579-b98a-6a88a6dc8839\") " Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.051092 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msqr7\" (UniqueName: \"kubernetes.io/projected/c9f65286-f806-4579-b98a-6a88a6dc8839-kube-api-access-msqr7\") pod \"c9f65286-f806-4579-b98a-6a88a6dc8839\" (UID: \"c9f65286-f806-4579-b98a-6a88a6dc8839\") " Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.051358 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9f65286-f806-4579-b98a-6a88a6dc8839-logs\") pod \"c9f65286-f806-4579-b98a-6a88a6dc8839\" (UID: \"c9f65286-f806-4579-b98a-6a88a6dc8839\") " Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.051972 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/c9f65286-f806-4579-b98a-6a88a6dc8839-logs" (OuterVolumeSpecName: "logs") pod "c9f65286-f806-4579-b98a-6a88a6dc8839" (UID: "c9f65286-f806-4579-b98a-6a88a6dc8839"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.055635 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"abeb15a2-9a82-49c1-bfdc-bc65cd1920f0","Type":"ContainerStarted","Data":"2e46b56e80ae988b85cb068985924f1576e15f91c9d21b8a5dae2531fbb5b3a6"} Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.057399 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9f65286-f806-4579-b98a-6a88a6dc8839-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c9f65286-f806-4579-b98a-6a88a6dc8839" (UID: "c9f65286-f806-4579-b98a-6a88a6dc8839"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.061156 4903 generic.go:334] "Generic (PLEG): container finished" podID="c9f65286-f806-4579-b98a-6a88a6dc8839" containerID="79c834c82ba85613ce9c953f0e3876400f0d14d0c1a29b707525256cb9663d68" exitCode=0 Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.061373 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-86bbdbcfd-94wnf" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.061527 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86bbdbcfd-94wnf" event={"ID":"c9f65286-f806-4579-b98a-6a88a6dc8839","Type":"ContainerDied","Data":"79c834c82ba85613ce9c953f0e3876400f0d14d0c1a29b707525256cb9663d68"} Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.061580 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86bbdbcfd-94wnf" event={"ID":"c9f65286-f806-4579-b98a-6a88a6dc8839","Type":"ContainerDied","Data":"6ba8e20791a0ae9152c8f20f7c8fc9681a9d41da237dc1b7e3439ed320afbc9a"} Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.061603 4903 scope.go:117] "RemoveContainer" containerID="79c834c82ba85613ce9c953f0e3876400f0d14d0c1a29b707525256cb9663d68" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.066812 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9f65286-f806-4579-b98a-6a88a6dc8839-kube-api-access-msqr7" (OuterVolumeSpecName: "kube-api-access-msqr7") pod "c9f65286-f806-4579-b98a-6a88a6dc8839" (UID: "c9f65286-f806-4579-b98a-6a88a6dc8839"). InnerVolumeSpecName "kube-api-access-msqr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.133334 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9f65286-f806-4579-b98a-6a88a6dc8839-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9f65286-f806-4579-b98a-6a88a6dc8839" (UID: "c9f65286-f806-4579-b98a-6a88a6dc8839"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.136673 4903 scope.go:117] "RemoveContainer" containerID="06c700a4fc42d3fa44fbb7ca68913f255e39c5293e1806aeb8effea15d97b173" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.153445 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msqr7\" (UniqueName: \"kubernetes.io/projected/c9f65286-f806-4579-b98a-6a88a6dc8839-kube-api-access-msqr7\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.153474 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9f65286-f806-4579-b98a-6a88a6dc8839-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.153483 4903 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c9f65286-f806-4579-b98a-6a88a6dc8839-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.153492 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9f65286-f806-4579-b98a-6a88a6dc8839-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.154139 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9f65286-f806-4579-b98a-6a88a6dc8839-config-data" (OuterVolumeSpecName: "config-data") pod "c9f65286-f806-4579-b98a-6a88a6dc8839" (UID: "c9f65286-f806-4579-b98a-6a88a6dc8839"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.176025 4903 scope.go:117] "RemoveContainer" containerID="79c834c82ba85613ce9c953f0e3876400f0d14d0c1a29b707525256cb9663d68" Dec 02 23:17:47 crc kubenswrapper[4903]: E1202 23:17:47.178241 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79c834c82ba85613ce9c953f0e3876400f0d14d0c1a29b707525256cb9663d68\": container with ID starting with 79c834c82ba85613ce9c953f0e3876400f0d14d0c1a29b707525256cb9663d68 not found: ID does not exist" containerID="79c834c82ba85613ce9c953f0e3876400f0d14d0c1a29b707525256cb9663d68" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.178330 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79c834c82ba85613ce9c953f0e3876400f0d14d0c1a29b707525256cb9663d68"} err="failed to get container status \"79c834c82ba85613ce9c953f0e3876400f0d14d0c1a29b707525256cb9663d68\": rpc error: code = NotFound desc = could not find container \"79c834c82ba85613ce9c953f0e3876400f0d14d0c1a29b707525256cb9663d68\": container with ID starting with 79c834c82ba85613ce9c953f0e3876400f0d14d0c1a29b707525256cb9663d68 not found: ID does not exist" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.178402 4903 scope.go:117] "RemoveContainer" containerID="06c700a4fc42d3fa44fbb7ca68913f255e39c5293e1806aeb8effea15d97b173" Dec 02 23:17:47 crc kubenswrapper[4903]: E1202 23:17:47.179590 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06c700a4fc42d3fa44fbb7ca68913f255e39c5293e1806aeb8effea15d97b173\": container with ID starting with 06c700a4fc42d3fa44fbb7ca68913f255e39c5293e1806aeb8effea15d97b173 not found: ID does not exist" 
containerID="06c700a4fc42d3fa44fbb7ca68913f255e39c5293e1806aeb8effea15d97b173" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.179633 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06c700a4fc42d3fa44fbb7ca68913f255e39c5293e1806aeb8effea15d97b173"} err="failed to get container status \"06c700a4fc42d3fa44fbb7ca68913f255e39c5293e1806aeb8effea15d97b173\": rpc error: code = NotFound desc = could not find container \"06c700a4fc42d3fa44fbb7ca68913f255e39c5293e1806aeb8effea15d97b173\": container with ID starting with 06c700a4fc42d3fa44fbb7ca68913f255e39c5293e1806aeb8effea15d97b173 not found: ID does not exist" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.255376 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9f65286-f806-4579-b98a-6a88a6dc8839-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.416847 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-86bbdbcfd-94wnf"] Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.425525 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-86bbdbcfd-94wnf"] Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.638846 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c35ad387-fed0-4cbc-9912-c17aab93860a" path="/var/lib/kubelet/pods/c35ad387-fed0-4cbc-9912-c17aab93860a/volumes" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.639728 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9f65286-f806-4579-b98a-6a88a6dc8839" path="/var/lib/kubelet/pods/c9f65286-f806-4579-b98a-6a88a6dc8839/volumes" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.718547 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 02 23:17:47 crc kubenswrapper[4903]: E1202 23:17:47.718929 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9f65286-f806-4579-b98a-6a88a6dc8839" containerName="barbican-api-log" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.718947 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9f65286-f806-4579-b98a-6a88a6dc8839" containerName="barbican-api-log" Dec 02 23:17:47 crc kubenswrapper[4903]: E1202 23:17:47.718967 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9f65286-f806-4579-b98a-6a88a6dc8839" containerName="barbican-api" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.718974 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9f65286-f806-4579-b98a-6a88a6dc8839" containerName="barbican-api" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.719191 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9f65286-f806-4579-b98a-6a88a6dc8839" containerName="barbican-api" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.719216 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9f65286-f806-4579-b98a-6a88a6dc8839" containerName="barbican-api-log" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.720004 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.723477 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.723537 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-rxtdp" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.723640 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.756797 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.770645 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/01e0132f-dfe4-4d3a-9a72-b38b77521ada-openstack-config-secret\") pod \"openstackclient\" (UID: \"01e0132f-dfe4-4d3a-9a72-b38b77521ada\") " pod="openstack/openstackclient" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.770865 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb75z\" (UniqueName: \"kubernetes.io/projected/01e0132f-dfe4-4d3a-9a72-b38b77521ada-kube-api-access-nb75z\") pod \"openstackclient\" (UID: \"01e0132f-dfe4-4d3a-9a72-b38b77521ada\") " pod="openstack/openstackclient" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.770940 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/01e0132f-dfe4-4d3a-9a72-b38b77521ada-openstack-config\") pod \"openstackclient\" (UID: \"01e0132f-dfe4-4d3a-9a72-b38b77521ada\") " pod="openstack/openstackclient" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.770994 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01e0132f-dfe4-4d3a-9a72-b38b77521ada-combined-ca-bundle\") pod \"openstackclient\" (UID: \"01e0132f-dfe4-4d3a-9a72-b38b77521ada\") " pod="openstack/openstackclient" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.807317 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.876741 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb75z\" (UniqueName: \"kubernetes.io/projected/01e0132f-dfe4-4d3a-9a72-b38b77521ada-kube-api-access-nb75z\") pod \"openstackclient\" (UID: \"01e0132f-dfe4-4d3a-9a72-b38b77521ada\") " pod="openstack/openstackclient" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.877097 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/01e0132f-dfe4-4d3a-9a72-b38b77521ada-openstack-config\") pod \"openstackclient\" (UID: \"01e0132f-dfe4-4d3a-9a72-b38b77521ada\") " pod="openstack/openstackclient" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.877125 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01e0132f-dfe4-4d3a-9a72-b38b77521ada-combined-ca-bundle\") pod \"openstackclient\" (UID: \"01e0132f-dfe4-4d3a-9a72-b38b77521ada\") " 
pod="openstack/openstackclient" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.877185 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/01e0132f-dfe4-4d3a-9a72-b38b77521ada-openstack-config-secret\") pod \"openstackclient\" (UID: \"01e0132f-dfe4-4d3a-9a72-b38b77521ada\") " pod="openstack/openstackclient" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.886907 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/01e0132f-dfe4-4d3a-9a72-b38b77521ada-openstack-config\") pod \"openstackclient\" (UID: \"01e0132f-dfe4-4d3a-9a72-b38b77521ada\") " pod="openstack/openstackclient" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.900453 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/01e0132f-dfe4-4d3a-9a72-b38b77521ada-openstack-config-secret\") pod \"openstackclient\" (UID: \"01e0132f-dfe4-4d3a-9a72-b38b77521ada\") " pod="openstack/openstackclient" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.900489 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01e0132f-dfe4-4d3a-9a72-b38b77521ada-combined-ca-bundle\") pod \"openstackclient\" (UID: \"01e0132f-dfe4-4d3a-9a72-b38b77521ada\") " pod="openstack/openstackclient" Dec 02 23:17:47 crc kubenswrapper[4903]: I1202 23:17:47.904340 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb75z\" (UniqueName: \"kubernetes.io/projected/01e0132f-dfe4-4d3a-9a72-b38b77521ada-kube-api-access-nb75z\") pod \"openstackclient\" (UID: \"01e0132f-dfe4-4d3a-9a72-b38b77521ada\") " pod="openstack/openstackclient" Dec 02 23:17:48 crc kubenswrapper[4903]: I1202 23:17:48.044065 4903 util.go:30] "No sandbox for pod can be found. 
Dec 02 23:17:48 crc kubenswrapper[4903]: I1202 23:17:48.098922 4903 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 02 23:17:48 crc kubenswrapper[4903]: I1202 23:17:48.098956 4903 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 02 23:17:48 crc kubenswrapper[4903]: I1202 23:17:48.100031 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"abeb15a2-9a82-49c1-bfdc-bc65cd1920f0","Type":"ContainerStarted","Data":"a94d6d87986849a4834d4a6f0894872ccb7cfbbcd0c03f51b1f6cc09d71ced40"}
Dec 02 23:17:48 crc kubenswrapper[4903]: I1202 23:17:48.100100 4903 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 02 23:17:48 crc kubenswrapper[4903]: I1202 23:17:48.100112 4903 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 02 23:17:48 crc kubenswrapper[4903]: I1202 23:17:48.587878 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Dec 02 23:17:48 crc kubenswrapper[4903]: W1202 23:17:48.601410 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01e0132f_dfe4_4d3a_9a72_b38b77521ada.slice/crio-bd83bcc1169a43d50577a791227832d1b251798f2046b6c11db65f029d0f871d WatchSource:0}: Error finding container bd83bcc1169a43d50577a791227832d1b251798f2046b6c11db65f029d0f871d: Status 404 returned error can't find the container with id bd83bcc1169a43d50577a791227832d1b251798f2046b6c11db65f029d0f871d
Dec 02 23:17:48 crc kubenswrapper[4903]: I1202 23:17:48.942376 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5489fffdb5-zmhmz"]
Dec 02 23:17:48 crc kubenswrapper[4903]: I1202 23:17:48.946361 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5489fffdb5-zmhmz"
Dec 02 23:17:48 crc kubenswrapper[4903]: I1202 23:17:48.957069 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc"
Dec 02 23:17:48 crc kubenswrapper[4903]: I1202 23:17:48.957269 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc"
Dec 02 23:17:48 crc kubenswrapper[4903]: I1202 23:17:48.957392 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Dec 02 23:17:48 crc kubenswrapper[4903]: I1202 23:17:48.963368 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5489fffdb5-zmhmz"]
Dec 02 23:17:48 crc kubenswrapper[4903]: I1202 23:17:48.967958 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Dec 02 23:17:48 crc kubenswrapper[4903]: I1202 23:17:48.968000 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Dec 02 23:17:48 crc kubenswrapper[4903]: I1202 23:17:48.968736 4903 scope.go:117] "RemoveContainer" containerID="ba4ae74d9752d870e35c06dc4ab2dd97ed3bdb486401b2bdaed152da3cf06005"
Dec 02 23:17:48 crc kubenswrapper[4903]: E1202 23:17:48.968964 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(6de4a117-0c91-47f4-a80d-278debb3ea60)\"" pod="openstack/watcher-decision-engine-0" podUID="6de4a117-0c91-47f4-a80d-278debb3ea60"
Dec 02 23:17:49 crc kubenswrapper[4903]: I1202 23:17:49.011526 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0679a7f8-6bae-4619-b633-ae583358eda7-etc-swift\") pod \"swift-proxy-5489fffdb5-zmhmz\" (UID: \"0679a7f8-6bae-4619-b633-ae583358eda7\") " pod="openstack/swift-proxy-5489fffdb5-zmhmz"
Dec 02 23:17:49 crc kubenswrapper[4903]: I1202 23:17:49.011670 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0679a7f8-6bae-4619-b633-ae583358eda7-internal-tls-certs\") pod \"swift-proxy-5489fffdb5-zmhmz\" (UID: \"0679a7f8-6bae-4619-b633-ae583358eda7\") " pod="openstack/swift-proxy-5489fffdb5-zmhmz"
Dec 02 23:17:49 crc kubenswrapper[4903]: I1202 23:17:49.011696 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqhm7\" (UniqueName: \"kubernetes.io/projected/0679a7f8-6bae-4619-b633-ae583358eda7-kube-api-access-tqhm7\") pod \"swift-proxy-5489fffdb5-zmhmz\" (UID: \"0679a7f8-6bae-4619-b633-ae583358eda7\") " pod="openstack/swift-proxy-5489fffdb5-zmhmz"
Dec 02 23:17:49 crc kubenswrapper[4903]: I1202 23:17:49.011713 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0679a7f8-6bae-4619-b633-ae583358eda7-combined-ca-bundle\") pod \"swift-proxy-5489fffdb5-zmhmz\" (UID: \"0679a7f8-6bae-4619-b633-ae583358eda7\") " pod="openstack/swift-proxy-5489fffdb5-zmhmz"
Dec 02 23:17:49 crc kubenswrapper[4903]: I1202 23:17:49.011769 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0679a7f8-6bae-4619-b633-ae583358eda7-config-data\") pod \"swift-proxy-5489fffdb5-zmhmz\" (UID: \"0679a7f8-6bae-4619-b633-ae583358eda7\") " pod="openstack/swift-proxy-5489fffdb5-zmhmz"
Dec 02 23:17:49 crc kubenswrapper[4903]: I1202 23:17:49.011826 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0679a7f8-6bae-4619-b633-ae583358eda7-log-httpd\") pod \"swift-proxy-5489fffdb5-zmhmz\" (UID: \"0679a7f8-6bae-4619-b633-ae583358eda7\") " pod="openstack/swift-proxy-5489fffdb5-zmhmz"
Dec 02 23:17:49 crc kubenswrapper[4903]: I1202 23:17:49.011845 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0679a7f8-6bae-4619-b633-ae583358eda7-run-httpd\") pod \"swift-proxy-5489fffdb5-zmhmz\" (UID: \"0679a7f8-6bae-4619-b633-ae583358eda7\") " pod="openstack/swift-proxy-5489fffdb5-zmhmz"
Dec 02 23:17:49 crc kubenswrapper[4903]: I1202 23:17:49.011864 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0679a7f8-6bae-4619-b633-ae583358eda7-public-tls-certs\") pod \"swift-proxy-5489fffdb5-zmhmz\" (UID: \"0679a7f8-6bae-4619-b633-ae583358eda7\") " pod="openstack/swift-proxy-5489fffdb5-zmhmz"
Dec 02 23:17:49 crc kubenswrapper[4903]: I1202 23:17:49.098375 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Dec 02 23:17:49 crc kubenswrapper[4903]: I1202 23:17:49.114634 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0679a7f8-6bae-4619-b633-ae583358eda7-internal-tls-certs\") pod \"swift-proxy-5489fffdb5-zmhmz\" (UID: \"0679a7f8-6bae-4619-b633-ae583358eda7\") " pod="openstack/swift-proxy-5489fffdb5-zmhmz"
Dec 02 23:17:49 crc kubenswrapper[4903]: I1202 23:17:49.114701 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqhm7\" (UniqueName: \"kubernetes.io/projected/0679a7f8-6bae-4619-b633-ae583358eda7-kube-api-access-tqhm7\") pod \"swift-proxy-5489fffdb5-zmhmz\" (UID: \"0679a7f8-6bae-4619-b633-ae583358eda7\") " pod="openstack/swift-proxy-5489fffdb5-zmhmz"
Dec 02 23:17:49 crc kubenswrapper[4903]: I1202 23:17:49.114723 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0679a7f8-6bae-4619-b633-ae583358eda7-combined-ca-bundle\") pod \"swift-proxy-5489fffdb5-zmhmz\" (UID: \"0679a7f8-6bae-4619-b633-ae583358eda7\") " pod="openstack/swift-proxy-5489fffdb5-zmhmz"
Dec 02 23:17:49 crc kubenswrapper[4903]: I1202 23:17:49.114782 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0679a7f8-6bae-4619-b633-ae583358eda7-config-data\") pod \"swift-proxy-5489fffdb5-zmhmz\" (UID: \"0679a7f8-6bae-4619-b633-ae583358eda7\") " pod="openstack/swift-proxy-5489fffdb5-zmhmz"
Dec 02 23:17:49 crc kubenswrapper[4903]: I1202 23:17:49.114820 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0679a7f8-6bae-4619-b633-ae583358eda7-log-httpd\") pod \"swift-proxy-5489fffdb5-zmhmz\" (UID: \"0679a7f8-6bae-4619-b633-ae583358eda7\") " pod="openstack/swift-proxy-5489fffdb5-zmhmz"
pod="openstack/swift-proxy-5489fffdb5-zmhmz" Dec 02 23:17:49 crc kubenswrapper[4903]: I1202 23:17:49.114842 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0679a7f8-6bae-4619-b633-ae583358eda7-run-httpd\") pod \"swift-proxy-5489fffdb5-zmhmz\" (UID: \"0679a7f8-6bae-4619-b633-ae583358eda7\") " pod="openstack/swift-proxy-5489fffdb5-zmhmz" Dec 02 23:17:49 crc kubenswrapper[4903]: I1202 23:17:49.114861 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0679a7f8-6bae-4619-b633-ae583358eda7-public-tls-certs\") pod \"swift-proxy-5489fffdb5-zmhmz\" (UID: \"0679a7f8-6bae-4619-b633-ae583358eda7\") " pod="openstack/swift-proxy-5489fffdb5-zmhmz" Dec 02 23:17:49 crc kubenswrapper[4903]: I1202 23:17:49.114905 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0679a7f8-6bae-4619-b633-ae583358eda7-etc-swift\") pod \"swift-proxy-5489fffdb5-zmhmz\" (UID: \"0679a7f8-6bae-4619-b633-ae583358eda7\") " pod="openstack/swift-proxy-5489fffdb5-zmhmz" Dec 02 23:17:49 crc kubenswrapper[4903]: I1202 23:17:49.117256 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0679a7f8-6bae-4619-b633-ae583358eda7-run-httpd\") pod \"swift-proxy-5489fffdb5-zmhmz\" (UID: \"0679a7f8-6bae-4619-b633-ae583358eda7\") " pod="openstack/swift-proxy-5489fffdb5-zmhmz" Dec 02 23:17:49 crc kubenswrapper[4903]: I1202 23:17:49.117479 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0679a7f8-6bae-4619-b633-ae583358eda7-log-httpd\") pod \"swift-proxy-5489fffdb5-zmhmz\" (UID: \"0679a7f8-6bae-4619-b633-ae583358eda7\") " pod="openstack/swift-proxy-5489fffdb5-zmhmz" Dec 02 23:17:49 crc kubenswrapper[4903]: I1202 23:17:49.122464 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"abeb15a2-9a82-49c1-bfdc-bc65cd1920f0","Type":"ContainerStarted","Data":"bcae3887d25b56a85ec9c2302f968baee5682ece9cbeb099aa444684551a4e2d"} Dec 02 23:17:49 crc kubenswrapper[4903]: I1202 23:17:49.127143 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0679a7f8-6bae-4619-b633-ae583358eda7-etc-swift\") pod \"swift-proxy-5489fffdb5-zmhmz\" (UID: \"0679a7f8-6bae-4619-b633-ae583358eda7\") " pod="openstack/swift-proxy-5489fffdb5-zmhmz" Dec 02 23:17:49 crc kubenswrapper[4903]: I1202 23:17:49.127578 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0679a7f8-6bae-4619-b633-ae583358eda7-internal-tls-certs\") pod \"swift-proxy-5489fffdb5-zmhmz\" (UID: \"0679a7f8-6bae-4619-b633-ae583358eda7\") " pod="openstack/swift-proxy-5489fffdb5-zmhmz" Dec 02 23:17:49 crc kubenswrapper[4903]: I1202 23:17:49.130295 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0679a7f8-6bae-4619-b633-ae583358eda7-combined-ca-bundle\") pod \"swift-proxy-5489fffdb5-zmhmz\" (UID: \"0679a7f8-6bae-4619-b633-ae583358eda7\") " pod="openstack/swift-proxy-5489fffdb5-zmhmz" Dec 02 23:17:49 crc kubenswrapper[4903]: I1202 23:17:49.130403 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0679a7f8-6bae-4619-b633-ae583358eda7-config-data\") pod \"swift-proxy-5489fffdb5-zmhmz\" (UID: \"0679a7f8-6bae-4619-b633-ae583358eda7\") " pod="openstack/swift-proxy-5489fffdb5-zmhmz" Dec 02 23:17:49 crc kubenswrapper[4903]: I1202 23:17:49.133078 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0679a7f8-6bae-4619-b633-ae583358eda7-public-tls-certs\") pod \"swift-proxy-5489fffdb5-zmhmz\" (UID: \"0679a7f8-6bae-4619-b633-ae583358eda7\") " pod="openstack/swift-proxy-5489fffdb5-zmhmz" Dec 02 23:17:49 crc kubenswrapper[4903]: I1202 23:17:49.135908 4903 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 23:17:49 crc kubenswrapper[4903]: I1202 23:17:49.135881 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"01e0132f-dfe4-4d3a-9a72-b38b77521ada","Type":"ContainerStarted","Data":"bd83bcc1169a43d50577a791227832d1b251798f2046b6c11db65f029d0f871d"} Dec 02 23:17:49 crc kubenswrapper[4903]: I1202 23:17:49.140618 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqhm7\" (UniqueName: \"kubernetes.io/projected/0679a7f8-6bae-4619-b633-ae583358eda7-kube-api-access-tqhm7\") pod \"swift-proxy-5489fffdb5-zmhmz\" (UID: \"0679a7f8-6bae-4619-b633-ae583358eda7\") " pod="openstack/swift-proxy-5489fffdb5-zmhmz" Dec 02 23:17:49 crc kubenswrapper[4903]: I1202 23:17:49.151770 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 02 23:17:49 crc kubenswrapper[4903]: I1202 23:17:49.245848 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.245828277 podStartE2EDuration="3.245828277s" podCreationTimestamp="2025-12-02 23:17:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:17:49.238085616 +0000 UTC m=+1207.946639899" watchObservedRunningTime="2025-12-02 23:17:49.245828277 +0000 UTC m=+1207.954382560" Dec 02 23:17:49 crc kubenswrapper[4903]: I1202 23:17:49.277737 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5489fffdb5-zmhmz" Dec 02 23:17:49 crc kubenswrapper[4903]: I1202 23:17:49.540937 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 02 23:17:49 crc kubenswrapper[4903]: I1202 23:17:49.541039 4903 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 23:17:49 crc kubenswrapper[4903]: I1202 23:17:49.549337 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 02 23:17:50 crc kubenswrapper[4903]: I1202 23:17:49.981013 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5489fffdb5-zmhmz"] Dec 02 23:17:50 crc kubenswrapper[4903]: I1202 23:17:50.074793 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:17:50 crc kubenswrapper[4903]: I1202 23:17:50.075311 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a0cf0be-7840-4684-ada4-16ea2d6351f3" containerName="ceilometer-central-agent" containerID="cri-o://b2956aa78e92818d64de5d67f0c0517f302aaf63f55ca8762f89d1a5d077363b" gracePeriod=30 Dec 02 23:17:50 crc kubenswrapper[4903]: I1202 23:17:50.075883 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a0cf0be-7840-4684-ada4-16ea2d6351f3" containerName="sg-core" containerID="cri-o://2565f28b6d412237a204f7557b2e587c0e20d98299a180ec1bb42b6b1a38f5b7" gracePeriod=30 Dec 02 23:17:50 crc kubenswrapper[4903]: I1202 23:17:50.075949 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a0cf0be-7840-4684-ada4-16ea2d6351f3" containerName="proxy-httpd" containerID="cri-o://474802f5bfe15bdd3bcdee72266b8056787e6dcd0042418f3cc91411370f7e1a" gracePeriod=30 Dec 02 23:17:50 crc kubenswrapper[4903]: I1202 23:17:50.075999 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a0cf0be-7840-4684-ada4-16ea2d6351f3" containerName="ceilometer-notification-agent" containerID="cri-o://603ea21ba5e80b2cea12022574c1b146204de9c1865f910639462d68e5c880f5" gracePeriod=30 Dec 02 23:17:50 crc kubenswrapper[4903]: I1202 23:17:50.166545 4903 generic.go:334] "Generic (PLEG): container finished" podID="5b5f4367-359b-4633-80f9-0af5ac406aa4" containerID="ec86184f533730ef740f956a3a55cdef5876deb56101395eba447b01c1c43ce9" exitCode=137 Dec 02 23:17:50 crc kubenswrapper[4903]: I1202 23:17:50.166618 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fd47c645b-9wf6m" event={"ID":"5b5f4367-359b-4633-80f9-0af5ac406aa4","Type":"ContainerDied","Data":"ec86184f533730ef740f956a3a55cdef5876deb56101395eba447b01c1c43ce9"} Dec 02 23:17:50 crc kubenswrapper[4903]: I1202 23:17:50.166642 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fd47c645b-9wf6m" event={"ID":"5b5f4367-359b-4633-80f9-0af5ac406aa4","Type":"ContainerDied","Data":"a79d667890f2549c62842ff723ef2d42ac1b121ce0e74d30f4d1ca66e7226567"} Dec 02 23:17:50 crc kubenswrapper[4903]: I1202 23:17:50.166764 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a79d667890f2549c62842ff723ef2d42ac1b121ce0e74d30f4d1ca66e7226567" Dec 02 23:17:50 crc kubenswrapper[4903]: I1202 23:17:50.180159 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5489fffdb5-zmhmz" 
event={"ID":"0679a7f8-6bae-4619-b633-ae583358eda7","Type":"ContainerStarted","Data":"b96321a35ce02592a7e94aee335d76e9d05a681a0efa51f7e54b2a1d0f57cdfc"} Dec 02 23:17:50 crc kubenswrapper[4903]: I1202 23:17:50.202746 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="8a0cf0be-7840-4684-ada4-16ea2d6351f3" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.173:3000/\": read tcp 10.217.0.2:36094->10.217.0.173:3000: read: connection reset by peer" Dec 02 23:17:50 crc kubenswrapper[4903]: I1202 23:17:50.238696 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5fd47c645b-9wf6m" Dec 02 23:17:50 crc kubenswrapper[4903]: I1202 23:17:50.347238 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5b5f4367-359b-4633-80f9-0af5ac406aa4-scripts\") pod \"5b5f4367-359b-4633-80f9-0af5ac406aa4\" (UID: \"5b5f4367-359b-4633-80f9-0af5ac406aa4\") " Dec 02 23:17:50 crc kubenswrapper[4903]: I1202 23:17:50.347542 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b5f4367-359b-4633-80f9-0af5ac406aa4-combined-ca-bundle\") pod \"5b5f4367-359b-4633-80f9-0af5ac406aa4\" (UID: \"5b5f4367-359b-4633-80f9-0af5ac406aa4\") " Dec 02 23:17:50 crc kubenswrapper[4903]: I1202 23:17:50.347724 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5b5f4367-359b-4633-80f9-0af5ac406aa4-horizon-secret-key\") pod \"5b5f4367-359b-4633-80f9-0af5ac406aa4\" (UID: \"5b5f4367-359b-4633-80f9-0af5ac406aa4\") " Dec 02 23:17:50 crc kubenswrapper[4903]: I1202 23:17:50.347776 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lhl6\" (UniqueName: \"kubernetes.io/projected/5b5f4367-359b-4633-80f9-0af5ac406aa4-kube-api-access-4lhl6\") pod \"5b5f4367-359b-4633-80f9-0af5ac406aa4\" (UID: \"5b5f4367-359b-4633-80f9-0af5ac406aa4\") " Dec 02 23:17:50 crc kubenswrapper[4903]: I1202 23:17:50.347827 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5b5f4367-359b-4633-80f9-0af5ac406aa4-config-data\") pod \"5b5f4367-359b-4633-80f9-0af5ac406aa4\" (UID: \"5b5f4367-359b-4633-80f9-0af5ac406aa4\") " Dec 02 23:17:50 crc kubenswrapper[4903]: I1202 23:17:50.347861 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b5f4367-359b-4633-80f9-0af5ac406aa4-logs\") pod \"5b5f4367-359b-4633-80f9-0af5ac406aa4\" (UID: \"5b5f4367-359b-4633-80f9-0af5ac406aa4\") " Dec 02 23:17:50 crc kubenswrapper[4903]: I1202 23:17:50.347916 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b5f4367-359b-4633-80f9-0af5ac406aa4-horizon-tls-certs\") pod \"5b5f4367-359b-4633-80f9-0af5ac406aa4\" (UID: \"5b5f4367-359b-4633-80f9-0af5ac406aa4\") " Dec 02 23:17:50 crc kubenswrapper[4903]: I1202 23:17:50.355936 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b5f4367-359b-4633-80f9-0af5ac406aa4-logs" (OuterVolumeSpecName: "logs") pod "5b5f4367-359b-4633-80f9-0af5ac406aa4" (UID: "5b5f4367-359b-4633-80f9-0af5ac406aa4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:17:50 crc kubenswrapper[4903]: I1202 23:17:50.363845 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b5f4367-359b-4633-80f9-0af5ac406aa4-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "5b5f4367-359b-4633-80f9-0af5ac406aa4" (UID: "5b5f4367-359b-4633-80f9-0af5ac406aa4"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:50 crc kubenswrapper[4903]: I1202 23:17:50.365869 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b5f4367-359b-4633-80f9-0af5ac406aa4-kube-api-access-4lhl6" (OuterVolumeSpecName: "kube-api-access-4lhl6") pod "5b5f4367-359b-4633-80f9-0af5ac406aa4" (UID: "5b5f4367-359b-4633-80f9-0af5ac406aa4"). InnerVolumeSpecName "kube-api-access-4lhl6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:17:50 crc kubenswrapper[4903]: I1202 23:17:50.378598 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b5f4367-359b-4633-80f9-0af5ac406aa4-scripts" (OuterVolumeSpecName: "scripts") pod "5b5f4367-359b-4633-80f9-0af5ac406aa4" (UID: "5b5f4367-359b-4633-80f9-0af5ac406aa4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:17:50 crc kubenswrapper[4903]: I1202 23:17:50.385867 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b5f4367-359b-4633-80f9-0af5ac406aa4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b5f4367-359b-4633-80f9-0af5ac406aa4" (UID: "5b5f4367-359b-4633-80f9-0af5ac406aa4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:50 crc kubenswrapper[4903]: I1202 23:17:50.411788 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b5f4367-359b-4633-80f9-0af5ac406aa4-config-data" (OuterVolumeSpecName: "config-data") pod "5b5f4367-359b-4633-80f9-0af5ac406aa4" (UID: "5b5f4367-359b-4633-80f9-0af5ac406aa4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:17:50 crc kubenswrapper[4903]: I1202 23:17:50.433879 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b5f4367-359b-4633-80f9-0af5ac406aa4-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "5b5f4367-359b-4633-80f9-0af5ac406aa4" (UID: "5b5f4367-359b-4633-80f9-0af5ac406aa4"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:50 crc kubenswrapper[4903]: I1202 23:17:50.450519 4903 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5b5f4367-359b-4633-80f9-0af5ac406aa4-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:50 crc kubenswrapper[4903]: I1202 23:17:50.450574 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lhl6\" (UniqueName: \"kubernetes.io/projected/5b5f4367-359b-4633-80f9-0af5ac406aa4-kube-api-access-4lhl6\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:50 crc kubenswrapper[4903]: I1202 23:17:50.450589 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5b5f4367-359b-4633-80f9-0af5ac406aa4-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:50 crc kubenswrapper[4903]: I1202 23:17:50.450601 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b5f4367-359b-4633-80f9-0af5ac406aa4-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:50 crc kubenswrapper[4903]: I1202 23:17:50.450614 4903 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b5f4367-359b-4633-80f9-0af5ac406aa4-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:50 crc kubenswrapper[4903]: I1202 23:17:50.450626 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5b5f4367-359b-4633-80f9-0af5ac406aa4-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:50 crc kubenswrapper[4903]: I1202 23:17:50.450637 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b5f4367-359b-4633-80f9-0af5ac406aa4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:51 crc kubenswrapper[4903]: I1202 23:17:51.198869 4903 generic.go:334] "Generic (PLEG): container finished" podID="8a0cf0be-7840-4684-ada4-16ea2d6351f3" containerID="474802f5bfe15bdd3bcdee72266b8056787e6dcd0042418f3cc91411370f7e1a" exitCode=0 Dec 02 23:17:51 crc kubenswrapper[4903]: I1202 23:17:51.199699 4903 generic.go:334] "Generic (PLEG): container finished" podID="8a0cf0be-7840-4684-ada4-16ea2d6351f3" containerID="2565f28b6d412237a204f7557b2e587c0e20d98299a180ec1bb42b6b1a38f5b7" exitCode=2 Dec 02 23:17:51 crc kubenswrapper[4903]: I1202 23:17:51.199713 4903 generic.go:334] "Generic (PLEG): container finished" podID="8a0cf0be-7840-4684-ada4-16ea2d6351f3" containerID="b2956aa78e92818d64de5d67f0c0517f302aaf63f55ca8762f89d1a5d077363b" exitCode=0 Dec 02 23:17:51 crc kubenswrapper[4903]: I1202 23:17:51.198905 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a0cf0be-7840-4684-ada4-16ea2d6351f3","Type":"ContainerDied","Data":"474802f5bfe15bdd3bcdee72266b8056787e6dcd0042418f3cc91411370f7e1a"} Dec 02 23:17:51 crc kubenswrapper[4903]: I1202 23:17:51.199781 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a0cf0be-7840-4684-ada4-16ea2d6351f3","Type":"ContainerDied","Data":"2565f28b6d412237a204f7557b2e587c0e20d98299a180ec1bb42b6b1a38f5b7"} Dec 02 23:17:51 crc kubenswrapper[4903]: I1202 23:17:51.199793 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"8a0cf0be-7840-4684-ada4-16ea2d6351f3","Type":"ContainerDied","Data":"b2956aa78e92818d64de5d67f0c0517f302aaf63f55ca8762f89d1a5d077363b"} Dec 02 23:17:51 crc kubenswrapper[4903]: I1202 23:17:51.203663 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5fd47c645b-9wf6m" Dec 02 23:17:51 crc kubenswrapper[4903]: I1202 23:17:51.203694 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5489fffdb5-zmhmz" event={"ID":"0679a7f8-6bae-4619-b633-ae583358eda7","Type":"ContainerStarted","Data":"c40ce7aa3a4ac017bd6c279d01931d3f94c9b54fef52792f0c1a7d34ab1ad446"} Dec 02 23:17:51 crc kubenswrapper[4903]: I1202 23:17:51.203712 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5489fffdb5-zmhmz" event={"ID":"0679a7f8-6bae-4619-b633-ae583358eda7","Type":"ContainerStarted","Data":"60c6a2a20aaf8f968b790ce85c891a26c8c9a7e29b7d1e339f228bcdb984bf6c"} Dec 02 23:17:51 crc kubenswrapper[4903]: I1202 23:17:51.203740 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5489fffdb5-zmhmz" Dec 02 23:17:51 crc kubenswrapper[4903]: I1202 23:17:51.203762 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5489fffdb5-zmhmz" Dec 02 23:17:51 crc kubenswrapper[4903]: I1202 23:17:51.224295 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5489fffdb5-zmhmz" podStartSLOduration=3.224280193 podStartE2EDuration="3.224280193s" podCreationTimestamp="2025-12-02 23:17:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:17:51.21946355 +0000 UTC m=+1209.928017833" watchObservedRunningTime="2025-12-02 23:17:51.224280193 +0000 UTC m=+1209.932834476" Dec 02 23:17:51 crc kubenswrapper[4903]: I1202 23:17:51.245542 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5fd47c645b-9wf6m"] Dec 02 23:17:51 crc kubenswrapper[4903]: I1202 23:17:51.257311 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5fd47c645b-9wf6m"] Dec 02 23:17:51 crc kubenswrapper[4903]: I1202 23:17:51.416027 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 02 23:17:51 crc kubenswrapper[4903]: I1202 23:17:51.625856 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b5f4367-359b-4633-80f9-0af5ac406aa4" path="/var/lib/kubelet/pods/5b5f4367-359b-4633-80f9-0af5ac406aa4/volumes" Dec 02 23:17:52 crc kubenswrapper[4903]: I1202 23:17:52.237258 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:17:52 crc kubenswrapper[4903]: I1202 23:17:52.238679 4903 generic.go:334] "Generic (PLEG): container finished" podID="8a0cf0be-7840-4684-ada4-16ea2d6351f3" containerID="603ea21ba5e80b2cea12022574c1b146204de9c1865f910639462d68e5c880f5" exitCode=0 Dec 02 23:17:52 crc kubenswrapper[4903]: I1202 23:17:52.238749 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a0cf0be-7840-4684-ada4-16ea2d6351f3","Type":"ContainerDied","Data":"603ea21ba5e80b2cea12022574c1b146204de9c1865f910639462d68e5c880f5"} Dec 02 23:17:52 crc kubenswrapper[4903]: I1202 23:17:52.238787 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a0cf0be-7840-4684-ada4-16ea2d6351f3","Type":"ContainerDied","Data":"779059428bd547a3eafc17465247cbc45952ca7ca3e2507e08d1480a6ec5ff78"} Dec 02 23:17:52 crc kubenswrapper[4903]: I1202 23:17:52.238804 4903 scope.go:117] "RemoveContainer" containerID="474802f5bfe15bdd3bcdee72266b8056787e6dcd0042418f3cc91411370f7e1a" Dec 02 23:17:52 crc kubenswrapper[4903]: I1202 23:17:52.294254 4903 scope.go:117] "RemoveContainer" containerID="2565f28b6d412237a204f7557b2e587c0e20d98299a180ec1bb42b6b1a38f5b7" Dec 02 23:17:52 crc kubenswrapper[4903]: I1202 23:17:52.342413 4903 scope.go:117] "RemoveContainer" containerID="603ea21ba5e80b2cea12022574c1b146204de9c1865f910639462d68e5c880f5" Dec 02 23:17:52 crc kubenswrapper[4903]: I1202 23:17:52.364024 4903 scope.go:117] "RemoveContainer" containerID="b2956aa78e92818d64de5d67f0c0517f302aaf63f55ca8762f89d1a5d077363b" Dec 02 23:17:52 crc kubenswrapper[4903]: I1202 23:17:52.382895 4903 scope.go:117] "RemoveContainer" containerID="474802f5bfe15bdd3bcdee72266b8056787e6dcd0042418f3cc91411370f7e1a" Dec 02 23:17:52 crc kubenswrapper[4903]: E1202 23:17:52.383330 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"474802f5bfe15bdd3bcdee72266b8056787e6dcd0042418f3cc91411370f7e1a\": container with ID starting with 474802f5bfe15bdd3bcdee72266b8056787e6dcd0042418f3cc91411370f7e1a not found: ID does not exist" containerID="474802f5bfe15bdd3bcdee72266b8056787e6dcd0042418f3cc91411370f7e1a" Dec 02 23:17:52 crc kubenswrapper[4903]: I1202 23:17:52.383397 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"474802f5bfe15bdd3bcdee72266b8056787e6dcd0042418f3cc91411370f7e1a"} err="failed to get container status \"474802f5bfe15bdd3bcdee72266b8056787e6dcd0042418f3cc91411370f7e1a\": rpc error: code = NotFound desc = could not find container \"474802f5bfe15bdd3bcdee72266b8056787e6dcd0042418f3cc91411370f7e1a\": container with ID starting with 474802f5bfe15bdd3bcdee72266b8056787e6dcd0042418f3cc91411370f7e1a not found: ID does not exist" Dec 02 23:17:52 crc kubenswrapper[4903]: I1202 23:17:52.383459 4903 scope.go:117] "RemoveContainer" containerID="2565f28b6d412237a204f7557b2e587c0e20d98299a180ec1bb42b6b1a38f5b7" Dec 02 23:17:52 crc kubenswrapper[4903]: E1202 23:17:52.383893 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2565f28b6d412237a204f7557b2e587c0e20d98299a180ec1bb42b6b1a38f5b7\": container with ID starting with 2565f28b6d412237a204f7557b2e587c0e20d98299a180ec1bb42b6b1a38f5b7 not found: ID does not exist" containerID="2565f28b6d412237a204f7557b2e587c0e20d98299a180ec1bb42b6b1a38f5b7" Dec 02 23:17:52 crc kubenswrapper[4903]: I1202 
23:17:52.383931 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2565f28b6d412237a204f7557b2e587c0e20d98299a180ec1bb42b6b1a38f5b7"} err="failed to get container status \"2565f28b6d412237a204f7557b2e587c0e20d98299a180ec1bb42b6b1a38f5b7\": rpc error: code = NotFound desc = could not find container \"2565f28b6d412237a204f7557b2e587c0e20d98299a180ec1bb42b6b1a38f5b7\": container with ID starting with 2565f28b6d412237a204f7557b2e587c0e20d98299a180ec1bb42b6b1a38f5b7 not found: ID does not exist" Dec 02 23:17:52 crc kubenswrapper[4903]: I1202 23:17:52.383960 4903 scope.go:117] "RemoveContainer" containerID="603ea21ba5e80b2cea12022574c1b146204de9c1865f910639462d68e5c880f5" Dec 02 23:17:52 crc kubenswrapper[4903]: E1202 23:17:52.384286 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"603ea21ba5e80b2cea12022574c1b146204de9c1865f910639462d68e5c880f5\": container with ID starting with 603ea21ba5e80b2cea12022574c1b146204de9c1865f910639462d68e5c880f5 not found: ID does not exist" containerID="603ea21ba5e80b2cea12022574c1b146204de9c1865f910639462d68e5c880f5" Dec 02 23:17:52 crc kubenswrapper[4903]: I1202 23:17:52.384316 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"603ea21ba5e80b2cea12022574c1b146204de9c1865f910639462d68e5c880f5"} err="failed to get container status \"603ea21ba5e80b2cea12022574c1b146204de9c1865f910639462d68e5c880f5\": rpc error: code = NotFound desc = could not find container \"603ea21ba5e80b2cea12022574c1b146204de9c1865f910639462d68e5c880f5\": container with ID starting with 603ea21ba5e80b2cea12022574c1b146204de9c1865f910639462d68e5c880f5 not found: ID does not exist" Dec 02 23:17:52 crc kubenswrapper[4903]: I1202 23:17:52.384330 4903 scope.go:117] "RemoveContainer" containerID="b2956aa78e92818d64de5d67f0c0517f302aaf63f55ca8762f89d1a5d077363b" Dec 02 23:17:52 crc kubenswrapper[4903]: E1202 23:17:52.384693 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2956aa78e92818d64de5d67f0c0517f302aaf63f55ca8762f89d1a5d077363b\": container with ID starting with b2956aa78e92818d64de5d67f0c0517f302aaf63f55ca8762f89d1a5d077363b not found: ID does not exist" containerID="b2956aa78e92818d64de5d67f0c0517f302aaf63f55ca8762f89d1a5d077363b" Dec 02 23:17:52 crc kubenswrapper[4903]: I1202 23:17:52.384718 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2956aa78e92818d64de5d67f0c0517f302aaf63f55ca8762f89d1a5d077363b"} err="failed to get container status \"b2956aa78e92818d64de5d67f0c0517f302aaf63f55ca8762f89d1a5d077363b\": rpc error: code = NotFound desc = could not find container \"b2956aa78e92818d64de5d67f0c0517f302aaf63f55ca8762f89d1a5d077363b\": container with ID starting with b2956aa78e92818d64de5d67f0c0517f302aaf63f55ca8762f89d1a5d077363b not found: ID does not exist" Dec 02 23:17:52 crc kubenswrapper[4903]: I1202 23:17:52.397348 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a0cf0be-7840-4684-ada4-16ea2d6351f3-config-data\") pod \"8a0cf0be-7840-4684-ada4-16ea2d6351f3\" (UID: \"8a0cf0be-7840-4684-ada4-16ea2d6351f3\") " Dec 02 23:17:52 crc kubenswrapper[4903]: I1202 23:17:52.397417 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/8a0cf0be-7840-4684-ada4-16ea2d6351f3-sg-core-conf-yaml\") pod \"8a0cf0be-7840-4684-ada4-16ea2d6351f3\" (UID: \"8a0cf0be-7840-4684-ada4-16ea2d6351f3\") " Dec 02 23:17:52 crc kubenswrapper[4903]: I1202 23:17:52.397493 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a0cf0be-7840-4684-ada4-16ea2d6351f3-log-httpd\") pod \"8a0cf0be-7840-4684-ada4-16ea2d6351f3\" (UID: \"8a0cf0be-7840-4684-ada4-16ea2d6351f3\") " Dec 02 23:17:52 crc kubenswrapper[4903]: I1202 23:17:52.397518 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a0cf0be-7840-4684-ada4-16ea2d6351f3-scripts\") pod \"8a0cf0be-7840-4684-ada4-16ea2d6351f3\" (UID: \"8a0cf0be-7840-4684-ada4-16ea2d6351f3\") " Dec 02 23:17:52 crc kubenswrapper[4903]: I1202 23:17:52.397622 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lp6fc\" (UniqueName: \"kubernetes.io/projected/8a0cf0be-7840-4684-ada4-16ea2d6351f3-kube-api-access-lp6fc\") pod \"8a0cf0be-7840-4684-ada4-16ea2d6351f3\" (UID: \"8a0cf0be-7840-4684-ada4-16ea2d6351f3\") " Dec 02 23:17:52 crc kubenswrapper[4903]: I1202 23:17:52.397666 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a0cf0be-7840-4684-ada4-16ea2d6351f3-run-httpd\") pod \"8a0cf0be-7840-4684-ada4-16ea2d6351f3\" (UID: \"8a0cf0be-7840-4684-ada4-16ea2d6351f3\") " Dec 02 23:17:52 crc kubenswrapper[4903]: I1202 23:17:52.397745 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a0cf0be-7840-4684-ada4-16ea2d6351f3-combined-ca-bundle\") pod \"8a0cf0be-7840-4684-ada4-16ea2d6351f3\" (UID: \"8a0cf0be-7840-4684-ada4-16ea2d6351f3\") " Dec 02 23:17:52 crc kubenswrapper[4903]: I1202 23:17:52.399050 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a0cf0be-7840-4684-ada4-16ea2d6351f3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8a0cf0be-7840-4684-ada4-16ea2d6351f3" (UID: "8a0cf0be-7840-4684-ada4-16ea2d6351f3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:17:52 crc kubenswrapper[4903]: I1202 23:17:52.399209 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a0cf0be-7840-4684-ada4-16ea2d6351f3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8a0cf0be-7840-4684-ada4-16ea2d6351f3" (UID: "8a0cf0be-7840-4684-ada4-16ea2d6351f3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:17:52 crc kubenswrapper[4903]: I1202 23:17:52.410682 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a0cf0be-7840-4684-ada4-16ea2d6351f3-kube-api-access-lp6fc" (OuterVolumeSpecName: "kube-api-access-lp6fc") pod "8a0cf0be-7840-4684-ada4-16ea2d6351f3" (UID: "8a0cf0be-7840-4684-ada4-16ea2d6351f3"). InnerVolumeSpecName "kube-api-access-lp6fc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:17:52 crc kubenswrapper[4903]: I1202 23:17:52.411321 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a0cf0be-7840-4684-ada4-16ea2d6351f3-scripts" (OuterVolumeSpecName: "scripts") pod "8a0cf0be-7840-4684-ada4-16ea2d6351f3" (UID: "8a0cf0be-7840-4684-ada4-16ea2d6351f3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:52 crc kubenswrapper[4903]: I1202 23:17:52.449388 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a0cf0be-7840-4684-ada4-16ea2d6351f3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8a0cf0be-7840-4684-ada4-16ea2d6351f3" (UID: "8a0cf0be-7840-4684-ada4-16ea2d6351f3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:52 crc kubenswrapper[4903]: I1202 23:17:52.499794 4903 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a0cf0be-7840-4684-ada4-16ea2d6351f3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:52 crc kubenswrapper[4903]: I1202 23:17:52.499818 4903 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a0cf0be-7840-4684-ada4-16ea2d6351f3-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:52 crc kubenswrapper[4903]: I1202 23:17:52.499827 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a0cf0be-7840-4684-ada4-16ea2d6351f3-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:52 crc kubenswrapper[4903]: I1202 23:17:52.499837 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lp6fc\" (UniqueName: \"kubernetes.io/projected/8a0cf0be-7840-4684-ada4-16ea2d6351f3-kube-api-access-lp6fc\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:52 crc kubenswrapper[4903]: I1202 23:17:52.499848 4903 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a0cf0be-7840-4684-ada4-16ea2d6351f3-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:52 crc kubenswrapper[4903]: I1202 23:17:52.517227 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a0cf0be-7840-4684-ada4-16ea2d6351f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a0cf0be-7840-4684-ada4-16ea2d6351f3" (UID: "8a0cf0be-7840-4684-ada4-16ea2d6351f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:52 crc kubenswrapper[4903]: I1202 23:17:52.533596 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a0cf0be-7840-4684-ada4-16ea2d6351f3-config-data" (OuterVolumeSpecName: "config-data") pod "8a0cf0be-7840-4684-ada4-16ea2d6351f3" (UID: "8a0cf0be-7840-4684-ada4-16ea2d6351f3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:17:52 crc kubenswrapper[4903]: I1202 23:17:52.601619 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a0cf0be-7840-4684-ada4-16ea2d6351f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:52 crc kubenswrapper[4903]: I1202 23:17:52.601666 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a0cf0be-7840-4684-ada4-16ea2d6351f3-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.282265 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.322804 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.342518 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.358023 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:17:53 crc kubenswrapper[4903]: E1202 23:17:53.358462 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a0cf0be-7840-4684-ada4-16ea2d6351f3" containerName="proxy-httpd" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.358475 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a0cf0be-7840-4684-ada4-16ea2d6351f3" containerName="proxy-httpd" Dec 02 23:17:53 crc kubenswrapper[4903]: E1202 23:17:53.358491 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a0cf0be-7840-4684-ada4-16ea2d6351f3" containerName="sg-core" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.358496 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a0cf0be-7840-4684-ada4-16ea2d6351f3" containerName="sg-core" Dec 02 23:17:53 crc kubenswrapper[4903]: E1202 23:17:53.358510 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a0cf0be-7840-4684-ada4-16ea2d6351f3" containerName="ceilometer-central-agent" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.358516 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a0cf0be-7840-4684-ada4-16ea2d6351f3" containerName="ceilometer-central-agent" Dec 02 23:17:53 crc kubenswrapper[4903]: E1202 23:17:53.358528 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b5f4367-359b-4633-80f9-0af5ac406aa4" containerName="horizon" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.358534 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b5f4367-359b-4633-80f9-0af5ac406aa4" containerName="horizon" Dec 02 23:17:53 crc kubenswrapper[4903]: E1202 23:17:53.358564 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b5f4367-359b-4633-80f9-0af5ac406aa4" containerName="horizon-log" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.358569 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b5f4367-359b-4633-80f9-0af5ac406aa4" containerName="horizon-log" Dec 02 23:17:53 crc kubenswrapper[4903]: E1202 23:17:53.358588 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a0cf0be-7840-4684-ada4-16ea2d6351f3" containerName="ceilometer-notification-agent" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.358594 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a0cf0be-7840-4684-ada4-16ea2d6351f3" 
containerName="ceilometer-notification-agent" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.358805 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a0cf0be-7840-4684-ada4-16ea2d6351f3" containerName="ceilometer-central-agent" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.358843 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b5f4367-359b-4633-80f9-0af5ac406aa4" containerName="horizon-log" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.358857 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b5f4367-359b-4633-80f9-0af5ac406aa4" containerName="horizon" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.358867 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a0cf0be-7840-4684-ada4-16ea2d6351f3" containerName="sg-core" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.358876 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a0cf0be-7840-4684-ada4-16ea2d6351f3" containerName="proxy-httpd" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.358884 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a0cf0be-7840-4684-ada4-16ea2d6351f3" containerName="ceilometer-notification-agent" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.361203 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.365063 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.365331 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.399680 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.418371 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/124cecc2-bd78-4b80-926d-05e10500c940-scripts\") pod \"ceilometer-0\" (UID: \"124cecc2-bd78-4b80-926d-05e10500c940\") " pod="openstack/ceilometer-0" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.418417 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnjsp\" (UniqueName: \"kubernetes.io/projected/124cecc2-bd78-4b80-926d-05e10500c940-kube-api-access-wnjsp\") pod \"ceilometer-0\" (UID: \"124cecc2-bd78-4b80-926d-05e10500c940\") " pod="openstack/ceilometer-0" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.418485 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/124cecc2-bd78-4b80-926d-05e10500c940-run-httpd\") pod \"ceilometer-0\" (UID: \"124cecc2-bd78-4b80-926d-05e10500c940\") " pod="openstack/ceilometer-0" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.418512 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/124cecc2-bd78-4b80-926d-05e10500c940-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"124cecc2-bd78-4b80-926d-05e10500c940\") " pod="openstack/ceilometer-0" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.418536 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/124cecc2-bd78-4b80-926d-05e10500c940-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"124cecc2-bd78-4b80-926d-05e10500c940\") " pod="openstack/ceilometer-0" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.418575 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/124cecc2-bd78-4b80-926d-05e10500c940-log-httpd\") pod \"ceilometer-0\" (UID: \"124cecc2-bd78-4b80-926d-05e10500c940\") " pod="openstack/ceilometer-0" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.418608 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/124cecc2-bd78-4b80-926d-05e10500c940-config-data\") pod \"ceilometer-0\" (UID: \"124cecc2-bd78-4b80-926d-05e10500c940\") " pod="openstack/ceilometer-0" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.520377 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/124cecc2-bd78-4b80-926d-05e10500c940-config-data\") pod \"ceilometer-0\" (UID: \"124cecc2-bd78-4b80-926d-05e10500c940\") " pod="openstack/ceilometer-0" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.520480 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/124cecc2-bd78-4b80-926d-05e10500c940-scripts\") pod \"ceilometer-0\" (UID: \"124cecc2-bd78-4b80-926d-05e10500c940\") " pod="openstack/ceilometer-0" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.520511 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnjsp\" (UniqueName: \"kubernetes.io/projected/124cecc2-bd78-4b80-926d-05e10500c940-kube-api-access-wnjsp\") pod \"ceilometer-0\" (UID: \"124cecc2-bd78-4b80-926d-05e10500c940\") " pod="openstack/ceilometer-0" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.520592 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/124cecc2-bd78-4b80-926d-05e10500c940-run-httpd\") pod \"ceilometer-0\" (UID: \"124cecc2-bd78-4b80-926d-05e10500c940\") " pod="openstack/ceilometer-0" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.520621 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/124cecc2-bd78-4b80-926d-05e10500c940-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"124cecc2-bd78-4b80-926d-05e10500c940\") " pod="openstack/ceilometer-0" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.520674 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/124cecc2-bd78-4b80-926d-05e10500c940-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"124cecc2-bd78-4b80-926d-05e10500c940\") " pod="openstack/ceilometer-0" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.520719 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/124cecc2-bd78-4b80-926d-05e10500c940-log-httpd\") pod \"ceilometer-0\" (UID: \"124cecc2-bd78-4b80-926d-05e10500c940\") " pod="openstack/ceilometer-0" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.521274 4903 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/124cecc2-bd78-4b80-926d-05e10500c940-log-httpd\") pod \"ceilometer-0\" (UID: \"124cecc2-bd78-4b80-926d-05e10500c940\") " pod="openstack/ceilometer-0" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.521311 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/124cecc2-bd78-4b80-926d-05e10500c940-run-httpd\") pod \"ceilometer-0\" (UID: \"124cecc2-bd78-4b80-926d-05e10500c940\") " pod="openstack/ceilometer-0" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.528551 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/124cecc2-bd78-4b80-926d-05e10500c940-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"124cecc2-bd78-4b80-926d-05e10500c940\") " pod="openstack/ceilometer-0" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.528898 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/124cecc2-bd78-4b80-926d-05e10500c940-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"124cecc2-bd78-4b80-926d-05e10500c940\") " pod="openstack/ceilometer-0" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.533942 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/124cecc2-bd78-4b80-926d-05e10500c940-scripts\") pod \"ceilometer-0\" (UID: \"124cecc2-bd78-4b80-926d-05e10500c940\") " pod="openstack/ceilometer-0" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.534886 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/124cecc2-bd78-4b80-926d-05e10500c940-config-data\") pod \"ceilometer-0\" (UID: \"124cecc2-bd78-4b80-926d-05e10500c940\") " pod="openstack/ceilometer-0" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.536614 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnjsp\" (UniqueName: \"kubernetes.io/projected/124cecc2-bd78-4b80-926d-05e10500c940-kube-api-access-wnjsp\") pod \"ceilometer-0\" (UID: \"124cecc2-bd78-4b80-926d-05e10500c940\") " pod="openstack/ceilometer-0" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.622663 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a0cf0be-7840-4684-ada4-16ea2d6351f3" path="/var/lib/kubelet/pods/8a0cf0be-7840-4684-ada4-16ea2d6351f3/volumes" Dec 02 23:17:53 crc kubenswrapper[4903]: I1202 23:17:53.700528 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:17:54 crc kubenswrapper[4903]: I1202 23:17:54.125357 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:17:54 crc kubenswrapper[4903]: I1202 23:17:54.292449 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"124cecc2-bd78-4b80-926d-05e10500c940","Type":"ContainerStarted","Data":"5a5588c1cc9c7aa4d76ef2987b969ff1c9fb1aacc3baa011e1796d132f4dc820"} Dec 02 23:17:55 crc kubenswrapper[4903]: I1202 23:17:55.302613 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"124cecc2-bd78-4b80-926d-05e10500c940","Type":"ContainerStarted","Data":"45324c6f30d4f7ac354566dc3d72217370256bbd0850d0f03b353236c101c627"} Dec 02 23:17:56 crc kubenswrapper[4903]: I1202 23:17:56.361948 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:17:56 crc kubenswrapper[4903]: I1202 23:17:56.571055 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 02 23:17:59 crc kubenswrapper[4903]: I1202 23:17:59.283900 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5489fffdb5-zmhmz" Dec 02 23:17:59 crc kubenswrapper[4903]: I1202 23:17:59.286100 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5489fffdb5-zmhmz" Dec 02 23:18:01 crc kubenswrapper[4903]: I1202 23:18:01.372593 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"124cecc2-bd78-4b80-926d-05e10500c940","Type":"ContainerStarted","Data":"de91ba3e0b74fe94481602a63d8511f799e2c863cda5dd2922fa5ed622e03b0e"} Dec 02 23:18:01 crc kubenswrapper[4903]: I1202 23:18:01.373144 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"124cecc2-bd78-4b80-926d-05e10500c940","Type":"ContainerStarted","Data":"ff1ca97de4d16801a7986aa433dba7032d7f486a452eb5da4eb886c07111bf15"} Dec 02 23:18:01 crc kubenswrapper[4903]: I1202 23:18:01.374341 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"01e0132f-dfe4-4d3a-9a72-b38b77521ada","Type":"ContainerStarted","Data":"fc48c4438a8df1451f6d0e4b61c8143a27af5b5ba1e9efd3645dd71ad930bd93"} Dec 02 23:18:01 crc kubenswrapper[4903]: I1202 23:18:01.393264 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.698151517 podStartE2EDuration="14.393249776s" podCreationTimestamp="2025-12-02 23:17:47 +0000 UTC" firstStartedPulling="2025-12-02 23:17:48.603191244 +0000 UTC m=+1207.311745527" lastFinishedPulling="2025-12-02 23:18:00.298289503 +0000 UTC m=+1219.006843786" observedRunningTime="2025-12-02 23:18:01.392356206 +0000 UTC m=+1220.100910489" watchObservedRunningTime="2025-12-02 23:18:01.393249776 +0000 UTC m=+1220.101804059" Dec 02 23:18:03 crc kubenswrapper[4903]: I1202 23:18:03.393894 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"124cecc2-bd78-4b80-926d-05e10500c940","Type":"ContainerStarted","Data":"e6cbaa632ad42b04e901e5812d2828bd81ce289d44ba1404652adc0c4940048a"} Dec 02 23:18:03 crc kubenswrapper[4903]: I1202 23:18:03.612965 4903 scope.go:117] "RemoveContainer" containerID="ba4ae74d9752d870e35c06dc4ab2dd97ed3bdb486401b2bdaed152da3cf06005" Dec 02 23:18:04 crc kubenswrapper[4903]: I1202 23:18:04.403756 4903 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"6de4a117-0c91-47f4-a80d-278debb3ea60","Type":"ContainerStarted","Data":"5dee57eff07796a288e3f2bb7e5636eef4388a613cdaaa20bd3ad6d3bb553f8b"} Dec 02 23:18:04 crc kubenswrapper[4903]: I1202 23:18:04.404159 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 23:18:04 crc kubenswrapper[4903]: I1202 23:18:04.404521 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="124cecc2-bd78-4b80-926d-05e10500c940" containerName="proxy-httpd" containerID="cri-o://e6cbaa632ad42b04e901e5812d2828bd81ce289d44ba1404652adc0c4940048a" gracePeriod=30 Dec 02 23:18:04 crc kubenswrapper[4903]: I1202 23:18:04.404613 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="124cecc2-bd78-4b80-926d-05e10500c940" containerName="sg-core" containerID="cri-o://de91ba3e0b74fe94481602a63d8511f799e2c863cda5dd2922fa5ed622e03b0e" gracePeriod=30 Dec 02 23:18:04 crc kubenswrapper[4903]: I1202 23:18:04.404680 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="124cecc2-bd78-4b80-926d-05e10500c940" containerName="ceilometer-notification-agent" containerID="cri-o://ff1ca97de4d16801a7986aa433dba7032d7f486a452eb5da4eb886c07111bf15" gracePeriod=30 Dec 02 23:18:04 crc kubenswrapper[4903]: I1202 23:18:04.403869 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="124cecc2-bd78-4b80-926d-05e10500c940" containerName="ceilometer-central-agent" containerID="cri-o://45324c6f30d4f7ac354566dc3d72217370256bbd0850d0f03b353236c101c627" gracePeriod=30 Dec 02 23:18:04 crc kubenswrapper[4903]: I1202 23:18:04.459011 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.964214668 podStartE2EDuration="11.458993345s" podCreationTimestamp="2025-12-02 23:17:53 +0000 UTC" firstStartedPulling="2025-12-02 23:17:54.132426378 +0000 UTC m=+1212.840980661" lastFinishedPulling="2025-12-02 23:18:02.627205035 +0000 UTC m=+1221.335759338" observedRunningTime="2025-12-02 23:18:04.450300681 +0000 UTC m=+1223.158854984" watchObservedRunningTime="2025-12-02 23:18:04.458993345 +0000 UTC m=+1223.167547638" Dec 02 23:18:05 crc kubenswrapper[4903]: I1202 23:18:05.423917 4903 generic.go:334] "Generic (PLEG): container finished" podID="124cecc2-bd78-4b80-926d-05e10500c940" containerID="e6cbaa632ad42b04e901e5812d2828bd81ce289d44ba1404652adc0c4940048a" exitCode=0 Dec 02 23:18:05 crc kubenswrapper[4903]: I1202 23:18:05.424332 4903 generic.go:334] "Generic (PLEG): container finished" podID="124cecc2-bd78-4b80-926d-05e10500c940" containerID="de91ba3e0b74fe94481602a63d8511f799e2c863cda5dd2922fa5ed622e03b0e" exitCode=2 Dec 02 23:18:05 crc kubenswrapper[4903]: I1202 23:18:05.424352 4903 generic.go:334] "Generic (PLEG): container finished" podID="124cecc2-bd78-4b80-926d-05e10500c940" containerID="ff1ca97de4d16801a7986aa433dba7032d7f486a452eb5da4eb886c07111bf15" exitCode=0 Dec 02 23:18:05 crc kubenswrapper[4903]: I1202 23:18:05.424366 4903 generic.go:334] "Generic (PLEG): container finished" podID="124cecc2-bd78-4b80-926d-05e10500c940" containerID="45324c6f30d4f7ac354566dc3d72217370256bbd0850d0f03b353236c101c627" exitCode=0 Dec 02 23:18:05 crc kubenswrapper[4903]: I1202 23:18:05.424104 4903 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ceilometer-0" event={"ID":"124cecc2-bd78-4b80-926d-05e10500c940","Type":"ContainerDied","Data":"e6cbaa632ad42b04e901e5812d2828bd81ce289d44ba1404652adc0c4940048a"} Dec 02 23:18:05 crc kubenswrapper[4903]: I1202 23:18:05.424420 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"124cecc2-bd78-4b80-926d-05e10500c940","Type":"ContainerDied","Data":"de91ba3e0b74fe94481602a63d8511f799e2c863cda5dd2922fa5ed622e03b0e"} Dec 02 23:18:05 crc kubenswrapper[4903]: I1202 23:18:05.424445 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"124cecc2-bd78-4b80-926d-05e10500c940","Type":"ContainerDied","Data":"ff1ca97de4d16801a7986aa433dba7032d7f486a452eb5da4eb886c07111bf15"} Dec 02 23:18:05 crc kubenswrapper[4903]: I1202 23:18:05.424462 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"124cecc2-bd78-4b80-926d-05e10500c940","Type":"ContainerDied","Data":"45324c6f30d4f7ac354566dc3d72217370256bbd0850d0f03b353236c101c627"} Dec 02 23:18:05 crc kubenswrapper[4903]: I1202 23:18:05.742275 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:18:05 crc kubenswrapper[4903]: I1202 23:18:05.866492 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/124cecc2-bd78-4b80-926d-05e10500c940-sg-core-conf-yaml\") pod \"124cecc2-bd78-4b80-926d-05e10500c940\" (UID: \"124cecc2-bd78-4b80-926d-05e10500c940\") " Dec 02 23:18:05 crc kubenswrapper[4903]: I1202 23:18:05.867006 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/124cecc2-bd78-4b80-926d-05e10500c940-run-httpd\") pod \"124cecc2-bd78-4b80-926d-05e10500c940\" (UID: \"124cecc2-bd78-4b80-926d-05e10500c940\") " Dec 02 23:18:05 crc kubenswrapper[4903]: I1202 23:18:05.867629 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/124cecc2-bd78-4b80-926d-05e10500c940-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "124cecc2-bd78-4b80-926d-05e10500c940" (UID: "124cecc2-bd78-4b80-926d-05e10500c940"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:18:05 crc kubenswrapper[4903]: I1202 23:18:05.868102 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/124cecc2-bd78-4b80-926d-05e10500c940-scripts\") pod \"124cecc2-bd78-4b80-926d-05e10500c940\" (UID: \"124cecc2-bd78-4b80-926d-05e10500c940\") " Dec 02 23:18:05 crc kubenswrapper[4903]: I1202 23:18:05.868162 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/124cecc2-bd78-4b80-926d-05e10500c940-log-httpd\") pod \"124cecc2-bd78-4b80-926d-05e10500c940\" (UID: \"124cecc2-bd78-4b80-926d-05e10500c940\") " Dec 02 23:18:05 crc kubenswrapper[4903]: I1202 23:18:05.868222 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnjsp\" (UniqueName: \"kubernetes.io/projected/124cecc2-bd78-4b80-926d-05e10500c940-kube-api-access-wnjsp\") pod \"124cecc2-bd78-4b80-926d-05e10500c940\" (UID: \"124cecc2-bd78-4b80-926d-05e10500c940\") " Dec 02 23:18:05 crc kubenswrapper[4903]: I1202 23:18:05.868270 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/124cecc2-bd78-4b80-926d-05e10500c940-config-data\") pod \"124cecc2-bd78-4b80-926d-05e10500c940\" (UID: \"124cecc2-bd78-4b80-926d-05e10500c940\") " Dec 02 23:18:05 crc kubenswrapper[4903]: I1202 23:18:05.868295 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/124cecc2-bd78-4b80-926d-05e10500c940-combined-ca-bundle\") pod \"124cecc2-bd78-4b80-926d-05e10500c940\" (UID: \"124cecc2-bd78-4b80-926d-05e10500c940\") " Dec 02 23:18:05 crc kubenswrapper[4903]: I1202 23:18:05.869066 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/124cecc2-bd78-4b80-926d-05e10500c940-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "124cecc2-bd78-4b80-926d-05e10500c940" (UID: "124cecc2-bd78-4b80-926d-05e10500c940"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:18:05 crc kubenswrapper[4903]: I1202 23:18:05.869513 4903 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/124cecc2-bd78-4b80-926d-05e10500c940-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 23:18:05 crc kubenswrapper[4903]: I1202 23:18:05.869608 4903 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/124cecc2-bd78-4b80-926d-05e10500c940-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 23:18:05 crc kubenswrapper[4903]: I1202 23:18:05.875567 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/124cecc2-bd78-4b80-926d-05e10500c940-kube-api-access-wnjsp" (OuterVolumeSpecName: "kube-api-access-wnjsp") pod "124cecc2-bd78-4b80-926d-05e10500c940" (UID: "124cecc2-bd78-4b80-926d-05e10500c940"). InnerVolumeSpecName "kube-api-access-wnjsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:18:05 crc kubenswrapper[4903]: I1202 23:18:05.886783 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/124cecc2-bd78-4b80-926d-05e10500c940-scripts" (OuterVolumeSpecName: "scripts") pod "124cecc2-bd78-4b80-926d-05e10500c940" (UID: "124cecc2-bd78-4b80-926d-05e10500c940"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:18:05 crc kubenswrapper[4903]: I1202 23:18:05.896616 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/124cecc2-bd78-4b80-926d-05e10500c940-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "124cecc2-bd78-4b80-926d-05e10500c940" (UID: "124cecc2-bd78-4b80-926d-05e10500c940"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:18:05 crc kubenswrapper[4903]: I1202 23:18:05.971146 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/124cecc2-bd78-4b80-926d-05e10500c940-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:18:05 crc kubenswrapper[4903]: I1202 23:18:05.971174 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnjsp\" (UniqueName: \"kubernetes.io/projected/124cecc2-bd78-4b80-926d-05e10500c940-kube-api-access-wnjsp\") on node \"crc\" DevicePath \"\"" Dec 02 23:18:05 crc kubenswrapper[4903]: I1202 23:18:05.971185 4903 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/124cecc2-bd78-4b80-926d-05e10500c940-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 23:18:05 crc kubenswrapper[4903]: I1202 23:18:05.973327 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/124cecc2-bd78-4b80-926d-05e10500c940-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "124cecc2-bd78-4b80-926d-05e10500c940" (UID: "124cecc2-bd78-4b80-926d-05e10500c940"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.003941 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/124cecc2-bd78-4b80-926d-05e10500c940-config-data" (OuterVolumeSpecName: "config-data") pod "124cecc2-bd78-4b80-926d-05e10500c940" (UID: "124cecc2-bd78-4b80-926d-05e10500c940"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.072676 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/124cecc2-bd78-4b80-926d-05e10500c940-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.072712 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/124cecc2-bd78-4b80-926d-05e10500c940-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.435394 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"124cecc2-bd78-4b80-926d-05e10500c940","Type":"ContainerDied","Data":"5a5588c1cc9c7aa4d76ef2987b969ff1c9fb1aacc3baa011e1796d132f4dc820"} Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.435463 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.436341 4903 scope.go:117] "RemoveContainer" containerID="e6cbaa632ad42b04e901e5812d2828bd81ce289d44ba1404652adc0c4940048a" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.474740 4903 scope.go:117] "RemoveContainer" containerID="de91ba3e0b74fe94481602a63d8511f799e2c863cda5dd2922fa5ed622e03b0e" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.477137 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.492535 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.503022 4903 scope.go:117] "RemoveContainer" containerID="ff1ca97de4d16801a7986aa433dba7032d7f486a452eb5da4eb886c07111bf15" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.513786 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:18:06 crc kubenswrapper[4903]: E1202 23:18:06.514221 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="124cecc2-bd78-4b80-926d-05e10500c940" containerName="ceilometer-notification-agent" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.514238 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="124cecc2-bd78-4b80-926d-05e10500c940" containerName="ceilometer-notification-agent" Dec 02 23:18:06 crc kubenswrapper[4903]: E1202 23:18:06.514252 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="124cecc2-bd78-4b80-926d-05e10500c940" containerName="ceilometer-central-agent" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.514257 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="124cecc2-bd78-4b80-926d-05e10500c940" containerName="ceilometer-central-agent" Dec 02 23:18:06 crc kubenswrapper[4903]: E1202 23:18:06.514274 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="124cecc2-bd78-4b80-926d-05e10500c940" containerName="sg-core" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.514281 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="124cecc2-bd78-4b80-926d-05e10500c940" containerName="sg-core" Dec 02 23:18:06 crc kubenswrapper[4903]: E1202 23:18:06.514292 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="124cecc2-bd78-4b80-926d-05e10500c940" containerName="proxy-httpd" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.514297 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="124cecc2-bd78-4b80-926d-05e10500c940" containerName="proxy-httpd" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.514455 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="124cecc2-bd78-4b80-926d-05e10500c940" containerName="ceilometer-central-agent" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.514467 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="124cecc2-bd78-4b80-926d-05e10500c940" containerName="proxy-httpd" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.514486 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="124cecc2-bd78-4b80-926d-05e10500c940" containerName="sg-core" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.514499 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="124cecc2-bd78-4b80-926d-05e10500c940" containerName="ceilometer-notification-agent" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.516176 4903 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.518224 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.518465 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.545171 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.547080 4903 scope.go:117] "RemoveContainer" containerID="45324c6f30d4f7ac354566dc3d72217370256bbd0850d0f03b353236c101c627" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.581399 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsmjk\" (UniqueName: \"kubernetes.io/projected/698e25ed-bf30-48be-9bf4-70e25028db42-kube-api-access-tsmjk\") pod \"ceilometer-0\" (UID: \"698e25ed-bf30-48be-9bf4-70e25028db42\") " pod="openstack/ceilometer-0" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.581454 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/698e25ed-bf30-48be-9bf4-70e25028db42-run-httpd\") pod \"ceilometer-0\" (UID: \"698e25ed-bf30-48be-9bf4-70e25028db42\") " pod="openstack/ceilometer-0" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.581477 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/698e25ed-bf30-48be-9bf4-70e25028db42-config-data\") pod \"ceilometer-0\" (UID: \"698e25ed-bf30-48be-9bf4-70e25028db42\") " pod="openstack/ceilometer-0" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.581539 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/698e25ed-bf30-48be-9bf4-70e25028db42-log-httpd\") pod \"ceilometer-0\" (UID: \"698e25ed-bf30-48be-9bf4-70e25028db42\") " pod="openstack/ceilometer-0" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.581569 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/698e25ed-bf30-48be-9bf4-70e25028db42-scripts\") pod \"ceilometer-0\" (UID: \"698e25ed-bf30-48be-9bf4-70e25028db42\") " pod="openstack/ceilometer-0" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.581607 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/698e25ed-bf30-48be-9bf4-70e25028db42-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"698e25ed-bf30-48be-9bf4-70e25028db42\") " pod="openstack/ceilometer-0" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.581629 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/698e25ed-bf30-48be-9bf4-70e25028db42-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"698e25ed-bf30-48be-9bf4-70e25028db42\") " pod="openstack/ceilometer-0" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.667161 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 23:18:06 crc kubenswrapper[4903]: 
I1202 23:18:06.667409 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7635ad7a-2665-4edb-a6d7-ae48268f599e" containerName="glance-log" containerID="cri-o://ff6d3b7de6883b7315e655d1a66c8e2d925f7779b487fdbd753b31e384cd8315" gracePeriod=30 Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.667506 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7635ad7a-2665-4edb-a6d7-ae48268f599e" containerName="glance-httpd" containerID="cri-o://4c664c2e09d3bbc1116634be3a2691daf7344e59f045ed5067eb94f4d948b8c9" gracePeriod=30 Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.674701 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/glance-default-external-api-0" podUID="7635ad7a-2665-4edb-a6d7-ae48268f599e" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.183:9292/healthcheck\": EOF" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.674715 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="7635ad7a-2665-4edb-a6d7-ae48268f599e" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.183:9292/healthcheck\": EOF" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.674812 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/glance-default-external-api-0" podUID="7635ad7a-2665-4edb-a6d7-ae48268f599e" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.183:9292/healthcheck\": EOF" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.677769 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="7635ad7a-2665-4edb-a6d7-ae48268f599e" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.183:9292/healthcheck\": EOF" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.683693 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/698e25ed-bf30-48be-9bf4-70e25028db42-scripts\") pod \"ceilometer-0\" (UID: \"698e25ed-bf30-48be-9bf4-70e25028db42\") " pod="openstack/ceilometer-0" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.683780 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/698e25ed-bf30-48be-9bf4-70e25028db42-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"698e25ed-bf30-48be-9bf4-70e25028db42\") " pod="openstack/ceilometer-0" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.683813 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/698e25ed-bf30-48be-9bf4-70e25028db42-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"698e25ed-bf30-48be-9bf4-70e25028db42\") " pod="openstack/ceilometer-0" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.683873 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsmjk\" (UniqueName: \"kubernetes.io/projected/698e25ed-bf30-48be-9bf4-70e25028db42-kube-api-access-tsmjk\") pod \"ceilometer-0\" (UID: \"698e25ed-bf30-48be-9bf4-70e25028db42\") " pod="openstack/ceilometer-0" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.683922 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/698e25ed-bf30-48be-9bf4-70e25028db42-run-httpd\") pod \"ceilometer-0\" (UID: \"698e25ed-bf30-48be-9bf4-70e25028db42\") " pod="openstack/ceilometer-0" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.683940 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/698e25ed-bf30-48be-9bf4-70e25028db42-config-data\") pod \"ceilometer-0\" (UID: \"698e25ed-bf30-48be-9bf4-70e25028db42\") " pod="openstack/ceilometer-0" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.684033 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/698e25ed-bf30-48be-9bf4-70e25028db42-log-httpd\") pod \"ceilometer-0\" (UID: \"698e25ed-bf30-48be-9bf4-70e25028db42\") " pod="openstack/ceilometer-0" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.684439 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/698e25ed-bf30-48be-9bf4-70e25028db42-log-httpd\") pod \"ceilometer-0\" (UID: \"698e25ed-bf30-48be-9bf4-70e25028db42\") " pod="openstack/ceilometer-0" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.684881 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/698e25ed-bf30-48be-9bf4-70e25028db42-run-httpd\") pod \"ceilometer-0\" (UID: \"698e25ed-bf30-48be-9bf4-70e25028db42\") " pod="openstack/ceilometer-0" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.691531 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/698e25ed-bf30-48be-9bf4-70e25028db42-scripts\") pod \"ceilometer-0\" (UID: \"698e25ed-bf30-48be-9bf4-70e25028db42\") " pod="openstack/ceilometer-0" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.691590 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/698e25ed-bf30-48be-9bf4-70e25028db42-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"698e25ed-bf30-48be-9bf4-70e25028db42\") " pod="openstack/ceilometer-0" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.694227 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/698e25ed-bf30-48be-9bf4-70e25028db42-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"698e25ed-bf30-48be-9bf4-70e25028db42\") " pod="openstack/ceilometer-0" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.695540 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/698e25ed-bf30-48be-9bf4-70e25028db42-config-data\") pod \"ceilometer-0\" (UID: \"698e25ed-bf30-48be-9bf4-70e25028db42\") " pod="openstack/ceilometer-0" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.704509 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsmjk\" (UniqueName: \"kubernetes.io/projected/698e25ed-bf30-48be-9bf4-70e25028db42-kube-api-access-tsmjk\") pod \"ceilometer-0\" (UID: \"698e25ed-bf30-48be-9bf4-70e25028db42\") " pod="openstack/ceilometer-0" Dec 02 23:18:06 crc kubenswrapper[4903]: I1202 23:18:06.846983 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:18:07 crc kubenswrapper[4903]: I1202 23:18:07.310905 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:18:07 crc kubenswrapper[4903]: W1202 23:18:07.314505 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod698e25ed_bf30_48be_9bf4_70e25028db42.slice/crio-b1107120c46cc7249bd674883df7b4cb8b60993ca2765741a9c8ff4639915c40 WatchSource:0}: Error finding container b1107120c46cc7249bd674883df7b4cb8b60993ca2765741a9c8ff4639915c40: Status 404 returned error can't find the container with id b1107120c46cc7249bd674883df7b4cb8b60993ca2765741a9c8ff4639915c40 Dec 02 23:18:07 crc kubenswrapper[4903]: I1202 23:18:07.446980 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"698e25ed-bf30-48be-9bf4-70e25028db42","Type":"ContainerStarted","Data":"b1107120c46cc7249bd674883df7b4cb8b60993ca2765741a9c8ff4639915c40"} Dec 02 23:18:07 crc kubenswrapper[4903]: I1202 23:18:07.449000 4903 generic.go:334] "Generic (PLEG): container finished" podID="6de4a117-0c91-47f4-a80d-278debb3ea60" containerID="5dee57eff07796a288e3f2bb7e5636eef4388a613cdaaa20bd3ad6d3bb553f8b" exitCode=1 Dec 02 23:18:07 crc kubenswrapper[4903]: I1202 23:18:07.449631 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"6de4a117-0c91-47f4-a80d-278debb3ea60","Type":"ContainerDied","Data":"5dee57eff07796a288e3f2bb7e5636eef4388a613cdaaa20bd3ad6d3bb553f8b"} Dec 02 23:18:07 crc kubenswrapper[4903]: I1202 23:18:07.449690 4903 scope.go:117] "RemoveContainer" containerID="ba4ae74d9752d870e35c06dc4ab2dd97ed3bdb486401b2bdaed152da3cf06005" Dec 02 23:18:07 crc kubenswrapper[4903]: I1202 23:18:07.450540 4903 scope.go:117] "RemoveContainer" containerID="5dee57eff07796a288e3f2bb7e5636eef4388a613cdaaa20bd3ad6d3bb553f8b" Dec 02 23:18:07 crc kubenswrapper[4903]: E1202 23:18:07.450823 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(6de4a117-0c91-47f4-a80d-278debb3ea60)\"" pod="openstack/watcher-decision-engine-0" podUID="6de4a117-0c91-47f4-a80d-278debb3ea60" Dec 02 23:18:07 crc kubenswrapper[4903]: I1202 23:18:07.455389 4903 generic.go:334] "Generic (PLEG): container finished" podID="7635ad7a-2665-4edb-a6d7-ae48268f599e" containerID="ff6d3b7de6883b7315e655d1a66c8e2d925f7779b487fdbd753b31e384cd8315" exitCode=143 Dec 02 23:18:07 crc kubenswrapper[4903]: I1202 23:18:07.455443 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7635ad7a-2665-4edb-a6d7-ae48268f599e","Type":"ContainerDied","Data":"ff6d3b7de6883b7315e655d1a66c8e2d925f7779b487fdbd753b31e384cd8315"} Dec 02 23:18:07 crc kubenswrapper[4903]: I1202 23:18:07.623329 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="124cecc2-bd78-4b80-926d-05e10500c940" path="/var/lib/kubelet/pods/124cecc2-bd78-4b80-926d-05e10500c940/volumes" Dec 02 23:18:08 crc kubenswrapper[4903]: I1202 23:18:08.466709 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"698e25ed-bf30-48be-9bf4-70e25028db42","Type":"ContainerStarted","Data":"b34bb329ebd4049815779765fe2bf8b0738792c435d30e1ee4d2d4f62c1417ae"} Dec 02 23:18:08 crc 
kubenswrapper[4903]: I1202 23:18:08.467050 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"698e25ed-bf30-48be-9bf4-70e25028db42","Type":"ContainerStarted","Data":"0ee003a9c4075b41ff99c53342a77b5e00ba5430040f83809ac487befb7e4afc"} Dec 02 23:18:08 crc kubenswrapper[4903]: I1202 23:18:08.966341 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 02 23:18:08 crc kubenswrapper[4903]: I1202 23:18:08.966601 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 02 23:18:08 crc kubenswrapper[4903]: I1202 23:18:08.966611 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 02 23:18:08 crc kubenswrapper[4903]: I1202 23:18:08.966626 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Dec 02 23:18:08 crc kubenswrapper[4903]: I1202 23:18:08.967488 4903 scope.go:117] "RemoveContainer" containerID="5dee57eff07796a288e3f2bb7e5636eef4388a613cdaaa20bd3ad6d3bb553f8b" Dec 02 23:18:08 crc kubenswrapper[4903]: E1202 23:18:08.967995 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(6de4a117-0c91-47f4-a80d-278debb3ea60)\"" pod="openstack/watcher-decision-engine-0" podUID="6de4a117-0c91-47f4-a80d-278debb3ea60" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.402748 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.480923 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"698e25ed-bf30-48be-9bf4-70e25028db42","Type":"ContainerStarted","Data":"27424d0f01716f3f24add106b89b6aa7a2be58db580bf1259276f9fe7137749b"} Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.482611 4903 generic.go:334] "Generic (PLEG): container finished" podID="7635ad7a-2665-4edb-a6d7-ae48268f599e" containerID="4c664c2e09d3bbc1116634be3a2691daf7344e59f045ed5067eb94f4d948b8c9" exitCode=0 Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.482662 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.482670 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7635ad7a-2665-4edb-a6d7-ae48268f599e","Type":"ContainerDied","Data":"4c664c2e09d3bbc1116634be3a2691daf7344e59f045ed5067eb94f4d948b8c9"} Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.482726 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7635ad7a-2665-4edb-a6d7-ae48268f599e","Type":"ContainerDied","Data":"c03c1703fe5fef3d372f4aac7b2a9862991178e92f0ad968573dc0e36ad313c6"} Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.482746 4903 scope.go:117] "RemoveContainer" containerID="4c664c2e09d3bbc1116634be3a2691daf7344e59f045ed5067eb94f4d948b8c9" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.483614 4903 scope.go:117] "RemoveContainer" containerID="5dee57eff07796a288e3f2bb7e5636eef4388a613cdaaa20bd3ad6d3bb553f8b" Dec 02 23:18:09 crc kubenswrapper[4903]: E1202 23:18:09.483874 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(6de4a117-0c91-47f4-a80d-278debb3ea60)\"" pod="openstack/watcher-decision-engine-0" podUID="6de4a117-0c91-47f4-a80d-278debb3ea60" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.506956 4903 scope.go:117] "RemoveContainer" containerID="ff6d3b7de6883b7315e655d1a66c8e2d925f7779b487fdbd753b31e384cd8315" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.526860 4903 scope.go:117] "RemoveContainer" containerID="4c664c2e09d3bbc1116634be3a2691daf7344e59f045ed5067eb94f4d948b8c9" Dec 02 23:18:09 crc kubenswrapper[4903]: E1202 23:18:09.527232 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c664c2e09d3bbc1116634be3a2691daf7344e59f045ed5067eb94f4d948b8c9\": container with ID starting with 4c664c2e09d3bbc1116634be3a2691daf7344e59f045ed5067eb94f4d948b8c9 not found: ID does not exist" containerID="4c664c2e09d3bbc1116634be3a2691daf7344e59f045ed5067eb94f4d948b8c9" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.527274 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c664c2e09d3bbc1116634be3a2691daf7344e59f045ed5067eb94f4d948b8c9"} err="failed to get container status \"4c664c2e09d3bbc1116634be3a2691daf7344e59f045ed5067eb94f4d948b8c9\": rpc error: code = NotFound desc = could not find container \"4c664c2e09d3bbc1116634be3a2691daf7344e59f045ed5067eb94f4d948b8c9\": container with ID starting with 4c664c2e09d3bbc1116634be3a2691daf7344e59f045ed5067eb94f4d948b8c9 not found: ID does not exist" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.527299 4903 scope.go:117] "RemoveContainer" containerID="ff6d3b7de6883b7315e655d1a66c8e2d925f7779b487fdbd753b31e384cd8315" Dec 02 23:18:09 crc kubenswrapper[4903]: E1202 23:18:09.527584 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff6d3b7de6883b7315e655d1a66c8e2d925f7779b487fdbd753b31e384cd8315\": container with ID starting with ff6d3b7de6883b7315e655d1a66c8e2d925f7779b487fdbd753b31e384cd8315 not found: ID does not exist" 
containerID="ff6d3b7de6883b7315e655d1a66c8e2d925f7779b487fdbd753b31e384cd8315" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.527614 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff6d3b7de6883b7315e655d1a66c8e2d925f7779b487fdbd753b31e384cd8315"} err="failed to get container status \"ff6d3b7de6883b7315e655d1a66c8e2d925f7779b487fdbd753b31e384cd8315\": rpc error: code = NotFound desc = could not find container \"ff6d3b7de6883b7315e655d1a66c8e2d925f7779b487fdbd753b31e384cd8315\": container with ID starting with ff6d3b7de6883b7315e655d1a66c8e2d925f7779b487fdbd753b31e384cd8315 not found: ID does not exist" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.543346 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"7635ad7a-2665-4edb-a6d7-ae48268f599e\" (UID: \"7635ad7a-2665-4edb-a6d7-ae48268f599e\") " Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.543380 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7635ad7a-2665-4edb-a6d7-ae48268f599e-logs\") pod \"7635ad7a-2665-4edb-a6d7-ae48268f599e\" (UID: \"7635ad7a-2665-4edb-a6d7-ae48268f599e\") " Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.543467 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7635ad7a-2665-4edb-a6d7-ae48268f599e-httpd-run\") pod \"7635ad7a-2665-4edb-a6d7-ae48268f599e\" (UID: \"7635ad7a-2665-4edb-a6d7-ae48268f599e\") " Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.543496 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7635ad7a-2665-4edb-a6d7-ae48268f599e-public-tls-certs\") pod \"7635ad7a-2665-4edb-a6d7-ae48268f599e\" (UID: \"7635ad7a-2665-4edb-a6d7-ae48268f599e\") " Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.543527 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7635ad7a-2665-4edb-a6d7-ae48268f599e-scripts\") pod \"7635ad7a-2665-4edb-a6d7-ae48268f599e\" (UID: \"7635ad7a-2665-4edb-a6d7-ae48268f599e\") " Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.543592 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7635ad7a-2665-4edb-a6d7-ae48268f599e-config-data\") pod \"7635ad7a-2665-4edb-a6d7-ae48268f599e\" (UID: \"7635ad7a-2665-4edb-a6d7-ae48268f599e\") " Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.543669 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7635ad7a-2665-4edb-a6d7-ae48268f599e-combined-ca-bundle\") pod \"7635ad7a-2665-4edb-a6d7-ae48268f599e\" (UID: \"7635ad7a-2665-4edb-a6d7-ae48268f599e\") " Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.543702 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9st8\" (UniqueName: \"kubernetes.io/projected/7635ad7a-2665-4edb-a6d7-ae48268f599e-kube-api-access-k9st8\") pod \"7635ad7a-2665-4edb-a6d7-ae48268f599e\" (UID: \"7635ad7a-2665-4edb-a6d7-ae48268f599e\") " Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.544306 4903 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/7635ad7a-2665-4edb-a6d7-ae48268f599e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7635ad7a-2665-4edb-a6d7-ae48268f599e" (UID: "7635ad7a-2665-4edb-a6d7-ae48268f599e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.544410 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7635ad7a-2665-4edb-a6d7-ae48268f599e-logs" (OuterVolumeSpecName: "logs") pod "7635ad7a-2665-4edb-a6d7-ae48268f599e" (UID: "7635ad7a-2665-4edb-a6d7-ae48268f599e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.544608 4903 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7635ad7a-2665-4edb-a6d7-ae48268f599e-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.544623 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7635ad7a-2665-4edb-a6d7-ae48268f599e-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.549092 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7635ad7a-2665-4edb-a6d7-ae48268f599e-scripts" (OuterVolumeSpecName: "scripts") pod "7635ad7a-2665-4edb-a6d7-ae48268f599e" (UID: "7635ad7a-2665-4edb-a6d7-ae48268f599e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.549701 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "7635ad7a-2665-4edb-a6d7-ae48268f599e" (UID: "7635ad7a-2665-4edb-a6d7-ae48268f599e"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.550470 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7635ad7a-2665-4edb-a6d7-ae48268f599e-kube-api-access-k9st8" (OuterVolumeSpecName: "kube-api-access-k9st8") pod "7635ad7a-2665-4edb-a6d7-ae48268f599e" (UID: "7635ad7a-2665-4edb-a6d7-ae48268f599e"). InnerVolumeSpecName "kube-api-access-k9st8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.584900 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7635ad7a-2665-4edb-a6d7-ae48268f599e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7635ad7a-2665-4edb-a6d7-ae48268f599e" (UID: "7635ad7a-2665-4edb-a6d7-ae48268f599e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.599435 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7635ad7a-2665-4edb-a6d7-ae48268f599e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7635ad7a-2665-4edb-a6d7-ae48268f599e" (UID: "7635ad7a-2665-4edb-a6d7-ae48268f599e"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.618736 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7635ad7a-2665-4edb-a6d7-ae48268f599e-config-data" (OuterVolumeSpecName: "config-data") pod "7635ad7a-2665-4edb-a6d7-ae48268f599e" (UID: "7635ad7a-2665-4edb-a6d7-ae48268f599e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.646323 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7635ad7a-2665-4edb-a6d7-ae48268f599e-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.646373 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7635ad7a-2665-4edb-a6d7-ae48268f599e-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.646383 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7635ad7a-2665-4edb-a6d7-ae48268f599e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.646393 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9st8\" (UniqueName: \"kubernetes.io/projected/7635ad7a-2665-4edb-a6d7-ae48268f599e-kube-api-access-k9st8\") on node \"crc\" DevicePath \"\"" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.646420 4903 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.646444 4903 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7635ad7a-2665-4edb-a6d7-ae48268f599e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.666695 4903 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.748060 4903 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.807339 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.815572 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.847410 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 23:18:09 crc kubenswrapper[4903]: E1202 23:18:09.847801 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7635ad7a-2665-4edb-a6d7-ae48268f599e" containerName="glance-httpd" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.847817 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="7635ad7a-2665-4edb-a6d7-ae48268f599e" containerName="glance-httpd" Dec 02 23:18:09 crc kubenswrapper[4903]: E1202 23:18:09.847859 4903 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7635ad7a-2665-4edb-a6d7-ae48268f599e" containerName="glance-log" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.847865 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="7635ad7a-2665-4edb-a6d7-ae48268f599e" containerName="glance-log" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.848023 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="7635ad7a-2665-4edb-a6d7-ae48268f599e" containerName="glance-log" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.848050 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="7635ad7a-2665-4edb-a6d7-ae48268f599e" containerName="glance-httpd" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.848983 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.851258 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.852193 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.870812 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.951289 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddc0e105-7645-48dc-9450-661c4ca40b01-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ddc0e105-7645-48dc-9450-661c4ca40b01\") " pod="openstack/glance-default-external-api-0" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.951335 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddc0e105-7645-48dc-9450-661c4ca40b01-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ddc0e105-7645-48dc-9450-661c4ca40b01\") " pod="openstack/glance-default-external-api-0" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.951360 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddc0e105-7645-48dc-9450-661c4ca40b01-config-data\") pod \"glance-default-external-api-0\" (UID: \"ddc0e105-7645-48dc-9450-661c4ca40b01\") " pod="openstack/glance-default-external-api-0" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.951563 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ddc0e105-7645-48dc-9450-661c4ca40b01\") " pod="openstack/glance-default-external-api-0" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.951589 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddc0e105-7645-48dc-9450-661c4ca40b01-logs\") pod \"glance-default-external-api-0\" (UID: \"ddc0e105-7645-48dc-9450-661c4ca40b01\") " pod="openstack/glance-default-external-api-0" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.951605 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsg4h\" 
(UniqueName: \"kubernetes.io/projected/ddc0e105-7645-48dc-9450-661c4ca40b01-kube-api-access-bsg4h\") pod \"glance-default-external-api-0\" (UID: \"ddc0e105-7645-48dc-9450-661c4ca40b01\") " pod="openstack/glance-default-external-api-0" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.951636 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ddc0e105-7645-48dc-9450-661c4ca40b01-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ddc0e105-7645-48dc-9450-661c4ca40b01\") " pod="openstack/glance-default-external-api-0" Dec 02 23:18:09 crc kubenswrapper[4903]: I1202 23:18:09.951750 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddc0e105-7645-48dc-9450-661c4ca40b01-scripts\") pod \"glance-default-external-api-0\" (UID: \"ddc0e105-7645-48dc-9450-661c4ca40b01\") " pod="openstack/glance-default-external-api-0" Dec 02 23:18:10 crc kubenswrapper[4903]: I1202 23:18:10.053417 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddc0e105-7645-48dc-9450-661c4ca40b01-scripts\") pod \"glance-default-external-api-0\" (UID: \"ddc0e105-7645-48dc-9450-661c4ca40b01\") " pod="openstack/glance-default-external-api-0" Dec 02 23:18:10 crc kubenswrapper[4903]: I1202 23:18:10.053556 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddc0e105-7645-48dc-9450-661c4ca40b01-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ddc0e105-7645-48dc-9450-661c4ca40b01\") " pod="openstack/glance-default-external-api-0" Dec 02 23:18:10 crc kubenswrapper[4903]: I1202 23:18:10.053588 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddc0e105-7645-48dc-9450-661c4ca40b01-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ddc0e105-7645-48dc-9450-661c4ca40b01\") " pod="openstack/glance-default-external-api-0" Dec 02 23:18:10 crc kubenswrapper[4903]: I1202 23:18:10.053610 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddc0e105-7645-48dc-9450-661c4ca40b01-config-data\") pod \"glance-default-external-api-0\" (UID: \"ddc0e105-7645-48dc-9450-661c4ca40b01\") " pod="openstack/glance-default-external-api-0" Dec 02 23:18:10 crc kubenswrapper[4903]: I1202 23:18:10.053644 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ddc0e105-7645-48dc-9450-661c4ca40b01\") " pod="openstack/glance-default-external-api-0" Dec 02 23:18:10 crc kubenswrapper[4903]: I1202 23:18:10.053687 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddc0e105-7645-48dc-9450-661c4ca40b01-logs\") pod \"glance-default-external-api-0\" (UID: \"ddc0e105-7645-48dc-9450-661c4ca40b01\") " pod="openstack/glance-default-external-api-0" Dec 02 23:18:10 crc kubenswrapper[4903]: I1202 23:18:10.053709 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsg4h\" (UniqueName: 
\"kubernetes.io/projected/ddc0e105-7645-48dc-9450-661c4ca40b01-kube-api-access-bsg4h\") pod \"glance-default-external-api-0\" (UID: \"ddc0e105-7645-48dc-9450-661c4ca40b01\") " pod="openstack/glance-default-external-api-0" Dec 02 23:18:10 crc kubenswrapper[4903]: I1202 23:18:10.053747 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ddc0e105-7645-48dc-9450-661c4ca40b01-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ddc0e105-7645-48dc-9450-661c4ca40b01\") " pod="openstack/glance-default-external-api-0" Dec 02 23:18:10 crc kubenswrapper[4903]: I1202 23:18:10.054292 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ddc0e105-7645-48dc-9450-661c4ca40b01-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ddc0e105-7645-48dc-9450-661c4ca40b01\") " pod="openstack/glance-default-external-api-0" Dec 02 23:18:10 crc kubenswrapper[4903]: I1202 23:18:10.054341 4903 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ddc0e105-7645-48dc-9450-661c4ca40b01\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Dec 02 23:18:10 crc kubenswrapper[4903]: I1202 23:18:10.054581 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddc0e105-7645-48dc-9450-661c4ca40b01-logs\") pod \"glance-default-external-api-0\" (UID: \"ddc0e105-7645-48dc-9450-661c4ca40b01\") " pod="openstack/glance-default-external-api-0" Dec 02 23:18:10 crc kubenswrapper[4903]: I1202 23:18:10.059241 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddc0e105-7645-48dc-9450-661c4ca40b01-config-data\") pod \"glance-default-external-api-0\" (UID: \"ddc0e105-7645-48dc-9450-661c4ca40b01\") " pod="openstack/glance-default-external-api-0" Dec 02 23:18:10 crc kubenswrapper[4903]: I1202 23:18:10.061553 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddc0e105-7645-48dc-9450-661c4ca40b01-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ddc0e105-7645-48dc-9450-661c4ca40b01\") " pod="openstack/glance-default-external-api-0" Dec 02 23:18:10 crc kubenswrapper[4903]: I1202 23:18:10.065279 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddc0e105-7645-48dc-9450-661c4ca40b01-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ddc0e105-7645-48dc-9450-661c4ca40b01\") " pod="openstack/glance-default-external-api-0" Dec 02 23:18:10 crc kubenswrapper[4903]: I1202 23:18:10.073787 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsg4h\" (UniqueName: \"kubernetes.io/projected/ddc0e105-7645-48dc-9450-661c4ca40b01-kube-api-access-bsg4h\") pod \"glance-default-external-api-0\" (UID: \"ddc0e105-7645-48dc-9450-661c4ca40b01\") " pod="openstack/glance-default-external-api-0" Dec 02 23:18:10 crc kubenswrapper[4903]: I1202 23:18:10.073856 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddc0e105-7645-48dc-9450-661c4ca40b01-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"ddc0e105-7645-48dc-9450-661c4ca40b01\") " pod="openstack/glance-default-external-api-0" Dec 02 23:18:10 crc kubenswrapper[4903]: I1202 23:18:10.093240 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ddc0e105-7645-48dc-9450-661c4ca40b01\") " pod="openstack/glance-default-external-api-0" Dec 02 23:18:10 crc kubenswrapper[4903]: I1202 23:18:10.164156 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 23:18:10 crc kubenswrapper[4903]: I1202 23:18:10.495009 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"698e25ed-bf30-48be-9bf4-70e25028db42","Type":"ContainerStarted","Data":"7e71cc9a8f4e443caa113bff29adeb51b3d24d46135d95e078b2969540399f26"} Dec 02 23:18:10 crc kubenswrapper[4903]: I1202 23:18:10.496337 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 23:18:10 crc kubenswrapper[4903]: I1202 23:18:10.518583 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.085607494 podStartE2EDuration="4.518559642s" podCreationTimestamp="2025-12-02 23:18:06 +0000 UTC" firstStartedPulling="2025-12-02 23:18:07.31695852 +0000 UTC m=+1226.025512803" lastFinishedPulling="2025-12-02 23:18:09.749910658 +0000 UTC m=+1228.458464951" observedRunningTime="2025-12-02 23:18:10.515509979 +0000 UTC m=+1229.224064282" watchObservedRunningTime="2025-12-02 23:18:10.518559642 +0000 UTC m=+1229.227113925" Dec 02 23:18:10 crc kubenswrapper[4903]: I1202 23:18:10.678026 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 23:18:10 crc kubenswrapper[4903]: I1202 23:18:10.678466 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="67d56d30-2369-440d-8e83-2c424e0a79af" containerName="glance-log" containerID="cri-o://0a295593ab5c744e75bf7437de07c7f3e8c3fd2d4c8c0c48d9b8748df5760742" gracePeriod=30 Dec 02 23:18:10 crc kubenswrapper[4903]: I1202 23:18:10.678513 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="67d56d30-2369-440d-8e83-2c424e0a79af" containerName="glance-httpd" containerID="cri-o://76db4ca849eca8c661ab292eedfd14008171a54a0747348725127fd0cb9f6171" gracePeriod=30 Dec 02 23:18:10 crc kubenswrapper[4903]: I1202 23:18:10.725796 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 23:18:11 crc kubenswrapper[4903]: I1202 23:18:11.015848 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:18:11 crc kubenswrapper[4903]: I1202 23:18:11.513040 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ddc0e105-7645-48dc-9450-661c4ca40b01","Type":"ContainerStarted","Data":"10bbf6ea48c88b0d888977f220a3a6df8f60ca15af619f54a19a5199f72f728b"} Dec 02 23:18:11 crc kubenswrapper[4903]: I1202 23:18:11.513431 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"ddc0e105-7645-48dc-9450-661c4ca40b01","Type":"ContainerStarted","Data":"64c4882f03833e44c16bfa4feb7334030adf7bccf1d59f5ca502086002414bd2"} Dec 02 23:18:11 crc kubenswrapper[4903]: I1202 23:18:11.523422 4903 generic.go:334] "Generic (PLEG): container finished" podID="67d56d30-2369-440d-8e83-2c424e0a79af" containerID="0a295593ab5c744e75bf7437de07c7f3e8c3fd2d4c8c0c48d9b8748df5760742" exitCode=143 Dec 02 23:18:11 crc kubenswrapper[4903]: I1202 23:18:11.523806 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"67d56d30-2369-440d-8e83-2c424e0a79af","Type":"ContainerDied","Data":"0a295593ab5c744e75bf7437de07c7f3e8c3fd2d4c8c0c48d9b8748df5760742"} Dec 02 23:18:11 crc kubenswrapper[4903]: I1202 23:18:11.624787 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7635ad7a-2665-4edb-a6d7-ae48268f599e" path="/var/lib/kubelet/pods/7635ad7a-2665-4edb-a6d7-ae48268f599e/volumes" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.339792 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-tmcbs"] Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.341230 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-tmcbs" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.353763 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-tmcbs"] Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.401841 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcl9h\" (UniqueName: \"kubernetes.io/projected/4e227b00-37a8-409c-916d-3f6d49661795-kube-api-access-zcl9h\") pod \"nova-api-db-create-tmcbs\" (UID: \"4e227b00-37a8-409c-916d-3f6d49661795\") " pod="openstack/nova-api-db-create-tmcbs" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.402011 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e227b00-37a8-409c-916d-3f6d49661795-operator-scripts\") pod \"nova-api-db-create-tmcbs\" (UID: \"4e227b00-37a8-409c-916d-3f6d49661795\") " pod="openstack/nova-api-db-create-tmcbs" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.460575 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-dbf6h"] Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.467013 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-dbf6h" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.507473 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drkqw\" (UniqueName: \"kubernetes.io/projected/68e431bf-fdd1-48de-bccc-ea5b78e37e1a-kube-api-access-drkqw\") pod \"nova-cell0-db-create-dbf6h\" (UID: \"68e431bf-fdd1-48de-bccc-ea5b78e37e1a\") " pod="openstack/nova-cell0-db-create-dbf6h" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.507568 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68e431bf-fdd1-48de-bccc-ea5b78e37e1a-operator-scripts\") pod \"nova-cell0-db-create-dbf6h\" (UID: \"68e431bf-fdd1-48de-bccc-ea5b78e37e1a\") " pod="openstack/nova-cell0-db-create-dbf6h" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.507615 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e227b00-37a8-409c-916d-3f6d49661795-operator-scripts\") pod \"nova-api-db-create-tmcbs\" (UID: \"4e227b00-37a8-409c-916d-3f6d49661795\") " pod="openstack/nova-api-db-create-tmcbs" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.507774 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcl9h\" (UniqueName: \"kubernetes.io/projected/4e227b00-37a8-409c-916d-3f6d49661795-kube-api-access-zcl9h\") pod \"nova-api-db-create-tmcbs\" (UID: \"4e227b00-37a8-409c-916d-3f6d49661795\") " pod="openstack/nova-api-db-create-tmcbs" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.511809 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-dbf6h"] Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.514902 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e227b00-37a8-409c-916d-3f6d49661795-operator-scripts\") pod \"nova-api-db-create-tmcbs\" (UID: \"4e227b00-37a8-409c-916d-3f6d49661795\") " pod="openstack/nova-api-db-create-tmcbs" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.525707 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-b933-account-create-update-ds9hn"] Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.530143 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-b933-account-create-update-ds9hn" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.536768 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-b933-account-create-update-ds9hn"] Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.539136 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.539849 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcl9h\" (UniqueName: \"kubernetes.io/projected/4e227b00-37a8-409c-916d-3f6d49661795-kube-api-access-zcl9h\") pod \"nova-api-db-create-tmcbs\" (UID: \"4e227b00-37a8-409c-916d-3f6d49661795\") " pod="openstack/nova-api-db-create-tmcbs" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.551767 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ddc0e105-7645-48dc-9450-661c4ca40b01","Type":"ContainerStarted","Data":"3fdc74638814009de3bd1fb3981dac29b06e6501e6e2d2123f18ed7259e3bba5"} Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.572132 4903 generic.go:334] "Generic (PLEG): container finished" podID="67d56d30-2369-440d-8e83-2c424e0a79af" containerID="76db4ca849eca8c661ab292eedfd14008171a54a0747348725127fd0cb9f6171" exitCode=0 Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.572454 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="698e25ed-bf30-48be-9bf4-70e25028db42" containerName="ceilometer-central-agent" containerID="cri-o://0ee003a9c4075b41ff99c53342a77b5e00ba5430040f83809ac487befb7e4afc" gracePeriod=30 Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.572774 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"67d56d30-2369-440d-8e83-2c424e0a79af","Type":"ContainerDied","Data":"76db4ca849eca8c661ab292eedfd14008171a54a0747348725127fd0cb9f6171"} Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.573062 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="698e25ed-bf30-48be-9bf4-70e25028db42" containerName="proxy-httpd" containerID="cri-o://7e71cc9a8f4e443caa113bff29adeb51b3d24d46135d95e078b2969540399f26" gracePeriod=30 Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.573121 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="698e25ed-bf30-48be-9bf4-70e25028db42" containerName="sg-core" containerID="cri-o://27424d0f01716f3f24add106b89b6aa7a2be58db580bf1259276f9fe7137749b" gracePeriod=30 Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.573156 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="698e25ed-bf30-48be-9bf4-70e25028db42" containerName="ceilometer-notification-agent" containerID="cri-o://b34bb329ebd4049815779765fe2bf8b0738792c435d30e1ee4d2d4f62c1417ae" gracePeriod=30 Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.588814 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-8krqs"] Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.590923 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-8krqs" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.598959 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-8krqs"] Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.613267 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68xf6\" (UniqueName: \"kubernetes.io/projected/5159b672-4f54-4a9d-a658-bee025a03797-kube-api-access-68xf6\") pod \"nova-api-b933-account-create-update-ds9hn\" (UID: \"5159b672-4f54-4a9d-a658-bee025a03797\") " pod="openstack/nova-api-b933-account-create-update-ds9hn" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.613461 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drkqw\" (UniqueName: \"kubernetes.io/projected/68e431bf-fdd1-48de-bccc-ea5b78e37e1a-kube-api-access-drkqw\") pod \"nova-cell0-db-create-dbf6h\" (UID: \"68e431bf-fdd1-48de-bccc-ea5b78e37e1a\") " pod="openstack/nova-cell0-db-create-dbf6h" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.613498 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5159b672-4f54-4a9d-a658-bee025a03797-operator-scripts\") pod \"nova-api-b933-account-create-update-ds9hn\" (UID: \"5159b672-4f54-4a9d-a658-bee025a03797\") " pod="openstack/nova-api-b933-account-create-update-ds9hn" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.613551 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68e431bf-fdd1-48de-bccc-ea5b78e37e1a-operator-scripts\") pod \"nova-cell0-db-create-dbf6h\" (UID: \"68e431bf-fdd1-48de-bccc-ea5b78e37e1a\") " pod="openstack/nova-cell0-db-create-dbf6h" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.613584 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg82f\" (UniqueName: \"kubernetes.io/projected/e2acd090-402b-4468-ab99-f0c41a763812-kube-api-access-kg82f\") pod \"nova-cell1-db-create-8krqs\" (UID: \"e2acd090-402b-4468-ab99-f0c41a763812\") " pod="openstack/nova-cell1-db-create-8krqs" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.613641 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2acd090-402b-4468-ab99-f0c41a763812-operator-scripts\") pod \"nova-cell1-db-create-8krqs\" (UID: \"e2acd090-402b-4468-ab99-f0c41a763812\") " pod="openstack/nova-cell1-db-create-8krqs" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.615354 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.615335685 podStartE2EDuration="3.615335685s" podCreationTimestamp="2025-12-02 23:18:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:18:12.567717903 +0000 UTC m=+1231.276272186" watchObservedRunningTime="2025-12-02 23:18:12.615335685 +0000 UTC m=+1231.323889968" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.622465 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68e431bf-fdd1-48de-bccc-ea5b78e37e1a-operator-scripts\") 
pod \"nova-cell0-db-create-dbf6h\" (UID: \"68e431bf-fdd1-48de-bccc-ea5b78e37e1a\") " pod="openstack/nova-cell0-db-create-dbf6h" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.643349 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drkqw\" (UniqueName: \"kubernetes.io/projected/68e431bf-fdd1-48de-bccc-ea5b78e37e1a-kube-api-access-drkqw\") pod \"nova-cell0-db-create-dbf6h\" (UID: \"68e431bf-fdd1-48de-bccc-ea5b78e37e1a\") " pod="openstack/nova-cell0-db-create-dbf6h" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.654495 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-bd18-account-create-update-hzb44"] Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.655693 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-bd18-account-create-update-hzb44" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.657522 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.657978 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-tmcbs" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.669047 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-bd18-account-create-update-hzb44"] Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.719300 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5159b672-4f54-4a9d-a658-bee025a03797-operator-scripts\") pod \"nova-api-b933-account-create-update-ds9hn\" (UID: \"5159b672-4f54-4a9d-a658-bee025a03797\") " pod="openstack/nova-api-b933-account-create-update-ds9hn" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.719367 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg82f\" (UniqueName: \"kubernetes.io/projected/e2acd090-402b-4468-ab99-f0c41a763812-kube-api-access-kg82f\") pod \"nova-cell1-db-create-8krqs\" (UID: \"e2acd090-402b-4468-ab99-f0c41a763812\") " pod="openstack/nova-cell1-db-create-8krqs" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.719391 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2acd090-402b-4468-ab99-f0c41a763812-operator-scripts\") pod \"nova-cell1-db-create-8krqs\" (UID: \"e2acd090-402b-4468-ab99-f0c41a763812\") " pod="openstack/nova-cell1-db-create-8krqs" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.719470 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h5x9\" (UniqueName: \"kubernetes.io/projected/eb7516ea-94a3-4d0c-bea5-cd7c0ee303e8-kube-api-access-9h5x9\") pod \"nova-cell0-bd18-account-create-update-hzb44\" (UID: \"eb7516ea-94a3-4d0c-bea5-cd7c0ee303e8\") " pod="openstack/nova-cell0-bd18-account-create-update-hzb44" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.719561 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68xf6\" (UniqueName: \"kubernetes.io/projected/5159b672-4f54-4a9d-a658-bee025a03797-kube-api-access-68xf6\") pod \"nova-api-b933-account-create-update-ds9hn\" (UID: \"5159b672-4f54-4a9d-a658-bee025a03797\") " pod="openstack/nova-api-b933-account-create-update-ds9hn" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 
23:18:12.719636 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb7516ea-94a3-4d0c-bea5-cd7c0ee303e8-operator-scripts\") pod \"nova-cell0-bd18-account-create-update-hzb44\" (UID: \"eb7516ea-94a3-4d0c-bea5-cd7c0ee303e8\") " pod="openstack/nova-cell0-bd18-account-create-update-hzb44" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.721353 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5159b672-4f54-4a9d-a658-bee025a03797-operator-scripts\") pod \"nova-api-b933-account-create-update-ds9hn\" (UID: \"5159b672-4f54-4a9d-a658-bee025a03797\") " pod="openstack/nova-api-b933-account-create-update-ds9hn" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.722150 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2acd090-402b-4468-ab99-f0c41a763812-operator-scripts\") pod \"nova-cell1-db-create-8krqs\" (UID: \"e2acd090-402b-4468-ab99-f0c41a763812\") " pod="openstack/nova-cell1-db-create-8krqs" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.756405 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68xf6\" (UniqueName: \"kubernetes.io/projected/5159b672-4f54-4a9d-a658-bee025a03797-kube-api-access-68xf6\") pod \"nova-api-b933-account-create-update-ds9hn\" (UID: \"5159b672-4f54-4a9d-a658-bee025a03797\") " pod="openstack/nova-api-b933-account-create-update-ds9hn" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.757255 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg82f\" (UniqueName: \"kubernetes.io/projected/e2acd090-402b-4468-ab99-f0c41a763812-kube-api-access-kg82f\") pod \"nova-cell1-db-create-8krqs\" (UID: \"e2acd090-402b-4468-ab99-f0c41a763812\") " pod="openstack/nova-cell1-db-create-8krqs" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.760548 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-7e77-account-create-update-gvv2z"] Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.761792 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-7e77-account-create-update-gvv2z" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.762846 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-8krqs" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.768521 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.778173 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-7e77-account-create-update-gvv2z"] Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.829517 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-dbf6h" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.830864 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb7516ea-94a3-4d0c-bea5-cd7c0ee303e8-operator-scripts\") pod \"nova-cell0-bd18-account-create-update-hzb44\" (UID: \"eb7516ea-94a3-4d0c-bea5-cd7c0ee303e8\") " pod="openstack/nova-cell0-bd18-account-create-update-hzb44" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.830933 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d426b48-454f-49a9-8be0-2fea7237ff7c-operator-scripts\") pod \"nova-cell1-7e77-account-create-update-gvv2z\" (UID: \"1d426b48-454f-49a9-8be0-2fea7237ff7c\") " pod="openstack/nova-cell1-7e77-account-create-update-gvv2z" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.830997 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4m9v\" (UniqueName: \"kubernetes.io/projected/1d426b48-454f-49a9-8be0-2fea7237ff7c-kube-api-access-t4m9v\") pod \"nova-cell1-7e77-account-create-update-gvv2z\" (UID: \"1d426b48-454f-49a9-8be0-2fea7237ff7c\") " pod="openstack/nova-cell1-7e77-account-create-update-gvv2z" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.831031 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h5x9\" (UniqueName: \"kubernetes.io/projected/eb7516ea-94a3-4d0c-bea5-cd7c0ee303e8-kube-api-access-9h5x9\") pod \"nova-cell0-bd18-account-create-update-hzb44\" (UID: \"eb7516ea-94a3-4d0c-bea5-cd7c0ee303e8\") " pod="openstack/nova-cell0-bd18-account-create-update-hzb44" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.831610 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb7516ea-94a3-4d0c-bea5-cd7c0ee303e8-operator-scripts\") pod \"nova-cell0-bd18-account-create-update-hzb44\" (UID: \"eb7516ea-94a3-4d0c-bea5-cd7c0ee303e8\") " pod="openstack/nova-cell0-bd18-account-create-update-hzb44" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.852362 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h5x9\" (UniqueName: \"kubernetes.io/projected/eb7516ea-94a3-4d0c-bea5-cd7c0ee303e8-kube-api-access-9h5x9\") pod \"nova-cell0-bd18-account-create-update-hzb44\" (UID: \"eb7516ea-94a3-4d0c-bea5-cd7c0ee303e8\") " pod="openstack/nova-cell0-bd18-account-create-update-hzb44" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.927428 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-b933-account-create-update-ds9hn" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.933859 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d426b48-454f-49a9-8be0-2fea7237ff7c-operator-scripts\") pod \"nova-cell1-7e77-account-create-update-gvv2z\" (UID: \"1d426b48-454f-49a9-8be0-2fea7237ff7c\") " pod="openstack/nova-cell1-7e77-account-create-update-gvv2z" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.933948 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4m9v\" (UniqueName: \"kubernetes.io/projected/1d426b48-454f-49a9-8be0-2fea7237ff7c-kube-api-access-t4m9v\") pod \"nova-cell1-7e77-account-create-update-gvv2z\" (UID: \"1d426b48-454f-49a9-8be0-2fea7237ff7c\") " pod="openstack/nova-cell1-7e77-account-create-update-gvv2z" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.934724 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d426b48-454f-49a9-8be0-2fea7237ff7c-operator-scripts\") pod \"nova-cell1-7e77-account-create-update-gvv2z\" (UID: \"1d426b48-454f-49a9-8be0-2fea7237ff7c\") " pod="openstack/nova-cell1-7e77-account-create-update-gvv2z" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.953603 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4m9v\" (UniqueName: \"kubernetes.io/projected/1d426b48-454f-49a9-8be0-2fea7237ff7c-kube-api-access-t4m9v\") pod \"nova-cell1-7e77-account-create-update-gvv2z\" (UID: \"1d426b48-454f-49a9-8be0-2fea7237ff7c\") " pod="openstack/nova-cell1-7e77-account-create-update-gvv2z" Dec 02 23:18:12 crc kubenswrapper[4903]: I1202 23:18:12.967141 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.034800 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67d56d30-2369-440d-8e83-2c424e0a79af-scripts\") pod \"67d56d30-2369-440d-8e83-2c424e0a79af\" (UID: \"67d56d30-2369-440d-8e83-2c424e0a79af\") " Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.034865 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67d56d30-2369-440d-8e83-2c424e0a79af-combined-ca-bundle\") pod \"67d56d30-2369-440d-8e83-2c424e0a79af\" (UID: \"67d56d30-2369-440d-8e83-2c424e0a79af\") " Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.034918 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67d56d30-2369-440d-8e83-2c424e0a79af-config-data\") pod \"67d56d30-2369-440d-8e83-2c424e0a79af\" (UID: \"67d56d30-2369-440d-8e83-2c424e0a79af\") " Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.034942 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-td4qw\" (UniqueName: \"kubernetes.io/projected/67d56d30-2369-440d-8e83-2c424e0a79af-kube-api-access-td4qw\") pod \"67d56d30-2369-440d-8e83-2c424e0a79af\" (UID: \"67d56d30-2369-440d-8e83-2c424e0a79af\") " Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.034961 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67d56d30-2369-440d-8e83-2c424e0a79af-internal-tls-certs\") pod \"67d56d30-2369-440d-8e83-2c424e0a79af\" (UID: \"67d56d30-2369-440d-8e83-2c424e0a79af\") " Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.034979 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67d56d30-2369-440d-8e83-2c424e0a79af-logs\") pod \"67d56d30-2369-440d-8e83-2c424e0a79af\" (UID: \"67d56d30-2369-440d-8e83-2c424e0a79af\") " Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.035027 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"67d56d30-2369-440d-8e83-2c424e0a79af\" (UID: \"67d56d30-2369-440d-8e83-2c424e0a79af\") " Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.035052 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/67d56d30-2369-440d-8e83-2c424e0a79af-httpd-run\") pod \"67d56d30-2369-440d-8e83-2c424e0a79af\" (UID: \"67d56d30-2369-440d-8e83-2c424e0a79af\") " Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.035760 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67d56d30-2369-440d-8e83-2c424e0a79af-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "67d56d30-2369-440d-8e83-2c424e0a79af" (UID: "67d56d30-2369-440d-8e83-2c424e0a79af"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.037948 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67d56d30-2369-440d-8e83-2c424e0a79af-logs" (OuterVolumeSpecName: "logs") pod "67d56d30-2369-440d-8e83-2c424e0a79af" (UID: "67d56d30-2369-440d-8e83-2c424e0a79af"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.040521 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67d56d30-2369-440d-8e83-2c424e0a79af-kube-api-access-td4qw" (OuterVolumeSpecName: "kube-api-access-td4qw") pod "67d56d30-2369-440d-8e83-2c424e0a79af" (UID: "67d56d30-2369-440d-8e83-2c424e0a79af"). InnerVolumeSpecName "kube-api-access-td4qw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.042932 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "67d56d30-2369-440d-8e83-2c424e0a79af" (UID: "67d56d30-2369-440d-8e83-2c424e0a79af"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.045746 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67d56d30-2369-440d-8e83-2c424e0a79af-scripts" (OuterVolumeSpecName: "scripts") pod "67d56d30-2369-440d-8e83-2c424e0a79af" (UID: "67d56d30-2369-440d-8e83-2c424e0a79af"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.101360 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-bd18-account-create-update-hzb44" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.106566 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67d56d30-2369-440d-8e83-2c424e0a79af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67d56d30-2369-440d-8e83-2c424e0a79af" (UID: "67d56d30-2369-440d-8e83-2c424e0a79af"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.118607 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67d56d30-2369-440d-8e83-2c424e0a79af-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "67d56d30-2369-440d-8e83-2c424e0a79af" (UID: "67d56d30-2369-440d-8e83-2c424e0a79af"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.137059 4903 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/67d56d30-2369-440d-8e83-2c424e0a79af-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.137091 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67d56d30-2369-440d-8e83-2c424e0a79af-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.137101 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67d56d30-2369-440d-8e83-2c424e0a79af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.137112 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-td4qw\" (UniqueName: \"kubernetes.io/projected/67d56d30-2369-440d-8e83-2c424e0a79af-kube-api-access-td4qw\") on node \"crc\" DevicePath \"\"" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.137121 4903 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67d56d30-2369-440d-8e83-2c424e0a79af-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.137129 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67d56d30-2369-440d-8e83-2c424e0a79af-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.137156 4903 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.139659 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-7e77-account-create-update-gvv2z" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.157698 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67d56d30-2369-440d-8e83-2c424e0a79af-config-data" (OuterVolumeSpecName: "config-data") pod "67d56d30-2369-440d-8e83-2c424e0a79af" (UID: "67d56d30-2369-440d-8e83-2c424e0a79af"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.171807 4903 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.223781 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-tmcbs"] Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.238528 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67d56d30-2369-440d-8e83-2c424e0a79af-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.238548 4903 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.397881 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-8krqs"] Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.455312 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-dbf6h"] Dec 02 23:18:13 crc kubenswrapper[4903]: W1202 23:18:13.462410 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68e431bf_fdd1_48de_bccc_ea5b78e37e1a.slice/crio-01be54ad878a878952abbc26e6d2424ad753d94e820529f3154c9dca24fca945 WatchSource:0}: Error finding container 01be54ad878a878952abbc26e6d2424ad753d94e820529f3154c9dca24fca945: Status 404 returned error can't find the container with id 01be54ad878a878952abbc26e6d2424ad753d94e820529f3154c9dca24fca945 Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.587973 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-b933-account-create-update-ds9hn"] Dec 02 23:18:13 crc kubenswrapper[4903]: W1202 23:18:13.591902 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5159b672_4f54_4a9d_a658_bee025a03797.slice/crio-918bb8512b654f57c3cb365b38688892d9ac5be82f188c53ef8fa9a1fa4df48c WatchSource:0}: Error finding container 918bb8512b654f57c3cb365b38688892d9ac5be82f188c53ef8fa9a1fa4df48c: Status 404 returned error can't find the container with id 918bb8512b654f57c3cb365b38688892d9ac5be82f188c53ef8fa9a1fa4df48c Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.592222 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-tmcbs" event={"ID":"4e227b00-37a8-409c-916d-3f6d49661795","Type":"ContainerStarted","Data":"09f85e7fba69c9268740ef51063c02d51d631cddd70a7cf69eece3db5c7dcbe8"} Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.592260 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-tmcbs" event={"ID":"4e227b00-37a8-409c-916d-3f6d49661795","Type":"ContainerStarted","Data":"fc882e0edf639052129a444c4c9148b3a01e07d7239f0c5df714e24ad57e69a6"} Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.600393 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-dbf6h" event={"ID":"68e431bf-fdd1-48de-bccc-ea5b78e37e1a","Type":"ContainerStarted","Data":"01be54ad878a878952abbc26e6d2424ad753d94e820529f3154c9dca24fca945"} Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.610938 4903 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-tmcbs" podStartSLOduration=1.610920444 podStartE2EDuration="1.610920444s" podCreationTimestamp="2025-12-02 23:18:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:18:13.607383949 +0000 UTC m=+1232.315938222" watchObservedRunningTime="2025-12-02 23:18:13.610920444 +0000 UTC m=+1232.319474727" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.611155 4903 generic.go:334] "Generic (PLEG): container finished" podID="698e25ed-bf30-48be-9bf4-70e25028db42" containerID="7e71cc9a8f4e443caa113bff29adeb51b3d24d46135d95e078b2969540399f26" exitCode=0 Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.611184 4903 generic.go:334] "Generic (PLEG): container finished" podID="698e25ed-bf30-48be-9bf4-70e25028db42" containerID="27424d0f01716f3f24add106b89b6aa7a2be58db580bf1259276f9fe7137749b" exitCode=2 Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.611193 4903 generic.go:334] "Generic (PLEG): container finished" podID="698e25ed-bf30-48be-9bf4-70e25028db42" containerID="b34bb329ebd4049815779765fe2bf8b0738792c435d30e1ee4d2d4f62c1417ae" exitCode=0 Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.611229 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"698e25ed-bf30-48be-9bf4-70e25028db42","Type":"ContainerDied","Data":"7e71cc9a8f4e443caa113bff29adeb51b3d24d46135d95e078b2969540399f26"} Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.611254 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"698e25ed-bf30-48be-9bf4-70e25028db42","Type":"ContainerDied","Data":"27424d0f01716f3f24add106b89b6aa7a2be58db580bf1259276f9fe7137749b"} Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.611266 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"698e25ed-bf30-48be-9bf4-70e25028db42","Type":"ContainerDied","Data":"b34bb329ebd4049815779765fe2bf8b0738792c435d30e1ee4d2d4f62c1417ae"} Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.613354 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.623266 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"67d56d30-2369-440d-8e83-2c424e0a79af","Type":"ContainerDied","Data":"7608660724b5677ddfe1e7930d9e7fcc911c197c2d4f807be6b36834d7da1bc1"} Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.623307 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8krqs" event={"ID":"e2acd090-402b-4468-ab99-f0c41a763812","Type":"ContainerStarted","Data":"df54bbf895cf3991bf0dbbd7c2c05495c7434b57eef3fa810ad114ee62e9783b"} Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.623326 4903 scope.go:117] "RemoveContainer" containerID="76db4ca849eca8c661ab292eedfd14008171a54a0747348725127fd0cb9f6171" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.681607 4903 scope.go:117] "RemoveContainer" containerID="0a295593ab5c744e75bf7437de07c7f3e8c3fd2d4c8c0c48d9b8748df5760742" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.708389 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-bd18-account-create-update-hzb44"] Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.731787 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-7e77-account-create-update-gvv2z"] Dec 02 23:18:13 crc kubenswrapper[4903]: W1202 23:18:13.737136 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d426b48_454f_49a9_8be0_2fea7237ff7c.slice/crio-48b5dff83ab6226c249157f16db275651a07949f71fde111b76d388a48e96f4a WatchSource:0}: Error finding container 48b5dff83ab6226c249157f16db275651a07949f71fde111b76d388a48e96f4a: Status 404 returned error can't find the container with id 48b5dff83ab6226c249157f16db275651a07949f71fde111b76d388a48e96f4a Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.760407 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.811061 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.821610 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 23:18:13 crc kubenswrapper[4903]: E1202 23:18:13.822321 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67d56d30-2369-440d-8e83-2c424e0a79af" containerName="glance-log" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.822339 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="67d56d30-2369-440d-8e83-2c424e0a79af" containerName="glance-log" Dec 02 23:18:13 crc kubenswrapper[4903]: E1202 23:18:13.822383 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67d56d30-2369-440d-8e83-2c424e0a79af" containerName="glance-httpd" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.822390 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="67d56d30-2369-440d-8e83-2c424e0a79af" containerName="glance-httpd" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.822719 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="67d56d30-2369-440d-8e83-2c424e0a79af" containerName="glance-httpd" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.822770 4903 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="67d56d30-2369-440d-8e83-2c424e0a79af" containerName="glance-log" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.825139 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.827586 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.827910 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.840037 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.969810 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b2ebc8f-392e-4650-a033-a23cbe91436e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3b2ebc8f-392e-4650-a033-a23cbe91436e\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.969881 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b2ebc8f-392e-4650-a033-a23cbe91436e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3b2ebc8f-392e-4650-a033-a23cbe91436e\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.970042 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b2ebc8f-392e-4650-a033-a23cbe91436e-logs\") pod \"glance-default-internal-api-0\" (UID: \"3b2ebc8f-392e-4650-a033-a23cbe91436e\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.970077 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b2ebc8f-392e-4650-a033-a23cbe91436e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3b2ebc8f-392e-4650-a033-a23cbe91436e\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.970221 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3b2ebc8f-392e-4650-a033-a23cbe91436e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3b2ebc8f-392e-4650-a033-a23cbe91436e\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.970340 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jscgg\" (UniqueName: \"kubernetes.io/projected/3b2ebc8f-392e-4650-a033-a23cbe91436e-kube-api-access-jscgg\") pod \"glance-default-internal-api-0\" (UID: \"3b2ebc8f-392e-4650-a033-a23cbe91436e\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.970458 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"3b2ebc8f-392e-4650-a033-a23cbe91436e\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:18:13 crc kubenswrapper[4903]: I1202 23:18:13.970503 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b2ebc8f-392e-4650-a033-a23cbe91436e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3b2ebc8f-392e-4650-a033-a23cbe91436e\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:18:14 crc kubenswrapper[4903]: I1202 23:18:14.072600 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jscgg\" (UniqueName: \"kubernetes.io/projected/3b2ebc8f-392e-4650-a033-a23cbe91436e-kube-api-access-jscgg\") pod \"glance-default-internal-api-0\" (UID: \"3b2ebc8f-392e-4650-a033-a23cbe91436e\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:18:14 crc kubenswrapper[4903]: I1202 23:18:14.072908 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"3b2ebc8f-392e-4650-a033-a23cbe91436e\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:18:14 crc kubenswrapper[4903]: I1202 23:18:14.072940 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b2ebc8f-392e-4650-a033-a23cbe91436e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3b2ebc8f-392e-4650-a033-a23cbe91436e\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:18:14 crc kubenswrapper[4903]: I1202 23:18:14.072969 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b2ebc8f-392e-4650-a033-a23cbe91436e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3b2ebc8f-392e-4650-a033-a23cbe91436e\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:18:14 crc kubenswrapper[4903]: I1202 23:18:14.073006 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b2ebc8f-392e-4650-a033-a23cbe91436e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3b2ebc8f-392e-4650-a033-a23cbe91436e\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:18:14 crc kubenswrapper[4903]: I1202 23:18:14.073048 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b2ebc8f-392e-4650-a033-a23cbe91436e-logs\") pod \"glance-default-internal-api-0\" (UID: \"3b2ebc8f-392e-4650-a033-a23cbe91436e\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:18:14 crc kubenswrapper[4903]: I1202 23:18:14.073064 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b2ebc8f-392e-4650-a033-a23cbe91436e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3b2ebc8f-392e-4650-a033-a23cbe91436e\") " pod="openstack/glance-default-internal-api-0" Dec 02 23:18:14 crc kubenswrapper[4903]: I1202 23:18:14.073116 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3b2ebc8f-392e-4650-a033-a23cbe91436e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3b2ebc8f-392e-4650-a033-a23cbe91436e\") " pod="openstack/glance-default-internal-api-0" 
Dec 02 23:18:14 crc kubenswrapper[4903]: I1202 23:18:14.073623 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3b2ebc8f-392e-4650-a033-a23cbe91436e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3b2ebc8f-392e-4650-a033-a23cbe91436e\") " pod="openstack/glance-default-internal-api-0"
Dec 02 23:18:14 crc kubenswrapper[4903]: I1202 23:18:14.073843 4903 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"3b2ebc8f-392e-4650-a033-a23cbe91436e\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0"
Dec 02 23:18:14 crc kubenswrapper[4903]: I1202 23:18:14.074135 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b2ebc8f-392e-4650-a033-a23cbe91436e-logs\") pod \"glance-default-internal-api-0\" (UID: \"3b2ebc8f-392e-4650-a033-a23cbe91436e\") " pod="openstack/glance-default-internal-api-0"
Dec 02 23:18:14 crc kubenswrapper[4903]: I1202 23:18:14.078420 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b2ebc8f-392e-4650-a033-a23cbe91436e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3b2ebc8f-392e-4650-a033-a23cbe91436e\") " pod="openstack/glance-default-internal-api-0"
Dec 02 23:18:14 crc kubenswrapper[4903]: I1202 23:18:14.078779 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b2ebc8f-392e-4650-a033-a23cbe91436e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3b2ebc8f-392e-4650-a033-a23cbe91436e\") " pod="openstack/glance-default-internal-api-0"
Dec 02 23:18:14 crc kubenswrapper[4903]: I1202 23:18:14.079096 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b2ebc8f-392e-4650-a033-a23cbe91436e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3b2ebc8f-392e-4650-a033-a23cbe91436e\") " pod="openstack/glance-default-internal-api-0"
Dec 02 23:18:14 crc kubenswrapper[4903]: I1202 23:18:14.079704 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b2ebc8f-392e-4650-a033-a23cbe91436e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3b2ebc8f-392e-4650-a033-a23cbe91436e\") " pod="openstack/glance-default-internal-api-0"
Dec 02 23:18:14 crc kubenswrapper[4903]: I1202 23:18:14.095077 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jscgg\" (UniqueName: \"kubernetes.io/projected/3b2ebc8f-392e-4650-a033-a23cbe91436e-kube-api-access-jscgg\") pod \"glance-default-internal-api-0\" (UID: \"3b2ebc8f-392e-4650-a033-a23cbe91436e\") " pod="openstack/glance-default-internal-api-0"
Dec 02 23:18:14 crc kubenswrapper[4903]: I1202 23:18:14.127197 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"3b2ebc8f-392e-4650-a033-a23cbe91436e\") " pod="openstack/glance-default-internal-api-0"
Dec 02 23:18:14 crc kubenswrapper[4903]: I1202 23:18:14.153586 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 02 23:18:14 crc kubenswrapper[4903]: I1202 23:18:14.633818 4903 generic.go:334] "Generic (PLEG): container finished" podID="68e431bf-fdd1-48de-bccc-ea5b78e37e1a" containerID="fa1c51ff21fc1b98f42c9d5dfa0bd48272f3369c37eb8fd63c01cc20e31852e7" exitCode=0
Dec 02 23:18:14 crc kubenswrapper[4903]: I1202 23:18:14.633876 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-dbf6h" event={"ID":"68e431bf-fdd1-48de-bccc-ea5b78e37e1a","Type":"ContainerDied","Data":"fa1c51ff21fc1b98f42c9d5dfa0bd48272f3369c37eb8fd63c01cc20e31852e7"}
Dec 02 23:18:14 crc kubenswrapper[4903]: I1202 23:18:14.635819 4903 generic.go:334] "Generic (PLEG): container finished" podID="1d426b48-454f-49a9-8be0-2fea7237ff7c" containerID="c61fcdcc06287d81c4c5fe20e7579184599004e4db4b0c1426445f36df8ea73f" exitCode=0
Dec 02 23:18:14 crc kubenswrapper[4903]: I1202 23:18:14.635872 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7e77-account-create-update-gvv2z" event={"ID":"1d426b48-454f-49a9-8be0-2fea7237ff7c","Type":"ContainerDied","Data":"c61fcdcc06287d81c4c5fe20e7579184599004e4db4b0c1426445f36df8ea73f"}
Dec 02 23:18:14 crc kubenswrapper[4903]: I1202 23:18:14.635899 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7e77-account-create-update-gvv2z" event={"ID":"1d426b48-454f-49a9-8be0-2fea7237ff7c","Type":"ContainerStarted","Data":"48b5dff83ab6226c249157f16db275651a07949f71fde111b76d388a48e96f4a"}
Dec 02 23:18:14 crc kubenswrapper[4903]: I1202 23:18:14.637613 4903 generic.go:334] "Generic (PLEG): container finished" podID="5159b672-4f54-4a9d-a658-bee025a03797" containerID="d1518b53fd06b9129dac97a57713df93493c07251bdf5f6b59b90c7a977c583b" exitCode=0
Dec 02 23:18:14 crc kubenswrapper[4903]: I1202 23:18:14.637694 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-b933-account-create-update-ds9hn" event={"ID":"5159b672-4f54-4a9d-a658-bee025a03797","Type":"ContainerDied","Data":"d1518b53fd06b9129dac97a57713df93493c07251bdf5f6b59b90c7a977c583b"}
Dec 02 23:18:14 crc kubenswrapper[4903]: I1202 23:18:14.637763 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-b933-account-create-update-ds9hn" event={"ID":"5159b672-4f54-4a9d-a658-bee025a03797","Type":"ContainerStarted","Data":"918bb8512b654f57c3cb365b38688892d9ac5be82f188c53ef8fa9a1fa4df48c"}
Dec 02 23:18:14 crc kubenswrapper[4903]: I1202 23:18:14.641102 4903 generic.go:334] "Generic (PLEG): container finished" podID="eb7516ea-94a3-4d0c-bea5-cd7c0ee303e8" containerID="a1a8ac2e73f6ddeab0eecb16ff1ff5a24d2e1c62c7b288a33df9f8022e629187" exitCode=0
Dec 02 23:18:14 crc kubenswrapper[4903]: I1202 23:18:14.641148 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-bd18-account-create-update-hzb44" event={"ID":"eb7516ea-94a3-4d0c-bea5-cd7c0ee303e8","Type":"ContainerDied","Data":"a1a8ac2e73f6ddeab0eecb16ff1ff5a24d2e1c62c7b288a33df9f8022e629187"}
Dec 02 23:18:14 crc kubenswrapper[4903]: I1202 23:18:14.641169 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-bd18-account-create-update-hzb44" event={"ID":"eb7516ea-94a3-4d0c-bea5-cd7c0ee303e8","Type":"ContainerStarted","Data":"99d35082c271fd380b0831690f0561701724cae6147fb237085a458142820c6c"}
Dec 02 23:18:14 crc kubenswrapper[4903]: I1202 23:18:14.646303 4903 generic.go:334] "Generic (PLEG): container finished" podID="e2acd090-402b-4468-ab99-f0c41a763812" containerID="d386d1ae453579c83224d3f8ba3eb5222349c9b28c054ea10002429664d26afd" exitCode=0
Dec 02 23:18:14 crc kubenswrapper[4903]: I1202 23:18:14.646334 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8krqs" event={"ID":"e2acd090-402b-4468-ab99-f0c41a763812","Type":"ContainerDied","Data":"d386d1ae453579c83224d3f8ba3eb5222349c9b28c054ea10002429664d26afd"}
Dec 02 23:18:14 crc kubenswrapper[4903]: I1202 23:18:14.650492 4903 generic.go:334] "Generic (PLEG): container finished" podID="4e227b00-37a8-409c-916d-3f6d49661795" containerID="09f85e7fba69c9268740ef51063c02d51d631cddd70a7cf69eece3db5c7dcbe8" exitCode=0
Dec 02 23:18:14 crc kubenswrapper[4903]: I1202 23:18:14.650551 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-tmcbs" event={"ID":"4e227b00-37a8-409c-916d-3f6d49661795","Type":"ContainerDied","Data":"09f85e7fba69c9268740ef51063c02d51d631cddd70a7cf69eece3db5c7dcbe8"}
Dec 02 23:18:14 crc kubenswrapper[4903]: I1202 23:18:14.739353 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 02 23:18:15 crc kubenswrapper[4903]: I1202 23:18:15.633905 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67d56d30-2369-440d-8e83-2c424e0a79af" path="/var/lib/kubelet/pods/67d56d30-2369-440d-8e83-2c424e0a79af/volumes"
Dec 02 23:18:15 crc kubenswrapper[4903]: I1202 23:18:15.688733 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3b2ebc8f-392e-4650-a033-a23cbe91436e","Type":"ContainerStarted","Data":"bcf749291e3c0bf788cc773c188622a1f0996b1dd95195ec2ebc39ee8bdef592"}
Dec 02 23:18:15 crc kubenswrapper[4903]: I1202 23:18:15.688789 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3b2ebc8f-392e-4650-a033-a23cbe91436e","Type":"ContainerStarted","Data":"92ae5c187067185951e6e794bf2bdc4b3df0d32ce57a719cdd00cc323cd45cc5"}
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.112701 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-b933-account-create-update-ds9hn"
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.230821 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68xf6\" (UniqueName: \"kubernetes.io/projected/5159b672-4f54-4a9d-a658-bee025a03797-kube-api-access-68xf6\") pod \"5159b672-4f54-4a9d-a658-bee025a03797\" (UID: \"5159b672-4f54-4a9d-a658-bee025a03797\") "
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.231445 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5159b672-4f54-4a9d-a658-bee025a03797-operator-scripts\") pod \"5159b672-4f54-4a9d-a658-bee025a03797\" (UID: \"5159b672-4f54-4a9d-a658-bee025a03797\") "
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.233816 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5159b672-4f54-4a9d-a658-bee025a03797-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5159b672-4f54-4a9d-a658-bee025a03797" (UID: "5159b672-4f54-4a9d-a658-bee025a03797"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.238053 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5159b672-4f54-4a9d-a658-bee025a03797-kube-api-access-68xf6" (OuterVolumeSpecName: "kube-api-access-68xf6") pod "5159b672-4f54-4a9d-a658-bee025a03797" (UID: "5159b672-4f54-4a9d-a658-bee025a03797"). InnerVolumeSpecName "kube-api-access-68xf6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.334665 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68xf6\" (UniqueName: \"kubernetes.io/projected/5159b672-4f54-4a9d-a658-bee025a03797-kube-api-access-68xf6\") on node \"crc\" DevicePath \"\""
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.334698 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5159b672-4f54-4a9d-a658-bee025a03797-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.358250 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-dbf6h"
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.370738 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-bd18-account-create-update-hzb44"
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.371027 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-8krqs"
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.386861 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-7e77-account-create-update-gvv2z"
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.388784 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-tmcbs"
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.435699 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68e431bf-fdd1-48de-bccc-ea5b78e37e1a-operator-scripts\") pod \"68e431bf-fdd1-48de-bccc-ea5b78e37e1a\" (UID: \"68e431bf-fdd1-48de-bccc-ea5b78e37e1a\") "
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.436127 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68e431bf-fdd1-48de-bccc-ea5b78e37e1a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "68e431bf-fdd1-48de-bccc-ea5b78e37e1a" (UID: "68e431bf-fdd1-48de-bccc-ea5b78e37e1a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.436295 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drkqw\" (UniqueName: \"kubernetes.io/projected/68e431bf-fdd1-48de-bccc-ea5b78e37e1a-kube-api-access-drkqw\") pod \"68e431bf-fdd1-48de-bccc-ea5b78e37e1a\" (UID: \"68e431bf-fdd1-48de-bccc-ea5b78e37e1a\") "
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.436761 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68e431bf-fdd1-48de-bccc-ea5b78e37e1a-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.439448 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68e431bf-fdd1-48de-bccc-ea5b78e37e1a-kube-api-access-drkqw" (OuterVolumeSpecName: "kube-api-access-drkqw") pod "68e431bf-fdd1-48de-bccc-ea5b78e37e1a" (UID: "68e431bf-fdd1-48de-bccc-ea5b78e37e1a"). InnerVolumeSpecName "kube-api-access-drkqw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.538074 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb7516ea-94a3-4d0c-bea5-cd7c0ee303e8-operator-scripts\") pod \"eb7516ea-94a3-4d0c-bea5-cd7c0ee303e8\" (UID: \"eb7516ea-94a3-4d0c-bea5-cd7c0ee303e8\") "
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.538315 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d426b48-454f-49a9-8be0-2fea7237ff7c-operator-scripts\") pod \"1d426b48-454f-49a9-8be0-2fea7237ff7c\" (UID: \"1d426b48-454f-49a9-8be0-2fea7237ff7c\") "
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.538437 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcl9h\" (UniqueName: \"kubernetes.io/projected/4e227b00-37a8-409c-916d-3f6d49661795-kube-api-access-zcl9h\") pod \"4e227b00-37a8-409c-916d-3f6d49661795\" (UID: \"4e227b00-37a8-409c-916d-3f6d49661795\") "
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.538600 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2acd090-402b-4468-ab99-f0c41a763812-operator-scripts\") pod \"e2acd090-402b-4468-ab99-f0c41a763812\" (UID: \"e2acd090-402b-4468-ab99-f0c41a763812\") "
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.538742 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kg82f\" (UniqueName: \"kubernetes.io/projected/e2acd090-402b-4468-ab99-f0c41a763812-kube-api-access-kg82f\") pod \"e2acd090-402b-4468-ab99-f0c41a763812\" (UID: \"e2acd090-402b-4468-ab99-f0c41a763812\") "
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.538901 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4m9v\" (UniqueName: \"kubernetes.io/projected/1d426b48-454f-49a9-8be0-2fea7237ff7c-kube-api-access-t4m9v\") pod \"1d426b48-454f-49a9-8be0-2fea7237ff7c\" (UID: \"1d426b48-454f-49a9-8be0-2fea7237ff7c\") "
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.539018 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e227b00-37a8-409c-916d-3f6d49661795-operator-scripts\") pod \"4e227b00-37a8-409c-916d-3f6d49661795\" (UID: \"4e227b00-37a8-409c-916d-3f6d49661795\") "
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.538592 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb7516ea-94a3-4d0c-bea5-cd7c0ee303e8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eb7516ea-94a3-4d0c-bea5-cd7c0ee303e8" (UID: "eb7516ea-94a3-4d0c-bea5-cd7c0ee303e8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.538631 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d426b48-454f-49a9-8be0-2fea7237ff7c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1d426b48-454f-49a9-8be0-2fea7237ff7c" (UID: "1d426b48-454f-49a9-8be0-2fea7237ff7c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.538836 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2acd090-402b-4468-ab99-f0c41a763812-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e2acd090-402b-4468-ab99-f0c41a763812" (UID: "e2acd090-402b-4468-ab99-f0c41a763812"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.539359 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9h5x9\" (UniqueName: \"kubernetes.io/projected/eb7516ea-94a3-4d0c-bea5-cd7c0ee303e8-kube-api-access-9h5x9\") pod \"eb7516ea-94a3-4d0c-bea5-cd7c0ee303e8\" (UID: \"eb7516ea-94a3-4d0c-bea5-cd7c0ee303e8\") "
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.540324 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2acd090-402b-4468-ab99-f0c41a763812-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.540411 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drkqw\" (UniqueName: \"kubernetes.io/projected/68e431bf-fdd1-48de-bccc-ea5b78e37e1a-kube-api-access-drkqw\") on node \"crc\" DevicePath \"\""
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.540887 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb7516ea-94a3-4d0c-bea5-cd7c0ee303e8-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.540977 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d426b48-454f-49a9-8be0-2fea7237ff7c-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.539798 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e227b00-37a8-409c-916d-3f6d49661795-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4e227b00-37a8-409c-916d-3f6d49661795" (UID: "4e227b00-37a8-409c-916d-3f6d49661795"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.543075 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e227b00-37a8-409c-916d-3f6d49661795-kube-api-access-zcl9h" (OuterVolumeSpecName: "kube-api-access-zcl9h") pod "4e227b00-37a8-409c-916d-3f6d49661795" (UID: "4e227b00-37a8-409c-916d-3f6d49661795"). InnerVolumeSpecName "kube-api-access-zcl9h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.543775 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2acd090-402b-4468-ab99-f0c41a763812-kube-api-access-kg82f" (OuterVolumeSpecName: "kube-api-access-kg82f") pod "e2acd090-402b-4468-ab99-f0c41a763812" (UID: "e2acd090-402b-4468-ab99-f0c41a763812"). InnerVolumeSpecName "kube-api-access-kg82f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.544223 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb7516ea-94a3-4d0c-bea5-cd7c0ee303e8-kube-api-access-9h5x9" (OuterVolumeSpecName: "kube-api-access-9h5x9") pod "eb7516ea-94a3-4d0c-bea5-cd7c0ee303e8" (UID: "eb7516ea-94a3-4d0c-bea5-cd7c0ee303e8"). InnerVolumeSpecName "kube-api-access-9h5x9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.550961 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d426b48-454f-49a9-8be0-2fea7237ff7c-kube-api-access-t4m9v" (OuterVolumeSpecName: "kube-api-access-t4m9v") pod "1d426b48-454f-49a9-8be0-2fea7237ff7c" (UID: "1d426b48-454f-49a9-8be0-2fea7237ff7c"). InnerVolumeSpecName "kube-api-access-t4m9v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.642903 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcl9h\" (UniqueName: \"kubernetes.io/projected/4e227b00-37a8-409c-916d-3f6d49661795-kube-api-access-zcl9h\") on node \"crc\" DevicePath \"\""
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.642944 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kg82f\" (UniqueName: \"kubernetes.io/projected/e2acd090-402b-4468-ab99-f0c41a763812-kube-api-access-kg82f\") on node \"crc\" DevicePath \"\""
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.642957 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4m9v\" (UniqueName: \"kubernetes.io/projected/1d426b48-454f-49a9-8be0-2fea7237ff7c-kube-api-access-t4m9v\") on node \"crc\" DevicePath \"\""
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.642970 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e227b00-37a8-409c-916d-3f6d49661795-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.642983 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9h5x9\" (UniqueName: \"kubernetes.io/projected/eb7516ea-94a3-4d0c-bea5-cd7c0ee303e8-kube-api-access-9h5x9\") on node \"crc\" DevicePath \"\""
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.697481 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3b2ebc8f-392e-4650-a033-a23cbe91436e","Type":"ContainerStarted","Data":"44dc4f9e444ea72e5d6cde160c3d99e9bab4483193bb1c676a6f31774372ee24"}
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.698880 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-bd18-account-create-update-hzb44" event={"ID":"eb7516ea-94a3-4d0c-bea5-cd7c0ee303e8","Type":"ContainerDied","Data":"99d35082c271fd380b0831690f0561701724cae6147fb237085a458142820c6c"}
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.698908 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-bd18-account-create-update-hzb44"
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.698925 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99d35082c271fd380b0831690f0561701724cae6147fb237085a458142820c6c"
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.700135 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8krqs" event={"ID":"e2acd090-402b-4468-ab99-f0c41a763812","Type":"ContainerDied","Data":"df54bbf895cf3991bf0dbbd7c2c05495c7434b57eef3fa810ad114ee62e9783b"}
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.700173 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df54bbf895cf3991bf0dbbd7c2c05495c7434b57eef3fa810ad114ee62e9783b"
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.700264 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-8krqs"
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.701556 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-tmcbs" event={"ID":"4e227b00-37a8-409c-916d-3f6d49661795","Type":"ContainerDied","Data":"fc882e0edf639052129a444c4c9148b3a01e07d7239f0c5df714e24ad57e69a6"}
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.701583 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc882e0edf639052129a444c4c9148b3a01e07d7239f0c5df714e24ad57e69a6"
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.701594 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-tmcbs"
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.711930 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-dbf6h"
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.712057 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-dbf6h" event={"ID":"68e431bf-fdd1-48de-bccc-ea5b78e37e1a","Type":"ContainerDied","Data":"01be54ad878a878952abbc26e6d2424ad753d94e820529f3154c9dca24fca945"}
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.712086 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01be54ad878a878952abbc26e6d2424ad753d94e820529f3154c9dca24fca945"
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.716995 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7e77-account-create-update-gvv2z" event={"ID":"1d426b48-454f-49a9-8be0-2fea7237ff7c","Type":"ContainerDied","Data":"48b5dff83ab6226c249157f16db275651a07949f71fde111b76d388a48e96f4a"}
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.717040 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48b5dff83ab6226c249157f16db275651a07949f71fde111b76d388a48e96f4a"
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.717316 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-7e77-account-create-update-gvv2z"
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.718834 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-b933-account-create-update-ds9hn" event={"ID":"5159b672-4f54-4a9d-a658-bee025a03797","Type":"ContainerDied","Data":"918bb8512b654f57c3cb365b38688892d9ac5be82f188c53ef8fa9a1fa4df48c"}
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.718857 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="918bb8512b654f57c3cb365b38688892d9ac5be82f188c53ef8fa9a1fa4df48c"
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.718902 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-b933-account-create-update-ds9hn"
Dec 02 23:18:16 crc kubenswrapper[4903]: I1202 23:18:16.731439 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.731424284 podStartE2EDuration="3.731424284s" podCreationTimestamp="2025-12-02 23:18:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:18:16.724821454 +0000 UTC m=+1235.433375737" watchObservedRunningTime="2025-12-02 23:18:16.731424284 +0000 UTC m=+1235.439978567"
Dec 02 23:18:17 crc kubenswrapper[4903]: I1202 23:18:17.914244 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hjz85"]
Dec 02 23:18:17 crc kubenswrapper[4903]: E1202 23:18:17.915043 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e227b00-37a8-409c-916d-3f6d49661795" containerName="mariadb-database-create"
Dec 02 23:18:17 crc kubenswrapper[4903]: I1202 23:18:17.915057 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e227b00-37a8-409c-916d-3f6d49661795" containerName="mariadb-database-create"
Dec 02 23:18:17 crc kubenswrapper[4903]: E1202 23:18:17.915086 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5159b672-4f54-4a9d-a658-bee025a03797" containerName="mariadb-account-create-update"
Dec 02 23:18:17 crc kubenswrapper[4903]: I1202 23:18:17.915094 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="5159b672-4f54-4a9d-a658-bee025a03797" containerName="mariadb-account-create-update"
Dec 02 23:18:17 crc kubenswrapper[4903]: E1202 23:18:17.915112 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb7516ea-94a3-4d0c-bea5-cd7c0ee303e8" containerName="mariadb-account-create-update"
Dec 02 23:18:17 crc kubenswrapper[4903]: I1202 23:18:17.915120 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb7516ea-94a3-4d0c-bea5-cd7c0ee303e8" containerName="mariadb-account-create-update"
Dec 02 23:18:17 crc kubenswrapper[4903]: E1202 23:18:17.915130 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2acd090-402b-4468-ab99-f0c41a763812" containerName="mariadb-database-create"
Dec 02 23:18:17 crc kubenswrapper[4903]: I1202 23:18:17.915138 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2acd090-402b-4468-ab99-f0c41a763812" containerName="mariadb-database-create"
Dec 02 23:18:17 crc kubenswrapper[4903]: E1202 23:18:17.915173 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68e431bf-fdd1-48de-bccc-ea5b78e37e1a" containerName="mariadb-database-create"
Dec 02 23:18:17 crc kubenswrapper[4903]: I1202 23:18:17.915181 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="68e431bf-fdd1-48de-bccc-ea5b78e37e1a" containerName="mariadb-database-create"
Dec 02 23:18:17 crc kubenswrapper[4903]: E1202 23:18:17.915194 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d426b48-454f-49a9-8be0-2fea7237ff7c" containerName="mariadb-account-create-update"
Dec 02 23:18:17 crc kubenswrapper[4903]: I1202 23:18:17.915201 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d426b48-454f-49a9-8be0-2fea7237ff7c" containerName="mariadb-account-create-update"
Dec 02 23:18:17 crc kubenswrapper[4903]: I1202 23:18:17.915399 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2acd090-402b-4468-ab99-f0c41a763812" containerName="mariadb-database-create"
Dec 02 23:18:17 crc kubenswrapper[4903]: I1202 23:18:17.915412 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb7516ea-94a3-4d0c-bea5-cd7c0ee303e8" containerName="mariadb-account-create-update"
Dec 02 23:18:17 crc kubenswrapper[4903]: I1202 23:18:17.915426 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="5159b672-4f54-4a9d-a658-bee025a03797" containerName="mariadb-account-create-update"
Dec 02 23:18:17 crc kubenswrapper[4903]: I1202 23:18:17.915446 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="68e431bf-fdd1-48de-bccc-ea5b78e37e1a" containerName="mariadb-database-create"
Dec 02 23:18:17 crc kubenswrapper[4903]: I1202 23:18:17.915464 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e227b00-37a8-409c-916d-3f6d49661795" containerName="mariadb-database-create"
Dec 02 23:18:17 crc kubenswrapper[4903]: I1202 23:18:17.915482 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d426b48-454f-49a9-8be0-2fea7237ff7c" containerName="mariadb-account-create-update"
Dec 02 23:18:17 crc kubenswrapper[4903]: I1202 23:18:17.916226 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-hjz85"
Dec 02 23:18:17 crc kubenswrapper[4903]: I1202 23:18:17.920354 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Dec 02 23:18:17 crc kubenswrapper[4903]: I1202 23:18:17.920477 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-zl2fh"
Dec 02 23:18:17 crc kubenswrapper[4903]: I1202 23:18:17.921219 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Dec 02 23:18:17 crc kubenswrapper[4903]: I1202 23:18:17.928936 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hjz85"]
Dec 02 23:18:18 crc kubenswrapper[4903]: I1202 23:18:18.072200 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6af62260-60e9-49b0-84b9-3f9cf7361c79-scripts\") pod \"nova-cell0-conductor-db-sync-hjz85\" (UID: \"6af62260-60e9-49b0-84b9-3f9cf7361c79\") " pod="openstack/nova-cell0-conductor-db-sync-hjz85"
Dec 02 23:18:18 crc kubenswrapper[4903]: I1202 23:18:18.072268 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6af62260-60e9-49b0-84b9-3f9cf7361c79-config-data\") pod \"nova-cell0-conductor-db-sync-hjz85\" (UID: \"6af62260-60e9-49b0-84b9-3f9cf7361c79\") " pod="openstack/nova-cell0-conductor-db-sync-hjz85"
Dec 02 23:18:18 crc kubenswrapper[4903]: I1202 23:18:18.072381 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg5cb\" (UniqueName: \"kubernetes.io/projected/6af62260-60e9-49b0-84b9-3f9cf7361c79-kube-api-access-mg5cb\") pod \"nova-cell0-conductor-db-sync-hjz85\" (UID: \"6af62260-60e9-49b0-84b9-3f9cf7361c79\") " pod="openstack/nova-cell0-conductor-db-sync-hjz85"
Dec 02 23:18:18 crc kubenswrapper[4903]: I1202 23:18:18.072545 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6af62260-60e9-49b0-84b9-3f9cf7361c79-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-hjz85\" (UID: \"6af62260-60e9-49b0-84b9-3f9cf7361c79\") " pod="openstack/nova-cell0-conductor-db-sync-hjz85"
Dec 02 23:18:18 crc kubenswrapper[4903]: I1202 23:18:18.174810 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6af62260-60e9-49b0-84b9-3f9cf7361c79-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-hjz85\" (UID: \"6af62260-60e9-49b0-84b9-3f9cf7361c79\") " pod="openstack/nova-cell0-conductor-db-sync-hjz85"
Dec 02 23:18:18 crc kubenswrapper[4903]: I1202 23:18:18.174876 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6af62260-60e9-49b0-84b9-3f9cf7361c79-scripts\") pod \"nova-cell0-conductor-db-sync-hjz85\" (UID: \"6af62260-60e9-49b0-84b9-3f9cf7361c79\") " pod="openstack/nova-cell0-conductor-db-sync-hjz85"
Dec 02 23:18:18 crc kubenswrapper[4903]: I1202 23:18:18.174929 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6af62260-60e9-49b0-84b9-3f9cf7361c79-config-data\") pod \"nova-cell0-conductor-db-sync-hjz85\" (UID: \"6af62260-60e9-49b0-84b9-3f9cf7361c79\") " pod="openstack/nova-cell0-conductor-db-sync-hjz85"
Dec 02 23:18:18 crc kubenswrapper[4903]: I1202 23:18:18.175001 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg5cb\" (UniqueName: \"kubernetes.io/projected/6af62260-60e9-49b0-84b9-3f9cf7361c79-kube-api-access-mg5cb\") pod \"nova-cell0-conductor-db-sync-hjz85\" (UID: \"6af62260-60e9-49b0-84b9-3f9cf7361c79\") " pod="openstack/nova-cell0-conductor-db-sync-hjz85"
Dec 02 23:18:18 crc kubenswrapper[4903]: I1202 23:18:18.180012 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6af62260-60e9-49b0-84b9-3f9cf7361c79-scripts\") pod \"nova-cell0-conductor-db-sync-hjz85\" (UID: \"6af62260-60e9-49b0-84b9-3f9cf7361c79\") " pod="openstack/nova-cell0-conductor-db-sync-hjz85"
Dec 02 23:18:18 crc kubenswrapper[4903]: I1202 23:18:18.180741 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6af62260-60e9-49b0-84b9-3f9cf7361c79-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-hjz85\" (UID: \"6af62260-60e9-49b0-84b9-3f9cf7361c79\") " pod="openstack/nova-cell0-conductor-db-sync-hjz85"
Dec 02 23:18:18 crc kubenswrapper[4903]: I1202 23:18:18.181362 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6af62260-60e9-49b0-84b9-3f9cf7361c79-config-data\") pod \"nova-cell0-conductor-db-sync-hjz85\" (UID: \"6af62260-60e9-49b0-84b9-3f9cf7361c79\") " pod="openstack/nova-cell0-conductor-db-sync-hjz85"
Dec 02 23:18:18 crc kubenswrapper[4903]: I1202 23:18:18.197801 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg5cb\" (UniqueName: \"kubernetes.io/projected/6af62260-60e9-49b0-84b9-3f9cf7361c79-kube-api-access-mg5cb\") pod \"nova-cell0-conductor-db-sync-hjz85\" (UID: \"6af62260-60e9-49b0-84b9-3f9cf7361c79\") " pod="openstack/nova-cell0-conductor-db-sync-hjz85"
Dec 02 23:18:18 crc kubenswrapper[4903]: I1202 23:18:18.233020 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-hjz85"
Dec 02 23:18:18 crc kubenswrapper[4903]: W1202 23:18:18.695443 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6af62260_60e9_49b0_84b9_3f9cf7361c79.slice/crio-bb8173a4adb38ec1bee0952cb3ef98c5e02c42c4d37b2cdd8edd3b205ad4c189 WatchSource:0}: Error finding container bb8173a4adb38ec1bee0952cb3ef98c5e02c42c4d37b2cdd8edd3b205ad4c189: Status 404 returned error can't find the container with id bb8173a4adb38ec1bee0952cb3ef98c5e02c42c4d37b2cdd8edd3b205ad4c189
Dec 02 23:18:18 crc kubenswrapper[4903]: I1202 23:18:18.696289 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hjz85"]
Dec 02 23:18:18 crc kubenswrapper[4903]: I1202 23:18:18.746543 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-hjz85" event={"ID":"6af62260-60e9-49b0-84b9-3f9cf7361c79","Type":"ContainerStarted","Data":"bb8173a4adb38ec1bee0952cb3ef98c5e02c42c4d37b2cdd8edd3b205ad4c189"}
Dec 02 23:18:20 crc kubenswrapper[4903]: I1202 23:18:20.164868 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Dec 02 23:18:20 crc kubenswrapper[4903]: I1202 23:18:20.164909 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Dec 02 23:18:20 crc kubenswrapper[4903]: I1202 23:18:20.203859 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Dec 02 23:18:20 crc kubenswrapper[4903]: I1202 23:18:20.221897 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Dec 02 23:18:20 crc kubenswrapper[4903]: I1202 23:18:20.769380 4903 generic.go:334] "Generic (PLEG): container finished" podID="698e25ed-bf30-48be-9bf4-70e25028db42" containerID="0ee003a9c4075b41ff99c53342a77b5e00ba5430040f83809ac487befb7e4afc" exitCode=0
Dec 02 23:18:20 crc kubenswrapper[4903]: I1202 23:18:20.770362 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"698e25ed-bf30-48be-9bf4-70e25028db42","Type":"ContainerDied","Data":"0ee003a9c4075b41ff99c53342a77b5e00ba5430040f83809ac487befb7e4afc"}
Dec 02 23:18:20 crc kubenswrapper[4903]: I1202 23:18:20.770879 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Dec 02 23:18:20 crc kubenswrapper[4903]: I1202 23:18:20.770908 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Dec 02 23:18:20 crc kubenswrapper[4903]: I1202 23:18:20.889677 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.034538 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/698e25ed-bf30-48be-9bf4-70e25028db42-run-httpd\") pod \"698e25ed-bf30-48be-9bf4-70e25028db42\" (UID: \"698e25ed-bf30-48be-9bf4-70e25028db42\") "
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.034828 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/698e25ed-bf30-48be-9bf4-70e25028db42-sg-core-conf-yaml\") pod \"698e25ed-bf30-48be-9bf4-70e25028db42\" (UID: \"698e25ed-bf30-48be-9bf4-70e25028db42\") "
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.034996 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/698e25ed-bf30-48be-9bf4-70e25028db42-combined-ca-bundle\") pod \"698e25ed-bf30-48be-9bf4-70e25028db42\" (UID: \"698e25ed-bf30-48be-9bf4-70e25028db42\") "
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.035013 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/698e25ed-bf30-48be-9bf4-70e25028db42-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "698e25ed-bf30-48be-9bf4-70e25028db42" (UID: "698e25ed-bf30-48be-9bf4-70e25028db42"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.035032 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsmjk\" (UniqueName: \"kubernetes.io/projected/698e25ed-bf30-48be-9bf4-70e25028db42-kube-api-access-tsmjk\") pod \"698e25ed-bf30-48be-9bf4-70e25028db42\" (UID: \"698e25ed-bf30-48be-9bf4-70e25028db42\") "
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.035133 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/698e25ed-bf30-48be-9bf4-70e25028db42-config-data\") pod \"698e25ed-bf30-48be-9bf4-70e25028db42\" (UID: \"698e25ed-bf30-48be-9bf4-70e25028db42\") "
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.035770 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/698e25ed-bf30-48be-9bf4-70e25028db42-scripts\") pod \"698e25ed-bf30-48be-9bf4-70e25028db42\" (UID: \"698e25ed-bf30-48be-9bf4-70e25028db42\") "
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.035821 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/698e25ed-bf30-48be-9bf4-70e25028db42-log-httpd\") pod \"698e25ed-bf30-48be-9bf4-70e25028db42\" (UID: \"698e25ed-bf30-48be-9bf4-70e25028db42\") "
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.036553 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/698e25ed-bf30-48be-9bf4-70e25028db42-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "698e25ed-bf30-48be-9bf4-70e25028db42" (UID: "698e25ed-bf30-48be-9bf4-70e25028db42"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.036714 4903 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/698e25ed-bf30-48be-9bf4-70e25028db42-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.036729 4903 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/698e25ed-bf30-48be-9bf4-70e25028db42-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.042013 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/698e25ed-bf30-48be-9bf4-70e25028db42-kube-api-access-tsmjk" (OuterVolumeSpecName: "kube-api-access-tsmjk") pod "698e25ed-bf30-48be-9bf4-70e25028db42" (UID: "698e25ed-bf30-48be-9bf4-70e25028db42"). InnerVolumeSpecName "kube-api-access-tsmjk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.050790 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/698e25ed-bf30-48be-9bf4-70e25028db42-scripts" (OuterVolumeSpecName: "scripts") pod "698e25ed-bf30-48be-9bf4-70e25028db42" (UID: "698e25ed-bf30-48be-9bf4-70e25028db42"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.076837 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/698e25ed-bf30-48be-9bf4-70e25028db42-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "698e25ed-bf30-48be-9bf4-70e25028db42" (UID: "698e25ed-bf30-48be-9bf4-70e25028db42"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.127944 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/698e25ed-bf30-48be-9bf4-70e25028db42-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "698e25ed-bf30-48be-9bf4-70e25028db42" (UID: "698e25ed-bf30-48be-9bf4-70e25028db42"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.138772 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsmjk\" (UniqueName: \"kubernetes.io/projected/698e25ed-bf30-48be-9bf4-70e25028db42-kube-api-access-tsmjk\") on node \"crc\" DevicePath \"\""
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.138801 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/698e25ed-bf30-48be-9bf4-70e25028db42-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.138811 4903 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/698e25ed-bf30-48be-9bf4-70e25028db42-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.138822 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/698e25ed-bf30-48be-9bf4-70e25028db42-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.176506 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/698e25ed-bf30-48be-9bf4-70e25028db42-config-data" (OuterVolumeSpecName: "config-data") pod "698e25ed-bf30-48be-9bf4-70e25028db42" (UID: "698e25ed-bf30-48be-9bf4-70e25028db42"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.241112 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/698e25ed-bf30-48be-9bf4-70e25028db42-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.619356 4903 scope.go:117] "RemoveContainer" containerID="5dee57eff07796a288e3f2bb7e5636eef4388a613cdaaa20bd3ad6d3bb553f8b"
Dec 02 23:18:21 crc kubenswrapper[4903]: E1202 23:18:21.619558 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(6de4a117-0c91-47f4-a80d-278debb3ea60)\"" pod="openstack/watcher-decision-engine-0" podUID="6de4a117-0c91-47f4-a80d-278debb3ea60"
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.784339 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"698e25ed-bf30-48be-9bf4-70e25028db42","Type":"ContainerDied","Data":"b1107120c46cc7249bd674883df7b4cb8b60993ca2765741a9c8ff4639915c40"}
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.784378 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.784418 4903 scope.go:117] "RemoveContainer" containerID="7e71cc9a8f4e443caa113bff29adeb51b3d24d46135d95e078b2969540399f26"
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.808007 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.830055 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.846916 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 02 23:18:21 crc kubenswrapper[4903]: E1202 23:18:21.847323 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="698e25ed-bf30-48be-9bf4-70e25028db42" containerName="proxy-httpd"
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.847341 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="698e25ed-bf30-48be-9bf4-70e25028db42" containerName="proxy-httpd"
Dec 02 23:18:21 crc kubenswrapper[4903]: E1202 23:18:21.847377 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="698e25ed-bf30-48be-9bf4-70e25028db42" containerName="sg-core"
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.847384 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="698e25ed-bf30-48be-9bf4-70e25028db42" containerName="sg-core"
Dec 02 23:18:21 crc kubenswrapper[4903]: E1202 23:18:21.847398 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="698e25ed-bf30-48be-9bf4-70e25028db42" containerName="ceilometer-central-agent"
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.847406 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="698e25ed-bf30-48be-9bf4-70e25028db42" containerName="ceilometer-central-agent"
Dec 02 23:18:21 crc kubenswrapper[4903]: E1202 23:18:21.847418 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="698e25ed-bf30-48be-9bf4-70e25028db42" containerName="ceilometer-notification-agent"
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.847424 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="698e25ed-bf30-48be-9bf4-70e25028db42" containerName="ceilometer-notification-agent"
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.847603 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="698e25ed-bf30-48be-9bf4-70e25028db42" containerName="ceilometer-central-agent"
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.847620 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="698e25ed-bf30-48be-9bf4-70e25028db42" containerName="proxy-httpd"
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.847631 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="698e25ed-bf30-48be-9bf4-70e25028db42" containerName="ceilometer-notification-agent"
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.847662 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="698e25ed-bf30-48be-9bf4-70e25028db42" containerName="sg-core"
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.849243 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.849330 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.851643 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.851933 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.968136 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a7dab846-dbcc-4406-a2d7-d7348d71a350-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a7dab846-dbcc-4406-a2d7-d7348d71a350\") " pod="openstack/ceilometer-0"
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.968194 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7dab846-dbcc-4406-a2d7-d7348d71a350-config-data\") pod \"ceilometer-0\" (UID: \"a7dab846-dbcc-4406-a2d7-d7348d71a350\") " pod="openstack/ceilometer-0"
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.968288 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7dab846-dbcc-4406-a2d7-d7348d71a350-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a7dab846-dbcc-4406-a2d7-d7348d71a350\") " pod="openstack/ceilometer-0"
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.968363 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7dab846-dbcc-4406-a2d7-d7348d71a350-run-httpd\") pod \"ceilometer-0\" (UID: \"a7dab846-dbcc-4406-a2d7-d7348d71a350\") " pod="openstack/ceilometer-0"
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.968404 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7dab846-dbcc-4406-a2d7-d7348d71a350-scripts\") pod \"ceilometer-0\" (UID: \"a7dab846-dbcc-4406-a2d7-d7348d71a350\") " pod="openstack/ceilometer-0"
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.968429 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7dab846-dbcc-4406-a2d7-d7348d71a350-log-httpd\") pod \"ceilometer-0\" (UID: \"a7dab846-dbcc-4406-a2d7-d7348d71a350\") " pod="openstack/ceilometer-0"
Dec 02 23:18:21 crc kubenswrapper[4903]: I1202 23:18:21.968480 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjjd8\" (UniqueName: \"kubernetes.io/projected/a7dab846-dbcc-4406-a2d7-d7348d71a350-kube-api-access-tjjd8\") pod \"ceilometer-0\" (UID: \"a7dab846-dbcc-4406-a2d7-d7348d71a350\") " pod="openstack/ceilometer-0"
Dec 02 23:18:22 crc kubenswrapper[4903]: I1202 23:18:22.070138 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7dab846-dbcc-4406-a2d7-d7348d71a350-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a7dab846-dbcc-4406-a2d7-d7348d71a350\") " pod="openstack/ceilometer-0"
Dec 02 23:18:22 crc kubenswrapper[4903]: I1202 23:18:22.070207 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7dab846-dbcc-4406-a2d7-d7348d71a350-run-httpd\") pod \"ceilometer-0\" (UID: \"a7dab846-dbcc-4406-a2d7-d7348d71a350\") " pod="openstack/ceilometer-0"
Dec 02 23:18:22 crc kubenswrapper[4903]: I1202 23:18:22.070239 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7dab846-dbcc-4406-a2d7-d7348d71a350-scripts\") pod \"ceilometer-0\" (UID: \"a7dab846-dbcc-4406-a2d7-d7348d71a350\") " pod="openstack/ceilometer-0"
Dec 02 23:18:22 crc kubenswrapper[4903]: I1202 23:18:22.070256 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7dab846-dbcc-4406-a2d7-d7348d71a350-log-httpd\") pod \"ceilometer-0\" (UID: \"a7dab846-dbcc-4406-a2d7-d7348d71a350\") " pod="openstack/ceilometer-0"
Dec 02 23:18:22 crc kubenswrapper[4903]: I1202 23:18:22.070293 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjjd8\" (UniqueName: \"kubernetes.io/projected/a7dab846-dbcc-4406-a2d7-d7348d71a350-kube-api-access-tjjd8\") pod \"ceilometer-0\" (UID: \"a7dab846-dbcc-4406-a2d7-d7348d71a350\") " pod="openstack/ceilometer-0"
Dec 02 23:18:22 crc kubenswrapper[4903]: I1202 23:18:22.070349 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a7dab846-dbcc-4406-a2d7-d7348d71a350-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a7dab846-dbcc-4406-a2d7-d7348d71a350\") " pod="openstack/ceilometer-0"
Dec 02 23:18:22 crc kubenswrapper[4903]: I1202 23:18:22.070371 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7dab846-dbcc-4406-a2d7-d7348d71a350-config-data\") pod \"ceilometer-0\" (UID: \"a7dab846-dbcc-4406-a2d7-d7348d71a350\") " pod="openstack/ceilometer-0"
Dec 02 23:18:22 crc kubenswrapper[4903]: I1202 23:18:22.071412 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7dab846-dbcc-4406-a2d7-d7348d71a350-run-httpd\") pod \"ceilometer-0\" (UID: \"a7dab846-dbcc-4406-a2d7-d7348d71a350\") " pod="openstack/ceilometer-0"
Dec 02 23:18:22 crc kubenswrapper[4903]: I1202 23:18:22.071505 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7dab846-dbcc-4406-a2d7-d7348d71a350-log-httpd\") pod \"ceilometer-0\" (UID: \"a7dab846-dbcc-4406-a2d7-d7348d71a350\") " pod="openstack/ceilometer-0"
Dec 02 23:18:22 crc kubenswrapper[4903]: I1202 23:18:22.074731 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7dab846-dbcc-4406-a2d7-d7348d71a350-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a7dab846-dbcc-4406-a2d7-d7348d71a350\") " pod="openstack/ceilometer-0"
Dec 02 23:18:22 crc kubenswrapper[4903]: I1202 23:18:22.087095 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7dab846-dbcc-4406-a2d7-d7348d71a350-config-data\") pod \"ceilometer-0\" (UID: \"a7dab846-dbcc-4406-a2d7-d7348d71a350\") " pod="openstack/ceilometer-0"
Dec 02 23:18:22 crc kubenswrapper[4903]: I1202 23:18:22.099396 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a7dab846-dbcc-4406-a2d7-d7348d71a350-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a7dab846-dbcc-4406-a2d7-d7348d71a350\") " pod="openstack/ceilometer-0"
Dec 02 23:18:22 crc kubenswrapper[4903]: I1202 23:18:22.099992 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7dab846-dbcc-4406-a2d7-d7348d71a350-scripts\") pod \"ceilometer-0\" (UID: \"a7dab846-dbcc-4406-a2d7-d7348d71a350\") " pod="openstack/ceilometer-0"
Dec 02 23:18:22 crc kubenswrapper[4903]: I1202 23:18:22.104064 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjjd8\" (UniqueName: \"kubernetes.io/projected/a7dab846-dbcc-4406-a2d7-d7348d71a350-kube-api-access-tjjd8\") pod \"ceilometer-0\" (UID: \"a7dab846-dbcc-4406-a2d7-d7348d71a350\") " pod="openstack/ceilometer-0"
Dec 02 23:18:22 crc kubenswrapper[4903]: I1202 23:18:22.206674 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 02 23:18:22 crc kubenswrapper[4903]: I1202 23:18:22.610687 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Dec 02 23:18:22 crc kubenswrapper[4903]: I1202 23:18:22.612477 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Dec 02 23:18:23 crc kubenswrapper[4903]: I1202 23:18:23.622951 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="698e25ed-bf30-48be-9bf4-70e25028db42" path="/var/lib/kubelet/pods/698e25ed-bf30-48be-9bf4-70e25028db42/volumes"
Dec 02 23:18:24 crc kubenswrapper[4903]: I1202 23:18:24.155164 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Dec 02 23:18:24 crc kubenswrapper[4903]: I1202 23:18:24.155482 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Dec 02 23:18:24 crc kubenswrapper[4903]: I1202 23:18:24.191236 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Dec 02 23:18:24 crc kubenswrapper[4903]: I1202 23:18:24.206694 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Dec 02 23:18:24 crc kubenswrapper[4903]: I1202 23:18:24.820988 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Dec 02 23:18:24 crc kubenswrapper[4903]: I1202 23:18:24.821043 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Dec 02 23:18:26 crc kubenswrapper[4903]: I1202 23:18:26.697752 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Dec 02 23:18:26 crc kubenswrapper[4903]: I1202 23:18:26.699313 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Dec 02 23:18:30 crc kubenswrapper[4903]: I1202 23:18:30.963341 4903 scope.go:117] "RemoveContainer" containerID="27424d0f01716f3f24add106b89b6aa7a2be58db580bf1259276f9fe7137749b"
Dec 02 23:18:30 crc kubenswrapper[4903]: I1202 23:18:30.998098 4903 scope.go:117] "RemoveContainer" containerID="b34bb329ebd4049815779765fe2bf8b0738792c435d30e1ee4d2d4f62c1417ae"
Dec 02 23:18:31 crc kubenswrapper[4903]: E1202 23:18:31.010991 4903 log.go:32]
"PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.2:5001/podified-master-centos10/openstack-nova-conductor:watcher_latest" Dec 02 23:18:31 crc kubenswrapper[4903]: E1202 23:18:31.011039 4903 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.2:5001/podified-master-centos10/openstack-nova-conductor:watcher_latest" Dec 02 23:18:31 crc kubenswrapper[4903]: E1202 23:18:31.011165 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:nova-cell0-conductor-db-sync,Image:38.102.83.2:5001/podified-master-centos10/openstack-nova-conductor:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CELL_NAME,Value:cell0,ValueFrom:nil,},EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:false,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/kolla/config_files/config.json,SubPath:nova-conductor-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mg5cb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42436,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-cell0-conductor-db-sync-hjz85_openstack(6af62260-60e9-49b0-84b9-3f9cf7361c79): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 23:18:31 crc kubenswrapper[4903]: E1202 23:18:31.012628 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/nova-cell0-conductor-db-sync-hjz85" podUID="6af62260-60e9-49b0-84b9-3f9cf7361c79" Dec 02 23:18:31 crc kubenswrapper[4903]: I1202 23:18:31.174575 4903 scope.go:117] "RemoveContainer" containerID="0ee003a9c4075b41ff99c53342a77b5e00ba5430040f83809ac487befb7e4afc" Dec 02 23:18:31 crc kubenswrapper[4903]: I1202 23:18:31.499330 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 
23:18:31 crc kubenswrapper[4903]: I1202 23:18:31.509109 4903 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 23:18:31 crc kubenswrapper[4903]: I1202 23:18:31.911043 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7dab846-dbcc-4406-a2d7-d7348d71a350","Type":"ContainerStarted","Data":"e6e59b9dceef587a985872a36f96d34fd48311a61abc1ca597feab07f3aaa4c3"} Dec 02 23:18:31 crc kubenswrapper[4903]: I1202 23:18:31.911389 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7dab846-dbcc-4406-a2d7-d7348d71a350","Type":"ContainerStarted","Data":"9f4f6a927f5f327382bfb6d9393c976dc8fe724192051ea26a8612a284a011d8"} Dec 02 23:18:31 crc kubenswrapper[4903]: E1202 23:18:31.914761 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.2:5001/podified-master-centos10/openstack-nova-conductor:watcher_latest\\\"\"" pod="openstack/nova-cell0-conductor-db-sync-hjz85" podUID="6af62260-60e9-49b0-84b9-3f9cf7361c79" Dec 02 23:18:32 crc kubenswrapper[4903]: I1202 23:18:32.612913 4903 scope.go:117] "RemoveContainer" containerID="5dee57eff07796a288e3f2bb7e5636eef4388a613cdaaa20bd3ad6d3bb553f8b" Dec 02 23:18:32 crc kubenswrapper[4903]: E1202 23:18:32.613776 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(6de4a117-0c91-47f4-a80d-278debb3ea60)\"" pod="openstack/watcher-decision-engine-0" podUID="6de4a117-0c91-47f4-a80d-278debb3ea60" Dec 02 23:18:32 crc kubenswrapper[4903]: I1202 23:18:32.922530 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7dab846-dbcc-4406-a2d7-d7348d71a350","Type":"ContainerStarted","Data":"e4257bbe29b0b8d67959052b33a2744f3dd498412e5d0b0fbbb040e03b34ed41"} Dec 02 23:18:32 crc kubenswrapper[4903]: I1202 23:18:32.922855 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7dab846-dbcc-4406-a2d7-d7348d71a350","Type":"ContainerStarted","Data":"996f19ab99aad6dbea12d18789f0fda8c9d3ec1ba5469c2f8e82de22975652b2"} Dec 02 23:18:33 crc kubenswrapper[4903]: I1202 23:18:33.008444 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:18:34 crc kubenswrapper[4903]: I1202 23:18:34.958690 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7dab846-dbcc-4406-a2d7-d7348d71a350","Type":"ContainerStarted","Data":"73db5093b5294bfc2becc96ce175c39bbd71496e902a9e6d9452b38b19258c8b"} Dec 02 23:18:34 crc kubenswrapper[4903]: I1202 23:18:34.959234 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 23:18:34 crc kubenswrapper[4903]: I1202 23:18:34.958969 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a7dab846-dbcc-4406-a2d7-d7348d71a350" containerName="sg-core" containerID="cri-o://e4257bbe29b0b8d67959052b33a2744f3dd498412e5d0b0fbbb040e03b34ed41" gracePeriod=30 Dec 02 23:18:34 crc kubenswrapper[4903]: I1202 23:18:34.958964 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="a7dab846-dbcc-4406-a2d7-d7348d71a350" containerName="ceilometer-central-agent" containerID="cri-o://e6e59b9dceef587a985872a36f96d34fd48311a61abc1ca597feab07f3aaa4c3" gracePeriod=30 Dec 02 23:18:34 crc kubenswrapper[4903]: I1202 23:18:34.959049 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a7dab846-dbcc-4406-a2d7-d7348d71a350" containerName="proxy-httpd" containerID="cri-o://73db5093b5294bfc2becc96ce175c39bbd71496e902a9e6d9452b38b19258c8b" gracePeriod=30 Dec 02 23:18:34 crc kubenswrapper[4903]: I1202 23:18:34.959024 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a7dab846-dbcc-4406-a2d7-d7348d71a350" containerName="ceilometer-notification-agent" containerID="cri-o://996f19ab99aad6dbea12d18789f0fda8c9d3ec1ba5469c2f8e82de22975652b2" gracePeriod=30 Dec 02 23:18:34 crc kubenswrapper[4903]: I1202 23:18:34.999478 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=11.752436792 podStartE2EDuration="13.99944344s" podCreationTimestamp="2025-12-02 23:18:21 +0000 UTC" firstStartedPulling="2025-12-02 23:18:31.5088275 +0000 UTC m=+1250.217381783" lastFinishedPulling="2025-12-02 23:18:33.755834138 +0000 UTC m=+1252.464388431" observedRunningTime="2025-12-02 23:18:34.986546356 +0000 UTC m=+1253.695100679" watchObservedRunningTime="2025-12-02 23:18:34.99944344 +0000 UTC m=+1253.707997763" Dec 02 23:18:35 crc kubenswrapper[4903]: I1202 23:18:35.972892 4903 generic.go:334] "Generic (PLEG): container finished" podID="a7dab846-dbcc-4406-a2d7-d7348d71a350" containerID="73db5093b5294bfc2becc96ce175c39bbd71496e902a9e6d9452b38b19258c8b" exitCode=0 Dec 02 23:18:35 crc kubenswrapper[4903]: I1202 23:18:35.973187 4903 generic.go:334] "Generic (PLEG): container finished" podID="a7dab846-dbcc-4406-a2d7-d7348d71a350" containerID="e4257bbe29b0b8d67959052b33a2744f3dd498412e5d0b0fbbb040e03b34ed41" exitCode=2 Dec 02 23:18:35 crc kubenswrapper[4903]: I1202 23:18:35.973196 4903 generic.go:334] "Generic (PLEG): container finished" podID="a7dab846-dbcc-4406-a2d7-d7348d71a350" containerID="996f19ab99aad6dbea12d18789f0fda8c9d3ec1ba5469c2f8e82de22975652b2" exitCode=0 Dec 02 23:18:35 crc kubenswrapper[4903]: I1202 23:18:35.972989 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7dab846-dbcc-4406-a2d7-d7348d71a350","Type":"ContainerDied","Data":"73db5093b5294bfc2becc96ce175c39bbd71496e902a9e6d9452b38b19258c8b"} Dec 02 23:18:35 crc kubenswrapper[4903]: I1202 23:18:35.973227 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7dab846-dbcc-4406-a2d7-d7348d71a350","Type":"ContainerDied","Data":"e4257bbe29b0b8d67959052b33a2744f3dd498412e5d0b0fbbb040e03b34ed41"} Dec 02 23:18:35 crc kubenswrapper[4903]: I1202 23:18:35.973242 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7dab846-dbcc-4406-a2d7-d7348d71a350","Type":"ContainerDied","Data":"996f19ab99aad6dbea12d18789f0fda8c9d3ec1ba5469c2f8e82de22975652b2"} Dec 02 23:18:41 crc kubenswrapper[4903]: I1202 23:18:41.985843 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.011778 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7dab846-dbcc-4406-a2d7-d7348d71a350-log-httpd\") pod \"a7dab846-dbcc-4406-a2d7-d7348d71a350\" (UID: \"a7dab846-dbcc-4406-a2d7-d7348d71a350\") " Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.011932 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7dab846-dbcc-4406-a2d7-d7348d71a350-run-httpd\") pod \"a7dab846-dbcc-4406-a2d7-d7348d71a350\" (UID: \"a7dab846-dbcc-4406-a2d7-d7348d71a350\") " Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.012015 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjjd8\" (UniqueName: \"kubernetes.io/projected/a7dab846-dbcc-4406-a2d7-d7348d71a350-kube-api-access-tjjd8\") pod \"a7dab846-dbcc-4406-a2d7-d7348d71a350\" (UID: \"a7dab846-dbcc-4406-a2d7-d7348d71a350\") " Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.012086 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7dab846-dbcc-4406-a2d7-d7348d71a350-config-data\") pod \"a7dab846-dbcc-4406-a2d7-d7348d71a350\" (UID: \"a7dab846-dbcc-4406-a2d7-d7348d71a350\") " Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.012163 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7dab846-dbcc-4406-a2d7-d7348d71a350-scripts\") pod \"a7dab846-dbcc-4406-a2d7-d7348d71a350\" (UID: \"a7dab846-dbcc-4406-a2d7-d7348d71a350\") " Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.012237 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7dab846-dbcc-4406-a2d7-d7348d71a350-combined-ca-bundle\") pod \"a7dab846-dbcc-4406-a2d7-d7348d71a350\" (UID: \"a7dab846-dbcc-4406-a2d7-d7348d71a350\") " Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.012294 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a7dab846-dbcc-4406-a2d7-d7348d71a350-sg-core-conf-yaml\") pod \"a7dab846-dbcc-4406-a2d7-d7348d71a350\" (UID: \"a7dab846-dbcc-4406-a2d7-d7348d71a350\") " Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.012239 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7dab846-dbcc-4406-a2d7-d7348d71a350-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a7dab846-dbcc-4406-a2d7-d7348d71a350" (UID: "a7dab846-dbcc-4406-a2d7-d7348d71a350"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.012274 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7dab846-dbcc-4406-a2d7-d7348d71a350-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a7dab846-dbcc-4406-a2d7-d7348d71a350" (UID: "a7dab846-dbcc-4406-a2d7-d7348d71a350"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.013692 4903 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7dab846-dbcc-4406-a2d7-d7348d71a350-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.013786 4903 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7dab846-dbcc-4406-a2d7-d7348d71a350-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.021454 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7dab846-dbcc-4406-a2d7-d7348d71a350-kube-api-access-tjjd8" (OuterVolumeSpecName: "kube-api-access-tjjd8") pod "a7dab846-dbcc-4406-a2d7-d7348d71a350" (UID: "a7dab846-dbcc-4406-a2d7-d7348d71a350"). InnerVolumeSpecName "kube-api-access-tjjd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.023925 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7dab846-dbcc-4406-a2d7-d7348d71a350-scripts" (OuterVolumeSpecName: "scripts") pod "a7dab846-dbcc-4406-a2d7-d7348d71a350" (UID: "a7dab846-dbcc-4406-a2d7-d7348d71a350"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.052284 4903 generic.go:334] "Generic (PLEG): container finished" podID="a7dab846-dbcc-4406-a2d7-d7348d71a350" containerID="e6e59b9dceef587a985872a36f96d34fd48311a61abc1ca597feab07f3aaa4c3" exitCode=0 Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.052360 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7dab846-dbcc-4406-a2d7-d7348d71a350","Type":"ContainerDied","Data":"e6e59b9dceef587a985872a36f96d34fd48311a61abc1ca597feab07f3aaa4c3"} Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.052391 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7dab846-dbcc-4406-a2d7-d7348d71a350","Type":"ContainerDied","Data":"9f4f6a927f5f327382bfb6d9393c976dc8fe724192051ea26a8612a284a011d8"} Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.052468 4903 scope.go:117] "RemoveContainer" containerID="73db5093b5294bfc2becc96ce175c39bbd71496e902a9e6d9452b38b19258c8b" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.052761 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.056612 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7dab846-dbcc-4406-a2d7-d7348d71a350-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a7dab846-dbcc-4406-a2d7-d7348d71a350" (UID: "a7dab846-dbcc-4406-a2d7-d7348d71a350"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.105808 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7dab846-dbcc-4406-a2d7-d7348d71a350-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a7dab846-dbcc-4406-a2d7-d7348d71a350" (UID: "a7dab846-dbcc-4406-a2d7-d7348d71a350"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.115236 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjjd8\" (UniqueName: \"kubernetes.io/projected/a7dab846-dbcc-4406-a2d7-d7348d71a350-kube-api-access-tjjd8\") on node \"crc\" DevicePath \"\"" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.115276 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7dab846-dbcc-4406-a2d7-d7348d71a350-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.115290 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7dab846-dbcc-4406-a2d7-d7348d71a350-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.115302 4903 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a7dab846-dbcc-4406-a2d7-d7348d71a350-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.132995 4903 scope.go:117] "RemoveContainer" containerID="e4257bbe29b0b8d67959052b33a2744f3dd498412e5d0b0fbbb040e03b34ed41" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.138252 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7dab846-dbcc-4406-a2d7-d7348d71a350-config-data" (OuterVolumeSpecName: "config-data") pod "a7dab846-dbcc-4406-a2d7-d7348d71a350" (UID: "a7dab846-dbcc-4406-a2d7-d7348d71a350"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.150608 4903 scope.go:117] "RemoveContainer" containerID="996f19ab99aad6dbea12d18789f0fda8c9d3ec1ba5469c2f8e82de22975652b2" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.172494 4903 scope.go:117] "RemoveContainer" containerID="e6e59b9dceef587a985872a36f96d34fd48311a61abc1ca597feab07f3aaa4c3" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.198184 4903 scope.go:117] "RemoveContainer" containerID="73db5093b5294bfc2becc96ce175c39bbd71496e902a9e6d9452b38b19258c8b" Dec 02 23:18:42 crc kubenswrapper[4903]: E1202 23:18:42.198863 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73db5093b5294bfc2becc96ce175c39bbd71496e902a9e6d9452b38b19258c8b\": container with ID starting with 73db5093b5294bfc2becc96ce175c39bbd71496e902a9e6d9452b38b19258c8b not found: ID does not exist" containerID="73db5093b5294bfc2becc96ce175c39bbd71496e902a9e6d9452b38b19258c8b" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.198998 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73db5093b5294bfc2becc96ce175c39bbd71496e902a9e6d9452b38b19258c8b"} err="failed to get container status \"73db5093b5294bfc2becc96ce175c39bbd71496e902a9e6d9452b38b19258c8b\": rpc error: code = NotFound desc = could not find container \"73db5093b5294bfc2becc96ce175c39bbd71496e902a9e6d9452b38b19258c8b\": container with ID starting with 73db5093b5294bfc2becc96ce175c39bbd71496e902a9e6d9452b38b19258c8b not found: ID does not exist" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.199104 4903 scope.go:117] "RemoveContainer" containerID="e4257bbe29b0b8d67959052b33a2744f3dd498412e5d0b0fbbb040e03b34ed41" Dec 02 23:18:42 crc kubenswrapper[4903]: 
E1202 23:18:42.199638 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4257bbe29b0b8d67959052b33a2744f3dd498412e5d0b0fbbb040e03b34ed41\": container with ID starting with e4257bbe29b0b8d67959052b33a2744f3dd498412e5d0b0fbbb040e03b34ed41 not found: ID does not exist" containerID="e4257bbe29b0b8d67959052b33a2744f3dd498412e5d0b0fbbb040e03b34ed41" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.199690 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4257bbe29b0b8d67959052b33a2744f3dd498412e5d0b0fbbb040e03b34ed41"} err="failed to get container status \"e4257bbe29b0b8d67959052b33a2744f3dd498412e5d0b0fbbb040e03b34ed41\": rpc error: code = NotFound desc = could not find container \"e4257bbe29b0b8d67959052b33a2744f3dd498412e5d0b0fbbb040e03b34ed41\": container with ID starting with e4257bbe29b0b8d67959052b33a2744f3dd498412e5d0b0fbbb040e03b34ed41 not found: ID does not exist" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.199717 4903 scope.go:117] "RemoveContainer" containerID="996f19ab99aad6dbea12d18789f0fda8c9d3ec1ba5469c2f8e82de22975652b2" Dec 02 23:18:42 crc kubenswrapper[4903]: E1202 23:18:42.200087 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"996f19ab99aad6dbea12d18789f0fda8c9d3ec1ba5469c2f8e82de22975652b2\": container with ID starting with 996f19ab99aad6dbea12d18789f0fda8c9d3ec1ba5469c2f8e82de22975652b2 not found: ID does not exist" containerID="996f19ab99aad6dbea12d18789f0fda8c9d3ec1ba5469c2f8e82de22975652b2" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.200147 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"996f19ab99aad6dbea12d18789f0fda8c9d3ec1ba5469c2f8e82de22975652b2"} err="failed to get container status \"996f19ab99aad6dbea12d18789f0fda8c9d3ec1ba5469c2f8e82de22975652b2\": rpc error: code = NotFound desc = could not find container \"996f19ab99aad6dbea12d18789f0fda8c9d3ec1ba5469c2f8e82de22975652b2\": container with ID starting with 996f19ab99aad6dbea12d18789f0fda8c9d3ec1ba5469c2f8e82de22975652b2 not found: ID does not exist" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.200187 4903 scope.go:117] "RemoveContainer" containerID="e6e59b9dceef587a985872a36f96d34fd48311a61abc1ca597feab07f3aaa4c3" Dec 02 23:18:42 crc kubenswrapper[4903]: E1202 23:18:42.200575 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6e59b9dceef587a985872a36f96d34fd48311a61abc1ca597feab07f3aaa4c3\": container with ID starting with e6e59b9dceef587a985872a36f96d34fd48311a61abc1ca597feab07f3aaa4c3 not found: ID does not exist" containerID="e6e59b9dceef587a985872a36f96d34fd48311a61abc1ca597feab07f3aaa4c3" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.200604 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6e59b9dceef587a985872a36f96d34fd48311a61abc1ca597feab07f3aaa4c3"} err="failed to get container status \"e6e59b9dceef587a985872a36f96d34fd48311a61abc1ca597feab07f3aaa4c3\": rpc error: code = NotFound desc = could not find container \"e6e59b9dceef587a985872a36f96d34fd48311a61abc1ca597feab07f3aaa4c3\": container with ID starting with e6e59b9dceef587a985872a36f96d34fd48311a61abc1ca597feab07f3aaa4c3 not found: ID does not exist" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.217155 
4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7dab846-dbcc-4406-a2d7-d7348d71a350-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.401709 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.414876 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.446425 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:18:42 crc kubenswrapper[4903]: E1202 23:18:42.446904 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7dab846-dbcc-4406-a2d7-d7348d71a350" containerName="proxy-httpd" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.446929 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7dab846-dbcc-4406-a2d7-d7348d71a350" containerName="proxy-httpd" Dec 02 23:18:42 crc kubenswrapper[4903]: E1202 23:18:42.446952 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7dab846-dbcc-4406-a2d7-d7348d71a350" containerName="ceilometer-central-agent" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.446961 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7dab846-dbcc-4406-a2d7-d7348d71a350" containerName="ceilometer-central-agent" Dec 02 23:18:42 crc kubenswrapper[4903]: E1202 23:18:42.446975 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7dab846-dbcc-4406-a2d7-d7348d71a350" containerName="ceilometer-notification-agent" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.446984 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7dab846-dbcc-4406-a2d7-d7348d71a350" containerName="ceilometer-notification-agent" Dec 02 23:18:42 crc kubenswrapper[4903]: E1202 23:18:42.447022 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7dab846-dbcc-4406-a2d7-d7348d71a350" containerName="sg-core" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.447030 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7dab846-dbcc-4406-a2d7-d7348d71a350" containerName="sg-core" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.447253 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7dab846-dbcc-4406-a2d7-d7348d71a350" containerName="sg-core" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.447284 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7dab846-dbcc-4406-a2d7-d7348d71a350" containerName="ceilometer-notification-agent" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.447301 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7dab846-dbcc-4406-a2d7-d7348d71a350" containerName="ceilometer-central-agent" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.447318 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7dab846-dbcc-4406-a2d7-d7348d71a350" containerName="proxy-httpd" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.449941 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.454115 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.454134 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.498485 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.522232 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-run-httpd\") pod \"ceilometer-0\" (UID: \"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8\") " pod="openstack/ceilometer-0" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.522295 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-config-data\") pod \"ceilometer-0\" (UID: \"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8\") " pod="openstack/ceilometer-0" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.522386 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8\") " pod="openstack/ceilometer-0" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.522440 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-log-httpd\") pod \"ceilometer-0\" (UID: \"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8\") " pod="openstack/ceilometer-0" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.522569 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-scripts\") pod \"ceilometer-0\" (UID: \"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8\") " pod="openstack/ceilometer-0" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.522745 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8\") " pod="openstack/ceilometer-0" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.522810 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdhsn\" (UniqueName: \"kubernetes.io/projected/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-kube-api-access-kdhsn\") pod \"ceilometer-0\" (UID: \"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8\") " pod="openstack/ceilometer-0" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.624590 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdhsn\" (UniqueName: \"kubernetes.io/projected/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-kube-api-access-kdhsn\") pod \"ceilometer-0\" (UID: \"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8\") " pod="openstack/ceilometer-0" Dec 02 23:18:42 crc kubenswrapper[4903]: 
I1202 23:18:42.628753 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-run-httpd\") pod \"ceilometer-0\" (UID: \"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8\") " pod="openstack/ceilometer-0" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.628788 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-run-httpd\") pod \"ceilometer-0\" (UID: \"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8\") " pod="openstack/ceilometer-0" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.628863 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-config-data\") pod \"ceilometer-0\" (UID: \"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8\") " pod="openstack/ceilometer-0" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.628910 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8\") " pod="openstack/ceilometer-0" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.628981 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-log-httpd\") pod \"ceilometer-0\" (UID: \"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8\") " pod="openstack/ceilometer-0" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.629598 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-scripts\") pod \"ceilometer-0\" (UID: \"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8\") " pod="openstack/ceilometer-0" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.629778 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8\") " pod="openstack/ceilometer-0" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.633755 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-scripts\") pod \"ceilometer-0\" (UID: \"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8\") " pod="openstack/ceilometer-0" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.634434 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8\") " pod="openstack/ceilometer-0" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.634795 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-config-data\") pod \"ceilometer-0\" (UID: \"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8\") " pod="openstack/ceilometer-0" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.637752 4903 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8\") " pod="openstack/ceilometer-0" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.642325 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-log-httpd\") pod \"ceilometer-0\" (UID: \"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8\") " pod="openstack/ceilometer-0" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.646538 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdhsn\" (UniqueName: \"kubernetes.io/projected/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-kube-api-access-kdhsn\") pod \"ceilometer-0\" (UID: \"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8\") " pod="openstack/ceilometer-0" Dec 02 23:18:42 crc kubenswrapper[4903]: I1202 23:18:42.775421 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:18:43 crc kubenswrapper[4903]: I1202 23:18:43.330902 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:18:43 crc kubenswrapper[4903]: I1202 23:18:43.632429 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7dab846-dbcc-4406-a2d7-d7348d71a350" path="/var/lib/kubelet/pods/a7dab846-dbcc-4406-a2d7-d7348d71a350/volumes" Dec 02 23:18:44 crc kubenswrapper[4903]: I1202 23:18:44.111536 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8","Type":"ContainerStarted","Data":"56181b13e1274405076b767f3a034e823f204489491952ee0e1036724e22993a"} Dec 02 23:18:44 crc kubenswrapper[4903]: I1202 23:18:44.111608 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8","Type":"ContainerStarted","Data":"86f46a583e97f2c638b5770d8ecee1b15f25956b4b4d4f45a9cc3b4494f1f0fb"} Dec 02 23:18:44 crc kubenswrapper[4903]: I1202 23:18:44.111628 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8","Type":"ContainerStarted","Data":"62f5f02487b323e2ca66853dc439d1a3e7d2982857431cb0157a1bacd8f25a93"} Dec 02 23:18:44 crc kubenswrapper[4903]: I1202 23:18:44.613227 4903 scope.go:117] "RemoveContainer" containerID="5dee57eff07796a288e3f2bb7e5636eef4388a613cdaaa20bd3ad6d3bb553f8b" Dec 02 23:18:44 crc kubenswrapper[4903]: E1202 23:18:44.613887 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(6de4a117-0c91-47f4-a80d-278debb3ea60)\"" pod="openstack/watcher-decision-engine-0" podUID="6de4a117-0c91-47f4-a80d-278debb3ea60" Dec 02 23:18:45 crc kubenswrapper[4903]: I1202 23:18:45.129159 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8","Type":"ContainerStarted","Data":"120a65b9e41382fbcc4db8172753c708144ddc21d430539683ee2037efc509f0"} Dec 02 23:18:45 crc kubenswrapper[4903]: I1202 23:18:45.132709 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-hjz85" 
event={"ID":"6af62260-60e9-49b0-84b9-3f9cf7361c79","Type":"ContainerStarted","Data":"cc0155c9d4cbbc86352c6be117bc554a3e26d566a6928fa8e85b3dad046a270c"} Dec 02 23:18:45 crc kubenswrapper[4903]: I1202 23:18:45.160358 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-hjz85" podStartSLOduration=2.168009342 podStartE2EDuration="28.160340397s" podCreationTimestamp="2025-12-02 23:18:17 +0000 UTC" firstStartedPulling="2025-12-02 23:18:18.697165211 +0000 UTC m=+1237.405719504" lastFinishedPulling="2025-12-02 23:18:44.689496256 +0000 UTC m=+1263.398050559" observedRunningTime="2025-12-02 23:18:45.155819551 +0000 UTC m=+1263.864373924" watchObservedRunningTime="2025-12-02 23:18:45.160340397 +0000 UTC m=+1263.868894680" Dec 02 23:18:47 crc kubenswrapper[4903]: I1202 23:18:47.162819 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8","Type":"ContainerStarted","Data":"590d8782376ce97e339d90520cb424377ea5bd57dc7dfc1cd8f7735a3223e6e2"} Dec 02 23:18:47 crc kubenswrapper[4903]: I1202 23:18:47.165068 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 23:18:47 crc kubenswrapper[4903]: I1202 23:18:47.213336 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.383065681 podStartE2EDuration="5.213292439s" podCreationTimestamp="2025-12-02 23:18:42 +0000 UTC" firstStartedPulling="2025-12-02 23:18:43.326453504 +0000 UTC m=+1262.035007787" lastFinishedPulling="2025-12-02 23:18:46.156680232 +0000 UTC m=+1264.865234545" observedRunningTime="2025-12-02 23:18:47.19595607 +0000 UTC m=+1265.904510353" watchObservedRunningTime="2025-12-02 23:18:47.213292439 +0000 UTC m=+1265.921846722" Dec 02 23:18:57 crc kubenswrapper[4903]: I1202 23:18:57.281383 4903 generic.go:334] "Generic (PLEG): container finished" podID="6af62260-60e9-49b0-84b9-3f9cf7361c79" containerID="cc0155c9d4cbbc86352c6be117bc554a3e26d566a6928fa8e85b3dad046a270c" exitCode=0 Dec 02 23:18:57 crc kubenswrapper[4903]: I1202 23:18:57.282018 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-hjz85" event={"ID":"6af62260-60e9-49b0-84b9-3f9cf7361c79","Type":"ContainerDied","Data":"cc0155c9d4cbbc86352c6be117bc554a3e26d566a6928fa8e85b3dad046a270c"} Dec 02 23:18:57 crc kubenswrapper[4903]: I1202 23:18:57.612448 4903 scope.go:117] "RemoveContainer" containerID="5dee57eff07796a288e3f2bb7e5636eef4388a613cdaaa20bd3ad6d3bb553f8b" Dec 02 23:18:58 crc kubenswrapper[4903]: I1202 23:18:58.292929 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"6de4a117-0c91-47f4-a80d-278debb3ea60","Type":"ContainerStarted","Data":"869337a1d3594ce55a381d0bedbd91c0bd6e1cde2a42f0d7048097a4d90eb4d5"} Dec 02 23:18:58 crc kubenswrapper[4903]: I1202 23:18:58.643865 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-hjz85" Dec 02 23:18:58 crc kubenswrapper[4903]: I1202 23:18:58.823936 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6af62260-60e9-49b0-84b9-3f9cf7361c79-config-data\") pod \"6af62260-60e9-49b0-84b9-3f9cf7361c79\" (UID: \"6af62260-60e9-49b0-84b9-3f9cf7361c79\") " Dec 02 23:18:58 crc kubenswrapper[4903]: I1202 23:18:58.824276 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5cb\" (UniqueName: \"kubernetes.io/projected/6af62260-60e9-49b0-84b9-3f9cf7361c79-kube-api-access-mg5cb\") pod \"6af62260-60e9-49b0-84b9-3f9cf7361c79\" (UID: \"6af62260-60e9-49b0-84b9-3f9cf7361c79\") " Dec 02 23:18:58 crc kubenswrapper[4903]: I1202 23:18:58.824432 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6af62260-60e9-49b0-84b9-3f9cf7361c79-combined-ca-bundle\") pod \"6af62260-60e9-49b0-84b9-3f9cf7361c79\" (UID: \"6af62260-60e9-49b0-84b9-3f9cf7361c79\") " Dec 02 23:18:58 crc kubenswrapper[4903]: I1202 23:18:58.824536 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6af62260-60e9-49b0-84b9-3f9cf7361c79-scripts\") pod \"6af62260-60e9-49b0-84b9-3f9cf7361c79\" (UID: \"6af62260-60e9-49b0-84b9-3f9cf7361c79\") " Dec 02 23:18:58 crc kubenswrapper[4903]: I1202 23:18:58.834969 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6af62260-60e9-49b0-84b9-3f9cf7361c79-kube-api-access-mg5cb" (OuterVolumeSpecName: "kube-api-access-mg5cb") pod "6af62260-60e9-49b0-84b9-3f9cf7361c79" (UID: "6af62260-60e9-49b0-84b9-3f9cf7361c79"). InnerVolumeSpecName "kube-api-access-mg5cb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:18:58 crc kubenswrapper[4903]: I1202 23:18:58.839919 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6af62260-60e9-49b0-84b9-3f9cf7361c79-scripts" (OuterVolumeSpecName: "scripts") pod "6af62260-60e9-49b0-84b9-3f9cf7361c79" (UID: "6af62260-60e9-49b0-84b9-3f9cf7361c79"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:18:58 crc kubenswrapper[4903]: I1202 23:18:58.863598 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6af62260-60e9-49b0-84b9-3f9cf7361c79-config-data" (OuterVolumeSpecName: "config-data") pod "6af62260-60e9-49b0-84b9-3f9cf7361c79" (UID: "6af62260-60e9-49b0-84b9-3f9cf7361c79"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:18:58 crc kubenswrapper[4903]: I1202 23:18:58.886226 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6af62260-60e9-49b0-84b9-3f9cf7361c79-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6af62260-60e9-49b0-84b9-3f9cf7361c79" (UID: "6af62260-60e9-49b0-84b9-3f9cf7361c79"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:18:58 crc kubenswrapper[4903]: I1202 23:18:58.927012 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6af62260-60e9-49b0-84b9-3f9cf7361c79-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:18:58 crc kubenswrapper[4903]: I1202 23:18:58.927062 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5cb\" (UniqueName: \"kubernetes.io/projected/6af62260-60e9-49b0-84b9-3f9cf7361c79-kube-api-access-mg5cb\") on node \"crc\" DevicePath \"\"" Dec 02 23:18:58 crc kubenswrapper[4903]: I1202 23:18:58.927083 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6af62260-60e9-49b0-84b9-3f9cf7361c79-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:18:58 crc kubenswrapper[4903]: I1202 23:18:58.927099 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6af62260-60e9-49b0-84b9-3f9cf7361c79-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:18:58 crc kubenswrapper[4903]: I1202 23:18:58.966621 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 02 23:18:58 crc kubenswrapper[4903]: I1202 23:18:58.998628 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Dec 02 23:18:59 crc kubenswrapper[4903]: I1202 23:18:59.309445 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-hjz85" Dec 02 23:18:59 crc kubenswrapper[4903]: I1202 23:18:59.310314 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-hjz85" event={"ID":"6af62260-60e9-49b0-84b9-3f9cf7361c79","Type":"ContainerDied","Data":"bb8173a4adb38ec1bee0952cb3ef98c5e02c42c4d37b2cdd8edd3b205ad4c189"} Dec 02 23:18:59 crc kubenswrapper[4903]: I1202 23:18:59.310383 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb8173a4adb38ec1bee0952cb3ef98c5e02c42c4d37b2cdd8edd3b205ad4c189" Dec 02 23:18:59 crc kubenswrapper[4903]: I1202 23:18:59.311480 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Dec 02 23:18:59 crc kubenswrapper[4903]: I1202 23:18:59.356963 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Dec 02 23:18:59 crc kubenswrapper[4903]: I1202 23:18:59.400310 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 02 23:18:59 crc kubenswrapper[4903]: I1202 23:18:59.434243 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 23:18:59 crc kubenswrapper[4903]: E1202 23:18:59.434926 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6af62260-60e9-49b0-84b9-3f9cf7361c79" containerName="nova-cell0-conductor-db-sync" Dec 02 23:18:59 crc kubenswrapper[4903]: I1202 23:18:59.435015 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="6af62260-60e9-49b0-84b9-3f9cf7361c79" containerName="nova-cell0-conductor-db-sync" Dec 02 23:18:59 crc kubenswrapper[4903]: I1202 23:18:59.435408 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="6af62260-60e9-49b0-84b9-3f9cf7361c79" containerName="nova-cell0-conductor-db-sync" Dec 02 23:18:59 crc 
kubenswrapper[4903]: I1202 23:18:59.436305 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 02 23:18:59 crc kubenswrapper[4903]: I1202 23:18:59.438956 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-zl2fh" Dec 02 23:18:59 crc kubenswrapper[4903]: I1202 23:18:59.440936 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 02 23:18:59 crc kubenswrapper[4903]: I1202 23:18:59.455272 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 23:18:59 crc kubenswrapper[4903]: I1202 23:18:59.539852 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de5b4dd8-9abd-423d-af40-fed7d5fc1de0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"de5b4dd8-9abd-423d-af40-fed7d5fc1de0\") " pod="openstack/nova-cell0-conductor-0" Dec 02 23:18:59 crc kubenswrapper[4903]: I1202 23:18:59.539994 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sszzw\" (UniqueName: \"kubernetes.io/projected/de5b4dd8-9abd-423d-af40-fed7d5fc1de0-kube-api-access-sszzw\") pod \"nova-cell0-conductor-0\" (UID: \"de5b4dd8-9abd-423d-af40-fed7d5fc1de0\") " pod="openstack/nova-cell0-conductor-0" Dec 02 23:18:59 crc kubenswrapper[4903]: I1202 23:18:59.540023 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de5b4dd8-9abd-423d-af40-fed7d5fc1de0-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"de5b4dd8-9abd-423d-af40-fed7d5fc1de0\") " pod="openstack/nova-cell0-conductor-0" Dec 02 23:18:59 crc kubenswrapper[4903]: I1202 23:18:59.642204 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sszzw\" (UniqueName: \"kubernetes.io/projected/de5b4dd8-9abd-423d-af40-fed7d5fc1de0-kube-api-access-sszzw\") pod \"nova-cell0-conductor-0\" (UID: \"de5b4dd8-9abd-423d-af40-fed7d5fc1de0\") " pod="openstack/nova-cell0-conductor-0" Dec 02 23:18:59 crc kubenswrapper[4903]: I1202 23:18:59.642301 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de5b4dd8-9abd-423d-af40-fed7d5fc1de0-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"de5b4dd8-9abd-423d-af40-fed7d5fc1de0\") " pod="openstack/nova-cell0-conductor-0" Dec 02 23:18:59 crc kubenswrapper[4903]: I1202 23:18:59.642421 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de5b4dd8-9abd-423d-af40-fed7d5fc1de0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"de5b4dd8-9abd-423d-af40-fed7d5fc1de0\") " pod="openstack/nova-cell0-conductor-0" Dec 02 23:18:59 crc kubenswrapper[4903]: I1202 23:18:59.647203 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de5b4dd8-9abd-423d-af40-fed7d5fc1de0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"de5b4dd8-9abd-423d-af40-fed7d5fc1de0\") " pod="openstack/nova-cell0-conductor-0" Dec 02 23:18:59 crc kubenswrapper[4903]: I1202 23:18:59.647694 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/de5b4dd8-9abd-423d-af40-fed7d5fc1de0-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"de5b4dd8-9abd-423d-af40-fed7d5fc1de0\") " pod="openstack/nova-cell0-conductor-0" Dec 02 23:18:59 crc kubenswrapper[4903]: I1202 23:18:59.668828 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sszzw\" (UniqueName: \"kubernetes.io/projected/de5b4dd8-9abd-423d-af40-fed7d5fc1de0-kube-api-access-sszzw\") pod \"nova-cell0-conductor-0\" (UID: \"de5b4dd8-9abd-423d-af40-fed7d5fc1de0\") " pod="openstack/nova-cell0-conductor-0" Dec 02 23:18:59 crc kubenswrapper[4903]: I1202 23:18:59.765182 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 02 23:19:00 crc kubenswrapper[4903]: W1202 23:19:00.308664 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde5b4dd8_9abd_423d_af40_fed7d5fc1de0.slice/crio-c29dccae4c65cb2869f4e374cfb31e5edb844a93fccd4faae6c13352ce7bde03 WatchSource:0}: Error finding container c29dccae4c65cb2869f4e374cfb31e5edb844a93fccd4faae6c13352ce7bde03: Status 404 returned error can't find the container with id c29dccae4c65cb2869f4e374cfb31e5edb844a93fccd4faae6c13352ce7bde03 Dec 02 23:19:00 crc kubenswrapper[4903]: I1202 23:19:00.310436 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 23:19:00 crc kubenswrapper[4903]: I1202 23:19:00.338622 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"de5b4dd8-9abd-423d-af40-fed7d5fc1de0","Type":"ContainerStarted","Data":"c29dccae4c65cb2869f4e374cfb31e5edb844a93fccd4faae6c13352ce7bde03"} Dec 02 23:19:00 crc kubenswrapper[4903]: I1202 23:19:00.549627 4903 scope.go:117] "RemoveContainer" containerID="7bd93339abc34a32b35e45a92a0051e22d7197e73917df8a8ab7f2ff5f0a0581" Dec 02 23:19:00 crc kubenswrapper[4903]: I1202 23:19:00.582408 4903 scope.go:117] "RemoveContainer" containerID="46f04c12f3f536cca3af18dd2a21972311556465ff202bd1f8d295d073b6aa8f" Dec 02 23:19:00 crc kubenswrapper[4903]: I1202 23:19:00.621284 4903 scope.go:117] "RemoveContainer" containerID="8c5cfb83a5272ae46da9b4411f0d001ac159465e94983681887e5ed948143225" Dec 02 23:19:00 crc kubenswrapper[4903]: I1202 23:19:00.649732 4903 scope.go:117] "RemoveContainer" containerID="bc0a845a0b1a0efddd676e89403deb3ed49f74a5ec764b902525e7b0f427149f" Dec 02 23:19:00 crc kubenswrapper[4903]: I1202 23:19:00.674105 4903 scope.go:117] "RemoveContainer" containerID="14766c5cdfa950876a4f9c4c65f33f602772484518b0498b89dc86f38a53d718" Dec 02 23:19:00 crc kubenswrapper[4903]: I1202 23:19:00.701442 4903 scope.go:117] "RemoveContainer" containerID="993d44269a3a21582fe408533087b5f9c2a35bd1f82d2ef37b1a771bf1890cf4" Dec 02 23:19:01 crc kubenswrapper[4903]: I1202 23:19:01.353832 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"de5b4dd8-9abd-423d-af40-fed7d5fc1de0","Type":"ContainerStarted","Data":"c9ea0c837c64264e17c83ea2a1d68a8b700a51a7f3928057f9be5f442bc0cebb"} Dec 02 23:19:01 crc kubenswrapper[4903]: I1202 23:19:01.353888 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="6de4a117-0c91-47f4-a80d-278debb3ea60" containerName="watcher-decision-engine" containerID="cri-o://869337a1d3594ce55a381d0bedbd91c0bd6e1cde2a42f0d7048097a4d90eb4d5" gracePeriod=30 Dec 02 23:19:01 crc 
Dec 02 23:19:01 crc kubenswrapper[4903]: I1202 23:19:01.354441 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Dec 02 23:19:01 crc kubenswrapper[4903]: I1202 23:19:01.391158 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.391144547 podStartE2EDuration="2.391144547s" podCreationTimestamp="2025-12-02 23:18:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:19:01.380850258 +0000 UTC m=+1280.089404541" watchObservedRunningTime="2025-12-02 23:19:01.391144547 +0000 UTC m=+1280.099698820"
Dec 02 23:19:08 crc kubenswrapper[4903]: E1202 23:19:08.969049 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="869337a1d3594ce55a381d0bedbd91c0bd6e1cde2a42f0d7048097a4d90eb4d5" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"]
Dec 02 23:19:08 crc kubenswrapper[4903]: E1202 23:19:08.971423 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="869337a1d3594ce55a381d0bedbd91c0bd6e1cde2a42f0d7048097a4d90eb4d5" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"]
Dec 02 23:19:08 crc kubenswrapper[4903]: E1202 23:19:08.972976 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="869337a1d3594ce55a381d0bedbd91c0bd6e1cde2a42f0d7048097a4d90eb4d5" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"]
Dec 02 23:19:08 crc kubenswrapper[4903]: E1202 23:19:08.973023 4903 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-decision-engine-0" podUID="6de4a117-0c91-47f4-a80d-278debb3ea60" containerName="watcher-decision-engine"
Dec 02 23:19:09 crc kubenswrapper[4903]: I1202 23:19:09.862358 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.309613 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-lltzt"]
Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.311187 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lltzt"
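
The failing readiness probe above is an exec probe; the log records its exact command: /usr/bin/pgrep -f -r DRST watcher-decision-engine (-f matches against the full command line, -r DRST restricts matches to processes in run states D, R, S, or T). The "cannot register an exec PID: container is stopping" errors are the expected failure mode once the grace-period kill above has begun: the runtime refuses to start new exec sessions in a stopping container, so the probe errors out rather than reporting unready. A standalone Go sketch of the same check, run directly against the host rather than via the CRI, so purely illustrative:

    package main

    import (
        "fmt"
        "os/exec"
    )

    func main() {
        // The invocation recorded in the log's cmd=[...] field.
        cmd := exec.Command("/usr/bin/pgrep", "-f", "-r", "DRST", "watcher-decision-engine")
        if err := cmd.Run(); err != nil {
            fmt.Println("probe failed:", err) // pgrep exits non-zero when nothing matches
            return
        }
        fmt.Println("probe ok: matching process found")
    }
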
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lltzt" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.313714 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.314097 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.334036 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-lltzt"] Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.467527 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.469100 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.472716 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.476350 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.476748 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdb5t\" (UniqueName: \"kubernetes.io/projected/2607960d-5ee6-4c49-9c3c-3b8083b4bb9e-kube-api-access-xdb5t\") pod \"nova-cell0-cell-mapping-lltzt\" (UID: \"2607960d-5ee6-4c49-9c3c-3b8083b4bb9e\") " pod="openstack/nova-cell0-cell-mapping-lltzt" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.476842 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2607960d-5ee6-4c49-9c3c-3b8083b4bb9e-config-data\") pod \"nova-cell0-cell-mapping-lltzt\" (UID: \"2607960d-5ee6-4c49-9c3c-3b8083b4bb9e\") " pod="openstack/nova-cell0-cell-mapping-lltzt" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.476946 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2607960d-5ee6-4c49-9c3c-3b8083b4bb9e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-lltzt\" (UID: \"2607960d-5ee6-4c49-9c3c-3b8083b4bb9e\") " pod="openstack/nova-cell0-cell-mapping-lltzt" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.476983 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2607960d-5ee6-4c49-9c3c-3b8083b4bb9e-scripts\") pod \"nova-cell0-cell-mapping-lltzt\" (UID: \"2607960d-5ee6-4c49-9c3c-3b8083b4bb9e\") " pod="openstack/nova-cell0-cell-mapping-lltzt" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.484056 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.491290 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.492060 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.508111 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.580983 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43ee4fc2-7757-434d-a22d-8d8e8383e5f2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"43ee4fc2-7757-434d-a22d-8d8e8383e5f2\") " pod="openstack/nova-api-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.581033 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdb5t\" (UniqueName: \"kubernetes.io/projected/2607960d-5ee6-4c49-9c3c-3b8083b4bb9e-kube-api-access-xdb5t\") pod \"nova-cell0-cell-mapping-lltzt\" (UID: \"2607960d-5ee6-4c49-9c3c-3b8083b4bb9e\") " pod="openstack/nova-cell0-cell-mapping-lltzt" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.581058 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43ee4fc2-7757-434d-a22d-8d8e8383e5f2-config-data\") pod \"nova-api-0\" (UID: \"43ee4fc2-7757-434d-a22d-8d8e8383e5f2\") " pod="openstack/nova-api-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.581113 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2607960d-5ee6-4c49-9c3c-3b8083b4bb9e-config-data\") pod \"nova-cell0-cell-mapping-lltzt\" (UID: \"2607960d-5ee6-4c49-9c3c-3b8083b4bb9e\") " pod="openstack/nova-cell0-cell-mapping-lltzt" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.581133 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kxlv\" (UniqueName: \"kubernetes.io/projected/43ee4fc2-7757-434d-a22d-8d8e8383e5f2-kube-api-access-4kxlv\") pod \"nova-api-0\" (UID: \"43ee4fc2-7757-434d-a22d-8d8e8383e5f2\") " pod="openstack/nova-api-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.581194 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/651e93ad-d51d-4bb1-91ef-0784eb952c71-config-data\") pod \"nova-scheduler-0\" (UID: \"651e93ad-d51d-4bb1-91ef-0784eb952c71\") " pod="openstack/nova-scheduler-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.581216 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43ee4fc2-7757-434d-a22d-8d8e8383e5f2-logs\") pod \"nova-api-0\" (UID: \"43ee4fc2-7757-434d-a22d-8d8e8383e5f2\") " pod="openstack/nova-api-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.581234 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2607960d-5ee6-4c49-9c3c-3b8083b4bb9e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-lltzt\" (UID: \"2607960d-5ee6-4c49-9c3c-3b8083b4bb9e\") " 
pod="openstack/nova-cell0-cell-mapping-lltzt" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.581256 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2607960d-5ee6-4c49-9c3c-3b8083b4bb9e-scripts\") pod \"nova-cell0-cell-mapping-lltzt\" (UID: \"2607960d-5ee6-4c49-9c3c-3b8083b4bb9e\") " pod="openstack/nova-cell0-cell-mapping-lltzt" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.581280 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfs59\" (UniqueName: \"kubernetes.io/projected/651e93ad-d51d-4bb1-91ef-0784eb952c71-kube-api-access-dfs59\") pod \"nova-scheduler-0\" (UID: \"651e93ad-d51d-4bb1-91ef-0784eb952c71\") " pod="openstack/nova-scheduler-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.581304 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/651e93ad-d51d-4bb1-91ef-0784eb952c71-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"651e93ad-d51d-4bb1-91ef-0784eb952c71\") " pod="openstack/nova-scheduler-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.589477 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2607960d-5ee6-4c49-9c3c-3b8083b4bb9e-config-data\") pod \"nova-cell0-cell-mapping-lltzt\" (UID: \"2607960d-5ee6-4c49-9c3c-3b8083b4bb9e\") " pod="openstack/nova-cell0-cell-mapping-lltzt" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.589831 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.592080 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2607960d-5ee6-4c49-9c3c-3b8083b4bb9e-scripts\") pod \"nova-cell0-cell-mapping-lltzt\" (UID: \"2607960d-5ee6-4c49-9c3c-3b8083b4bb9e\") " pod="openstack/nova-cell0-cell-mapping-lltzt" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.599075 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.602287 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2607960d-5ee6-4c49-9c3c-3b8083b4bb9e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-lltzt\" (UID: \"2607960d-5ee6-4c49-9c3c-3b8083b4bb9e\") " pod="openstack/nova-cell0-cell-mapping-lltzt" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.610363 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.618630 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdb5t\" (UniqueName: \"kubernetes.io/projected/2607960d-5ee6-4c49-9c3c-3b8083b4bb9e-kube-api-access-xdb5t\") pod \"nova-cell0-cell-mapping-lltzt\" (UID: \"2607960d-5ee6-4c49-9c3c-3b8083b4bb9e\") " pod="openstack/nova-cell0-cell-mapping-lltzt" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.625909 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.633351 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lltzt" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.688136 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kxlv\" (UniqueName: \"kubernetes.io/projected/43ee4fc2-7757-434d-a22d-8d8e8383e5f2-kube-api-access-4kxlv\") pod \"nova-api-0\" (UID: \"43ee4fc2-7757-434d-a22d-8d8e8383e5f2\") " pod="openstack/nova-api-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.688623 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/028f476a-ab2c-424d-aeda-70903e28d17d-logs\") pod \"nova-metadata-0\" (UID: \"028f476a-ab2c-424d-aeda-70903e28d17d\") " pod="openstack/nova-metadata-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.688942 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/028f476a-ab2c-424d-aeda-70903e28d17d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"028f476a-ab2c-424d-aeda-70903e28d17d\") " pod="openstack/nova-metadata-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.689000 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/651e93ad-d51d-4bb1-91ef-0784eb952c71-config-data\") pod \"nova-scheduler-0\" (UID: \"651e93ad-d51d-4bb1-91ef-0784eb952c71\") " pod="openstack/nova-scheduler-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.689065 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43ee4fc2-7757-434d-a22d-8d8e8383e5f2-logs\") pod \"nova-api-0\" (UID: \"43ee4fc2-7757-434d-a22d-8d8e8383e5f2\") " pod="openstack/nova-api-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.689552 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfs59\" (UniqueName: \"kubernetes.io/projected/651e93ad-d51d-4bb1-91ef-0784eb952c71-kube-api-access-dfs59\") pod \"nova-scheduler-0\" (UID: \"651e93ad-d51d-4bb1-91ef-0784eb952c71\") " pod="openstack/nova-scheduler-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.689590 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/651e93ad-d51d-4bb1-91ef-0784eb952c71-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"651e93ad-d51d-4bb1-91ef-0784eb952c71\") " pod="openstack/nova-scheduler-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.689620 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tznrm\" (UniqueName: \"kubernetes.io/projected/028f476a-ab2c-424d-aeda-70903e28d17d-kube-api-access-tznrm\") pod \"nova-metadata-0\" (UID: \"028f476a-ab2c-424d-aeda-70903e28d17d\") " pod="openstack/nova-metadata-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.689740 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43ee4fc2-7757-434d-a22d-8d8e8383e5f2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"43ee4fc2-7757-434d-a22d-8d8e8383e5f2\") " pod="openstack/nova-api-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.689779 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/43ee4fc2-7757-434d-a22d-8d8e8383e5f2-config-data\") pod \"nova-api-0\" (UID: \"43ee4fc2-7757-434d-a22d-8d8e8383e5f2\") " pod="openstack/nova-api-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.689803 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/028f476a-ab2c-424d-aeda-70903e28d17d-config-data\") pod \"nova-metadata-0\" (UID: \"028f476a-ab2c-424d-aeda-70903e28d17d\") " pod="openstack/nova-metadata-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.690055 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43ee4fc2-7757-434d-a22d-8d8e8383e5f2-logs\") pod \"nova-api-0\" (UID: \"43ee4fc2-7757-434d-a22d-8d8e8383e5f2\") " pod="openstack/nova-api-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.704043 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43ee4fc2-7757-434d-a22d-8d8e8383e5f2-config-data\") pod \"nova-api-0\" (UID: \"43ee4fc2-7757-434d-a22d-8d8e8383e5f2\") " pod="openstack/nova-api-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.746360 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/651e93ad-d51d-4bb1-91ef-0784eb952c71-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"651e93ad-d51d-4bb1-91ef-0784eb952c71\") " pod="openstack/nova-scheduler-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.750315 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/651e93ad-d51d-4bb1-91ef-0784eb952c71-config-data\") pod \"nova-scheduler-0\" (UID: \"651e93ad-d51d-4bb1-91ef-0784eb952c71\") " pod="openstack/nova-scheduler-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.750325 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43ee4fc2-7757-434d-a22d-8d8e8383e5f2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"43ee4fc2-7757-434d-a22d-8d8e8383e5f2\") " pod="openstack/nova-api-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.754929 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfs59\" (UniqueName: \"kubernetes.io/projected/651e93ad-d51d-4bb1-91ef-0784eb952c71-kube-api-access-dfs59\") pod \"nova-scheduler-0\" (UID: \"651e93ad-d51d-4bb1-91ef-0784eb952c71\") " pod="openstack/nova-scheduler-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.758861 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kxlv\" (UniqueName: \"kubernetes.io/projected/43ee4fc2-7757-434d-a22d-8d8e8383e5f2-kube-api-access-4kxlv\") pod \"nova-api-0\" (UID: \"43ee4fc2-7757-434d-a22d-8d8e8383e5f2\") " pod="openstack/nova-api-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.759200 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74bfcddf47-tkg4k"] Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.784072 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74bfcddf47-tkg4k" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.793576 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.799059 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.804779 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/028f476a-ab2c-424d-aeda-70903e28d17d-logs\") pod \"nova-metadata-0\" (UID: \"028f476a-ab2c-424d-aeda-70903e28d17d\") " pod="openstack/nova-metadata-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.804842 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/028f476a-ab2c-424d-aeda-70903e28d17d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"028f476a-ab2c-424d-aeda-70903e28d17d\") " pod="openstack/nova-metadata-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.804994 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tznrm\" (UniqueName: \"kubernetes.io/projected/028f476a-ab2c-424d-aeda-70903e28d17d-kube-api-access-tznrm\") pod \"nova-metadata-0\" (UID: \"028f476a-ab2c-424d-aeda-70903e28d17d\") " pod="openstack/nova-metadata-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.805126 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/028f476a-ab2c-424d-aeda-70903e28d17d-config-data\") pod \"nova-metadata-0\" (UID: \"028f476a-ab2c-424d-aeda-70903e28d17d\") " pod="openstack/nova-metadata-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.810202 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/028f476a-ab2c-424d-aeda-70903e28d17d-logs\") pod \"nova-metadata-0\" (UID: \"028f476a-ab2c-424d-aeda-70903e28d17d\") " pod="openstack/nova-metadata-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.816343 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.825415 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tznrm\" (UniqueName: \"kubernetes.io/projected/028f476a-ab2c-424d-aeda-70903e28d17d-kube-api-access-tznrm\") pod \"nova-metadata-0\" (UID: \"028f476a-ab2c-424d-aeda-70903e28d17d\") " pod="openstack/nova-metadata-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.826497 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/028f476a-ab2c-424d-aeda-70903e28d17d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"028f476a-ab2c-424d-aeda-70903e28d17d\") " pod="openstack/nova-metadata-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.827269 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.843435 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/028f476a-ab2c-424d-aeda-70903e28d17d-config-data\") pod \"nova-metadata-0\" (UID: \"028f476a-ab2c-424d-aeda-70903e28d17d\") " pod="openstack/nova-metadata-0" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.853159 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74bfcddf47-tkg4k"] Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.855644 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.883937 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.906821 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tnzd\" (UniqueName: \"kubernetes.io/projected/aa730e48-0dda-48df-9675-d7b3fa3358d1-kube-api-access-2tnzd\") pod \"dnsmasq-dns-74bfcddf47-tkg4k\" (UID: \"aa730e48-0dda-48df-9675-d7b3fa3358d1\") " pod="openstack/dnsmasq-dns-74bfcddf47-tkg4k" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.906887 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa730e48-0dda-48df-9675-d7b3fa3358d1-config\") pod \"dnsmasq-dns-74bfcddf47-tkg4k\" (UID: \"aa730e48-0dda-48df-9675-d7b3fa3358d1\") " pod="openstack/dnsmasq-dns-74bfcddf47-tkg4k" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.906931 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa730e48-0dda-48df-9675-d7b3fa3358d1-dns-svc\") pod \"dnsmasq-dns-74bfcddf47-tkg4k\" (UID: \"aa730e48-0dda-48df-9675-d7b3fa3358d1\") " pod="openstack/dnsmasq-dns-74bfcddf47-tkg4k" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.906976 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa730e48-0dda-48df-9675-d7b3fa3358d1-ovsdbserver-nb\") pod \"dnsmasq-dns-74bfcddf47-tkg4k\" (UID: \"aa730e48-0dda-48df-9675-d7b3fa3358d1\") " pod="openstack/dnsmasq-dns-74bfcddf47-tkg4k" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.907112 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa730e48-0dda-48df-9675-d7b3fa3358d1-ovsdbserver-sb\") pod \"dnsmasq-dns-74bfcddf47-tkg4k\" (UID: \"aa730e48-0dda-48df-9675-d7b3fa3358d1\") " pod="openstack/dnsmasq-dns-74bfcddf47-tkg4k" Dec 02 23:19:10 crc kubenswrapper[4903]: I1202 23:19:10.907373 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa730e48-0dda-48df-9675-d7b3fa3358d1-dns-swift-storage-0\") pod \"dnsmasq-dns-74bfcddf47-tkg4k\" (UID: \"aa730e48-0dda-48df-9675-d7b3fa3358d1\") " pod="openstack/dnsmasq-dns-74bfcddf47-tkg4k" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.008887 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpfst\" 
(UniqueName: \"kubernetes.io/projected/2d896453-a1aa-4cfe-aa9b-03de9d6eb83e-kube-api-access-bpfst\") pod \"nova-cell1-novncproxy-0\" (UID: \"2d896453-a1aa-4cfe-aa9b-03de9d6eb83e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.009125 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa730e48-0dda-48df-9675-d7b3fa3358d1-dns-swift-storage-0\") pod \"dnsmasq-dns-74bfcddf47-tkg4k\" (UID: \"aa730e48-0dda-48df-9675-d7b3fa3358d1\") " pod="openstack/dnsmasq-dns-74bfcddf47-tkg4k" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.009161 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d896453-a1aa-4cfe-aa9b-03de9d6eb83e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2d896453-a1aa-4cfe-aa9b-03de9d6eb83e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.009198 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tnzd\" (UniqueName: \"kubernetes.io/projected/aa730e48-0dda-48df-9675-d7b3fa3358d1-kube-api-access-2tnzd\") pod \"dnsmasq-dns-74bfcddf47-tkg4k\" (UID: \"aa730e48-0dda-48df-9675-d7b3fa3358d1\") " pod="openstack/dnsmasq-dns-74bfcddf47-tkg4k" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.009233 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa730e48-0dda-48df-9675-d7b3fa3358d1-config\") pod \"dnsmasq-dns-74bfcddf47-tkg4k\" (UID: \"aa730e48-0dda-48df-9675-d7b3fa3358d1\") " pod="openstack/dnsmasq-dns-74bfcddf47-tkg4k" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.009365 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa730e48-0dda-48df-9675-d7b3fa3358d1-dns-svc\") pod \"dnsmasq-dns-74bfcddf47-tkg4k\" (UID: \"aa730e48-0dda-48df-9675-d7b3fa3358d1\") " pod="openstack/dnsmasq-dns-74bfcddf47-tkg4k" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.009389 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d896453-a1aa-4cfe-aa9b-03de9d6eb83e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2d896453-a1aa-4cfe-aa9b-03de9d6eb83e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.009432 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa730e48-0dda-48df-9675-d7b3fa3358d1-ovsdbserver-nb\") pod \"dnsmasq-dns-74bfcddf47-tkg4k\" (UID: \"aa730e48-0dda-48df-9675-d7b3fa3358d1\") " pod="openstack/dnsmasq-dns-74bfcddf47-tkg4k" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.009447 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa730e48-0dda-48df-9675-d7b3fa3358d1-ovsdbserver-sb\") pod \"dnsmasq-dns-74bfcddf47-tkg4k\" (UID: \"aa730e48-0dda-48df-9675-d7b3fa3358d1\") " pod="openstack/dnsmasq-dns-74bfcddf47-tkg4k" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.011632 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/aa730e48-0dda-48df-9675-d7b3fa3358d1-ovsdbserver-nb\") pod \"dnsmasq-dns-74bfcddf47-tkg4k\" (UID: \"aa730e48-0dda-48df-9675-d7b3fa3358d1\") " pod="openstack/dnsmasq-dns-74bfcddf47-tkg4k" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.011696 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa730e48-0dda-48df-9675-d7b3fa3358d1-dns-svc\") pod \"dnsmasq-dns-74bfcddf47-tkg4k\" (UID: \"aa730e48-0dda-48df-9675-d7b3fa3358d1\") " pod="openstack/dnsmasq-dns-74bfcddf47-tkg4k" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.011911 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa730e48-0dda-48df-9675-d7b3fa3358d1-dns-swift-storage-0\") pod \"dnsmasq-dns-74bfcddf47-tkg4k\" (UID: \"aa730e48-0dda-48df-9675-d7b3fa3358d1\") " pod="openstack/dnsmasq-dns-74bfcddf47-tkg4k" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.013695 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa730e48-0dda-48df-9675-d7b3fa3358d1-config\") pod \"dnsmasq-dns-74bfcddf47-tkg4k\" (UID: \"aa730e48-0dda-48df-9675-d7b3fa3358d1\") " pod="openstack/dnsmasq-dns-74bfcddf47-tkg4k" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.013807 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa730e48-0dda-48df-9675-d7b3fa3358d1-ovsdbserver-sb\") pod \"dnsmasq-dns-74bfcddf47-tkg4k\" (UID: \"aa730e48-0dda-48df-9675-d7b3fa3358d1\") " pod="openstack/dnsmasq-dns-74bfcddf47-tkg4k" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.033576 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tnzd\" (UniqueName: \"kubernetes.io/projected/aa730e48-0dda-48df-9675-d7b3fa3358d1-kube-api-access-2tnzd\") pod \"dnsmasq-dns-74bfcddf47-tkg4k\" (UID: \"aa730e48-0dda-48df-9675-d7b3fa3358d1\") " pod="openstack/dnsmasq-dns-74bfcddf47-tkg4k" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.110871 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpfst\" (UniqueName: \"kubernetes.io/projected/2d896453-a1aa-4cfe-aa9b-03de9d6eb83e-kube-api-access-bpfst\") pod \"nova-cell1-novncproxy-0\" (UID: \"2d896453-a1aa-4cfe-aa9b-03de9d6eb83e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.110934 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d896453-a1aa-4cfe-aa9b-03de9d6eb83e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2d896453-a1aa-4cfe-aa9b-03de9d6eb83e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.111031 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d896453-a1aa-4cfe-aa9b-03de9d6eb83e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2d896453-a1aa-4cfe-aa9b-03de9d6eb83e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.115478 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d896453-a1aa-4cfe-aa9b-03de9d6eb83e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"2d896453-a1aa-4cfe-aa9b-03de9d6eb83e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.115665 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d896453-a1aa-4cfe-aa9b-03de9d6eb83e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2d896453-a1aa-4cfe-aa9b-03de9d6eb83e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.126933 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpfst\" (UniqueName: \"kubernetes.io/projected/2d896453-a1aa-4cfe-aa9b-03de9d6eb83e-kube-api-access-bpfst\") pod \"nova-cell1-novncproxy-0\" (UID: \"2d896453-a1aa-4cfe-aa9b-03de9d6eb83e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.133944 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.160392 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74bfcddf47-tkg4k" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.188991 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.307325 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-lltzt"] Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.402022 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 23:19:11 crc kubenswrapper[4903]: W1202 23:19:11.416130 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod651e93ad_d51d_4bb1_91ef_0784eb952c71.slice/crio-423f3d21ecd7f423782ea6f793fad5a89516db297c94463f4a0999c8c24fa78e WatchSource:0}: Error finding container 423f3d21ecd7f423782ea6f793fad5a89516db297c94463f4a0999c8c24fa78e: Status 404 returned error can't find the container with id 423f3d21ecd7f423782ea6f793fad5a89516db297c94463f4a0999c8c24fa78e Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.484216 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"651e93ad-d51d-4bb1-91ef-0784eb952c71","Type":"ContainerStarted","Data":"423f3d21ecd7f423782ea6f793fad5a89516db297c94463f4a0999c8c24fa78e"} Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.490802 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lltzt" event={"ID":"2607960d-5ee6-4c49-9c3c-3b8083b4bb9e","Type":"ContainerStarted","Data":"54bc161326da58f4266e75297bffcbc79172c3bfabab1862939b350f69b44d8b"} Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.526926 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.550031 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-t5jc4"] Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.551535 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-t5jc4" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.555074 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.555726 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.565960 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-t5jc4"] Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.623416 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7768c2b-8cda-4ff7-b845-d2762445cb9e-config-data\") pod \"nova-cell1-conductor-db-sync-t5jc4\" (UID: \"b7768c2b-8cda-4ff7-b845-d2762445cb9e\") " pod="openstack/nova-cell1-conductor-db-sync-t5jc4" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.623489 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7768c2b-8cda-4ff7-b845-d2762445cb9e-scripts\") pod \"nova-cell1-conductor-db-sync-t5jc4\" (UID: \"b7768c2b-8cda-4ff7-b845-d2762445cb9e\") " pod="openstack/nova-cell1-conductor-db-sync-t5jc4" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.623563 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7768c2b-8cda-4ff7-b845-d2762445cb9e-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-t5jc4\" (UID: \"b7768c2b-8cda-4ff7-b845-d2762445cb9e\") " pod="openstack/nova-cell1-conductor-db-sync-t5jc4" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.623677 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz7f8\" (UniqueName: \"kubernetes.io/projected/b7768c2b-8cda-4ff7-b845-d2762445cb9e-kube-api-access-tz7f8\") pod \"nova-cell1-conductor-db-sync-t5jc4\" (UID: \"b7768c2b-8cda-4ff7-b845-d2762445cb9e\") " pod="openstack/nova-cell1-conductor-db-sync-t5jc4" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.713365 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.725402 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz7f8\" (UniqueName: \"kubernetes.io/projected/b7768c2b-8cda-4ff7-b845-d2762445cb9e-kube-api-access-tz7f8\") pod \"nova-cell1-conductor-db-sync-t5jc4\" (UID: \"b7768c2b-8cda-4ff7-b845-d2762445cb9e\") " pod="openstack/nova-cell1-conductor-db-sync-t5jc4" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.725520 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7768c2b-8cda-4ff7-b845-d2762445cb9e-config-data\") pod \"nova-cell1-conductor-db-sync-t5jc4\" (UID: \"b7768c2b-8cda-4ff7-b845-d2762445cb9e\") " pod="openstack/nova-cell1-conductor-db-sync-t5jc4" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.725606 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7768c2b-8cda-4ff7-b845-d2762445cb9e-scripts\") pod \"nova-cell1-conductor-db-sync-t5jc4\" (UID: 
\"b7768c2b-8cda-4ff7-b845-d2762445cb9e\") " pod="openstack/nova-cell1-conductor-db-sync-t5jc4" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.725708 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7768c2b-8cda-4ff7-b845-d2762445cb9e-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-t5jc4\" (UID: \"b7768c2b-8cda-4ff7-b845-d2762445cb9e\") " pod="openstack/nova-cell1-conductor-db-sync-t5jc4" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.728763 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74bfcddf47-tkg4k"] Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.729362 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7768c2b-8cda-4ff7-b845-d2762445cb9e-config-data\") pod \"nova-cell1-conductor-db-sync-t5jc4\" (UID: \"b7768c2b-8cda-4ff7-b845-d2762445cb9e\") " pod="openstack/nova-cell1-conductor-db-sync-t5jc4" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.730266 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7768c2b-8cda-4ff7-b845-d2762445cb9e-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-t5jc4\" (UID: \"b7768c2b-8cda-4ff7-b845-d2762445cb9e\") " pod="openstack/nova-cell1-conductor-db-sync-t5jc4" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.730493 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7768c2b-8cda-4ff7-b845-d2762445cb9e-scripts\") pod \"nova-cell1-conductor-db-sync-t5jc4\" (UID: \"b7768c2b-8cda-4ff7-b845-d2762445cb9e\") " pod="openstack/nova-cell1-conductor-db-sync-t5jc4" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.739318 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz7f8\" (UniqueName: \"kubernetes.io/projected/b7768c2b-8cda-4ff7-b845-d2762445cb9e-kube-api-access-tz7f8\") pod \"nova-cell1-conductor-db-sync-t5jc4\" (UID: \"b7768c2b-8cda-4ff7-b845-d2762445cb9e\") " pod="openstack/nova-cell1-conductor-db-sync-t5jc4" Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.844230 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 23:19:11 crc kubenswrapper[4903]: I1202 23:19:11.874705 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-t5jc4" Dec 02 23:19:12 crc kubenswrapper[4903]: I1202 23:19:12.300496 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-t5jc4"] Dec 02 23:19:12 crc kubenswrapper[4903]: I1202 23:19:12.509678 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2d896453-a1aa-4cfe-aa9b-03de9d6eb83e","Type":"ContainerStarted","Data":"faa5154357b84ca6a22cd957e36a12ce0a1063537081786e1da99abef2a2d194"} Dec 02 23:19:12 crc kubenswrapper[4903]: I1202 23:19:12.513216 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"43ee4fc2-7757-434d-a22d-8d8e8383e5f2","Type":"ContainerStarted","Data":"cb19031f53652701065bc882593bda1c2750150f733211fce225ad92bd25ff4f"} Dec 02 23:19:12 crc kubenswrapper[4903]: I1202 23:19:12.515475 4903 generic.go:334] "Generic (PLEG): container finished" podID="aa730e48-0dda-48df-9675-d7b3fa3358d1" containerID="a716027066d338e2e3b6141e352f3e9824b2ee70d8d13230e5432be890cf11f2" exitCode=0 Dec 02 23:19:12 crc kubenswrapper[4903]: I1202 23:19:12.515518 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74bfcddf47-tkg4k" event={"ID":"aa730e48-0dda-48df-9675-d7b3fa3358d1","Type":"ContainerDied","Data":"a716027066d338e2e3b6141e352f3e9824b2ee70d8d13230e5432be890cf11f2"} Dec 02 23:19:12 crc kubenswrapper[4903]: I1202 23:19:12.515534 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74bfcddf47-tkg4k" event={"ID":"aa730e48-0dda-48df-9675-d7b3fa3358d1","Type":"ContainerStarted","Data":"536cdde2bb74c9269fd3e87e1dc2956139024939ad82b13dfedcc8a67458c549"} Dec 02 23:19:12 crc kubenswrapper[4903]: I1202 23:19:12.520375 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lltzt" event={"ID":"2607960d-5ee6-4c49-9c3c-3b8083b4bb9e","Type":"ContainerStarted","Data":"0430dcc5aa8a1e41c2dfa5b2c7b415e87fce339154a8388b1654d601156c82f7"} Dec 02 23:19:12 crc kubenswrapper[4903]: I1202 23:19:12.522733 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"028f476a-ab2c-424d-aeda-70903e28d17d","Type":"ContainerStarted","Data":"bfb1fccc4c1997e4133fb4d81f80146a4dd14cab6ab0180b81ae5a7d8423f0fa"} Dec 02 23:19:12 crc kubenswrapper[4903]: I1202 23:19:12.530595 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-t5jc4" event={"ID":"b7768c2b-8cda-4ff7-b845-d2762445cb9e","Type":"ContainerStarted","Data":"b425fb41f33d997e97a698ddd938c3c9309f6689430dec633d694cf6224326a3"} Dec 02 23:19:12 crc kubenswrapper[4903]: I1202 23:19:12.567588 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-t5jc4" podStartSLOduration=1.5675700670000001 podStartE2EDuration="1.567570067s" podCreationTimestamp="2025-12-02 23:19:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:19:12.550902943 +0000 UTC m=+1291.259457226" watchObservedRunningTime="2025-12-02 23:19:12.567570067 +0000 UTC m=+1291.276124340" Dec 02 23:19:12 crc kubenswrapper[4903]: I1202 23:19:12.594199 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-lltzt" podStartSLOduration=2.594182313 podStartE2EDuration="2.594182313s" podCreationTimestamp="2025-12-02 23:19:10 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:19:12.564540353 +0000 UTC m=+1291.273094636" watchObservedRunningTime="2025-12-02 23:19:12.594182313 +0000 UTC m=+1291.302736596" Dec 02 23:19:12 crc kubenswrapper[4903]: I1202 23:19:12.793009 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 02 23:19:13 crc kubenswrapper[4903]: I1202 23:19:13.546384 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-t5jc4" event={"ID":"b7768c2b-8cda-4ff7-b845-d2762445cb9e","Type":"ContainerStarted","Data":"af32ab0adc21ad36d3f3f69152af8e2bd8d59cb9db0dc0caff8d04b157f830b4"} Dec 02 23:19:13 crc kubenswrapper[4903]: I1202 23:19:13.551355 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74bfcddf47-tkg4k" event={"ID":"aa730e48-0dda-48df-9675-d7b3fa3358d1","Type":"ContainerStarted","Data":"158115072b95e7b226b9d6babda6213ce44b89d3090d21a48e08d8217cdfdbc0"} Dec 02 23:19:13 crc kubenswrapper[4903]: I1202 23:19:13.551447 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74bfcddf47-tkg4k" Dec 02 23:19:13 crc kubenswrapper[4903]: I1202 23:19:13.575190 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74bfcddf47-tkg4k" podStartSLOduration=3.575167462 podStartE2EDuration="3.575167462s" podCreationTimestamp="2025-12-02 23:19:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:19:13.572085806 +0000 UTC m=+1292.280640099" watchObservedRunningTime="2025-12-02 23:19:13.575167462 +0000 UTC m=+1292.283721745" Dec 02 23:19:14 crc kubenswrapper[4903]: I1202 23:19:14.954083 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 23:19:14 crc kubenswrapper[4903]: I1202 23:19:14.966288 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 23:19:15 crc kubenswrapper[4903]: I1202 23:19:15.599725 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"651e93ad-d51d-4bb1-91ef-0784eb952c71","Type":"ContainerStarted","Data":"1891386b971c2768f3fc1d46ca2d9121897fb2e9195767ba08771730db965725"} Dec 02 23:19:15 crc kubenswrapper[4903]: I1202 23:19:15.604636 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"43ee4fc2-7757-434d-a22d-8d8e8383e5f2","Type":"ContainerStarted","Data":"b32a589becebcb013dd2c9424e04c3a33d89f1a21bdc9034edc66e8b58da2296"} Dec 02 23:19:15 crc kubenswrapper[4903]: I1202 23:19:15.607195 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"028f476a-ab2c-424d-aeda-70903e28d17d","Type":"ContainerStarted","Data":"0967568d5753efb623bb71f13ba3aee464ab907cb7257bf8ae9e3adc0817d9ce"} Dec 02 23:19:15 crc kubenswrapper[4903]: I1202 23:19:15.607236 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"028f476a-ab2c-424d-aeda-70903e28d17d","Type":"ContainerStarted","Data":"03630a66672c5422a3192b948a9c7bca483ca08da446a0a7fb3b3cd00ce20e80"} Dec 02 23:19:15 crc kubenswrapper[4903]: I1202 23:19:15.607259 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="028f476a-ab2c-424d-aeda-70903e28d17d" 
containerName="nova-metadata-log" containerID="cri-o://03630a66672c5422a3192b948a9c7bca483ca08da446a0a7fb3b3cd00ce20e80" gracePeriod=30 Dec 02 23:19:15 crc kubenswrapper[4903]: I1202 23:19:15.607296 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="028f476a-ab2c-424d-aeda-70903e28d17d" containerName="nova-metadata-metadata" containerID="cri-o://0967568d5753efb623bb71f13ba3aee464ab907cb7257bf8ae9e3adc0817d9ce" gracePeriod=30 Dec 02 23:19:15 crc kubenswrapper[4903]: I1202 23:19:15.617686 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="2d896453-a1aa-4cfe-aa9b-03de9d6eb83e" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://0bce510514508ad67c4c18a6ae59b1bbcd1e17dfd434d1615362db805334bbfb" gracePeriod=30 Dec 02 23:19:15 crc kubenswrapper[4903]: I1202 23:19:15.625929 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2d896453-a1aa-4cfe-aa9b-03de9d6eb83e","Type":"ContainerStarted","Data":"0bce510514508ad67c4c18a6ae59b1bbcd1e17dfd434d1615362db805334bbfb"} Dec 02 23:19:15 crc kubenswrapper[4903]: I1202 23:19:15.629295 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.089165656 podStartE2EDuration="5.629273658s" podCreationTimestamp="2025-12-02 23:19:10 +0000 UTC" firstStartedPulling="2025-12-02 23:19:11.419236331 +0000 UTC m=+1290.127790604" lastFinishedPulling="2025-12-02 23:19:14.959344323 +0000 UTC m=+1293.667898606" observedRunningTime="2025-12-02 23:19:15.616508516 +0000 UTC m=+1294.325062789" watchObservedRunningTime="2025-12-02 23:19:15.629273658 +0000 UTC m=+1294.337827941" Dec 02 23:19:15 crc kubenswrapper[4903]: I1202 23:19:15.643717 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.386986438 podStartE2EDuration="5.643697814s" podCreationTimestamp="2025-12-02 23:19:10 +0000 UTC" firstStartedPulling="2025-12-02 23:19:11.712390884 +0000 UTC m=+1290.420945167" lastFinishedPulling="2025-12-02 23:19:14.96910226 +0000 UTC m=+1293.677656543" observedRunningTime="2025-12-02 23:19:15.638236828 +0000 UTC m=+1294.346791111" watchObservedRunningTime="2025-12-02 23:19:15.643697814 +0000 UTC m=+1294.352252097" Dec 02 23:19:15 crc kubenswrapper[4903]: I1202 23:19:15.659369 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.551496245 podStartE2EDuration="5.659346187s" podCreationTimestamp="2025-12-02 23:19:10 +0000 UTC" firstStartedPulling="2025-12-02 23:19:11.858113372 +0000 UTC m=+1290.566667655" lastFinishedPulling="2025-12-02 23:19:14.965963314 +0000 UTC m=+1293.674517597" observedRunningTime="2025-12-02 23:19:15.655141957 +0000 UTC m=+1294.363696240" watchObservedRunningTime="2025-12-02 23:19:15.659346187 +0000 UTC m=+1294.367900470" Dec 02 23:19:15 crc kubenswrapper[4903]: I1202 23:19:15.817724 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 02 23:19:16 crc kubenswrapper[4903]: I1202 23:19:16.134908 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 23:19:16 crc kubenswrapper[4903]: I1202 23:19:16.135305 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 23:19:16 crc 
Dec 02 23:19:16 crc kubenswrapper[4903]: I1202 23:19:16.190088 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Dec 02 23:19:16 crc kubenswrapper[4903]: I1202 23:19:16.626039 4903 generic.go:334] "Generic (PLEG): container finished" podID="028f476a-ab2c-424d-aeda-70903e28d17d" containerID="03630a66672c5422a3192b948a9c7bca483ca08da446a0a7fb3b3cd00ce20e80" exitCode=143
Dec 02 23:19:16 crc kubenswrapper[4903]: I1202 23:19:16.626129 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"028f476a-ab2c-424d-aeda-70903e28d17d","Type":"ContainerDied","Data":"03630a66672c5422a3192b948a9c7bca483ca08da446a0a7fb3b3cd00ce20e80"}
Dec 02 23:19:16 crc kubenswrapper[4903]: I1202 23:19:16.629891 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"43ee4fc2-7757-434d-a22d-8d8e8383e5f2","Type":"ContainerStarted","Data":"cef480c0b6cceb3880941ee6f799164540ba753607baf7729195d0701edb72b9"}
Dec 02 23:19:17 crc kubenswrapper[4903]: I1202 23:19:17.783869 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.373261371 podStartE2EDuration="7.783823739s" podCreationTimestamp="2025-12-02 23:19:10 +0000 UTC" firstStartedPulling="2025-12-02 23:19:11.549627873 +0000 UTC m=+1290.258182156" lastFinishedPulling="2025-12-02 23:19:14.960190231 +0000 UTC m=+1293.668744524" observedRunningTime="2025-12-02 23:19:16.654121199 +0000 UTC m=+1295.362675482" watchObservedRunningTime="2025-12-02 23:19:17.783823739 +0000 UTC m=+1296.492378022"
Dec 02 23:19:17 crc kubenswrapper[4903]: I1202 23:19:17.787366 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 02 23:19:17 crc kubenswrapper[4903]: I1202 23:19:17.787556 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="2372c82e-7656-4307-946c-155ec0d8cb3d" containerName="kube-state-metrics" containerID="cri-o://b4ab52ad181c2185f46d97c6f90fcf70aee227ad20e6a21ab39a6eef31041ca6" gracePeriod=30
Dec 02 23:19:18 crc kubenswrapper[4903]: I1202 23:19:18.669580 4903 generic.go:334] "Generic (PLEG): container finished" podID="2372c82e-7656-4307-946c-155ec0d8cb3d" containerID="b4ab52ad181c2185f46d97c6f90fcf70aee227ad20e6a21ab39a6eef31041ca6" exitCode=2
Dec 02 23:19:18 crc kubenswrapper[4903]: I1202 23:19:18.670247 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2372c82e-7656-4307-946c-155ec0d8cb3d","Type":"ContainerDied","Data":"b4ab52ad181c2185f46d97c6f90fcf70aee227ad20e6a21ab39a6eef31041ca6"}
Dec 02 23:19:18 crc kubenswrapper[4903]: I1202 23:19:18.801169 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
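
The two exit codes above decode by the usual 128+n signal convention: 143 = 128 + 15 means nova-metadata-log stopped on SIGTERM within its 30s grace period, while kube-state-metrics' exitCode=2 is below 128 and therefore the process's own exit status (no SIGKILL escalation appears in this window). A Go sketch of the decoding rule (an illustration of the convention, not kubelet code):

    package main

    import (
        "fmt"
        "syscall"
    )

    // decode interprets a container exit code using the 128+n signal convention.
    func decode(code int) string {
        if code > 128 {
            sig := syscall.Signal(code - 128)
            return fmt.Sprintf("terminated by signal %d (%v)", code-128, sig)
        }
        return fmt.Sprintf("exited with status %d", code)
    }

    func main() {
        fmt.Println(143, "->", decode(143)) // SIGTERM: the grace-period kill above
        fmt.Println(2, "->", decode(2))     // kube-state-metrics' own exit status
        fmt.Println(0, "->", decode(0))     // e.g. the dnsmasq container that finished earlier
    }
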
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 23:19:18 crc kubenswrapper[4903]: I1202 23:19:18.882780 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whl67\" (UniqueName: \"kubernetes.io/projected/2372c82e-7656-4307-946c-155ec0d8cb3d-kube-api-access-whl67\") pod \"2372c82e-7656-4307-946c-155ec0d8cb3d\" (UID: \"2372c82e-7656-4307-946c-155ec0d8cb3d\") " Dec 02 23:19:18 crc kubenswrapper[4903]: I1202 23:19:18.895833 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2372c82e-7656-4307-946c-155ec0d8cb3d-kube-api-access-whl67" (OuterVolumeSpecName: "kube-api-access-whl67") pod "2372c82e-7656-4307-946c-155ec0d8cb3d" (UID: "2372c82e-7656-4307-946c-155ec0d8cb3d"). InnerVolumeSpecName "kube-api-access-whl67". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:19:18 crc kubenswrapper[4903]: I1202 23:19:18.984599 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whl67\" (UniqueName: \"kubernetes.io/projected/2372c82e-7656-4307-946c-155ec0d8cb3d-kube-api-access-whl67\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:19 crc kubenswrapper[4903]: I1202 23:19:19.683503 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2372c82e-7656-4307-946c-155ec0d8cb3d","Type":"ContainerDied","Data":"ccb7866320f01c6bda0137bc67a76d6abee63a337e9d30ca81581db00ba51d31"} Dec 02 23:19:19 crc kubenswrapper[4903]: I1202 23:19:19.683827 4903 scope.go:117] "RemoveContainer" containerID="b4ab52ad181c2185f46d97c6f90fcf70aee227ad20e6a21ab39a6eef31041ca6" Dec 02 23:19:19 crc kubenswrapper[4903]: I1202 23:19:19.683989 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 23:19:19 crc kubenswrapper[4903]: I1202 23:19:19.692387 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:19:19 crc kubenswrapper[4903]: I1202 23:19:19.692764 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8" containerName="ceilometer-central-agent" containerID="cri-o://86f46a583e97f2c638b5770d8ecee1b15f25956b4b4d4f45a9cc3b4494f1f0fb" gracePeriod=30 Dec 02 23:19:19 crc kubenswrapper[4903]: I1202 23:19:19.693501 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8" containerName="sg-core" containerID="cri-o://120a65b9e41382fbcc4db8172753c708144ddc21d430539683ee2037efc509f0" gracePeriod=30 Dec 02 23:19:19 crc kubenswrapper[4903]: I1202 23:19:19.693548 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8" containerName="proxy-httpd" containerID="cri-o://590d8782376ce97e339d90520cb424377ea5bd57dc7dfc1cd8f7735a3223e6e2" gracePeriod=30 Dec 02 23:19:19 crc kubenswrapper[4903]: I1202 23:19:19.693563 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8" containerName="ceilometer-notification-agent" containerID="cri-o://56181b13e1274405076b767f3a034e823f204489491952ee0e1036724e22993a" gracePeriod=30 Dec 02 23:19:19 crc kubenswrapper[4903]: I1202 23:19:19.717337 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 
23:19:19 crc kubenswrapper[4903]: I1202 23:19:19.728590 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 23:19:19 crc kubenswrapper[4903]: I1202 23:19:19.754855 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 23:19:19 crc kubenswrapper[4903]: E1202 23:19:19.755414 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2372c82e-7656-4307-946c-155ec0d8cb3d" containerName="kube-state-metrics" Dec 02 23:19:19 crc kubenswrapper[4903]: I1202 23:19:19.755435 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="2372c82e-7656-4307-946c-155ec0d8cb3d" containerName="kube-state-metrics" Dec 02 23:19:19 crc kubenswrapper[4903]: I1202 23:19:19.755732 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="2372c82e-7656-4307-946c-155ec0d8cb3d" containerName="kube-state-metrics" Dec 02 23:19:19 crc kubenswrapper[4903]: I1202 23:19:19.756559 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 23:19:19 crc kubenswrapper[4903]: I1202 23:19:19.760468 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 02 23:19:19 crc kubenswrapper[4903]: I1202 23:19:19.760797 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 02 23:19:19 crc kubenswrapper[4903]: I1202 23:19:19.765464 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 23:19:19 crc kubenswrapper[4903]: I1202 23:19:19.799030 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a0962e8-541d-4a75-b629-613d6d19f47e-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8a0962e8-541d-4a75-b629-613d6d19f47e\") " pod="openstack/kube-state-metrics-0" Dec 02 23:19:19 crc kubenswrapper[4903]: I1202 23:19:19.799074 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8a0962e8-541d-4a75-b629-613d6d19f47e-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8a0962e8-541d-4a75-b629-613d6d19f47e\") " pod="openstack/kube-state-metrics-0" Dec 02 23:19:19 crc kubenswrapper[4903]: I1202 23:19:19.799105 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95vtk\" (UniqueName: \"kubernetes.io/projected/8a0962e8-541d-4a75-b629-613d6d19f47e-kube-api-access-95vtk\") pod \"kube-state-metrics-0\" (UID: \"8a0962e8-541d-4a75-b629-613d6d19f47e\") " pod="openstack/kube-state-metrics-0" Dec 02 23:19:19 crc kubenswrapper[4903]: I1202 23:19:19.799134 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a0962e8-541d-4a75-b629-613d6d19f47e-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8a0962e8-541d-4a75-b629-613d6d19f47e\") " pod="openstack/kube-state-metrics-0" Dec 02 23:19:19 crc kubenswrapper[4903]: I1202 23:19:19.901609 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a0962e8-541d-4a75-b629-613d6d19f47e-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: 
\"8a0962e8-541d-4a75-b629-613d6d19f47e\") " pod="openstack/kube-state-metrics-0" Dec 02 23:19:19 crc kubenswrapper[4903]: I1202 23:19:19.901691 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8a0962e8-541d-4a75-b629-613d6d19f47e-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8a0962e8-541d-4a75-b629-613d6d19f47e\") " pod="openstack/kube-state-metrics-0" Dec 02 23:19:19 crc kubenswrapper[4903]: I1202 23:19:19.901737 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95vtk\" (UniqueName: \"kubernetes.io/projected/8a0962e8-541d-4a75-b629-613d6d19f47e-kube-api-access-95vtk\") pod \"kube-state-metrics-0\" (UID: \"8a0962e8-541d-4a75-b629-613d6d19f47e\") " pod="openstack/kube-state-metrics-0" Dec 02 23:19:19 crc kubenswrapper[4903]: I1202 23:19:19.901776 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a0962e8-541d-4a75-b629-613d6d19f47e-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8a0962e8-541d-4a75-b629-613d6d19f47e\") " pod="openstack/kube-state-metrics-0" Dec 02 23:19:19 crc kubenswrapper[4903]: I1202 23:19:19.910434 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8a0962e8-541d-4a75-b629-613d6d19f47e-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8a0962e8-541d-4a75-b629-613d6d19f47e\") " pod="openstack/kube-state-metrics-0" Dec 02 23:19:19 crc kubenswrapper[4903]: I1202 23:19:19.910514 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a0962e8-541d-4a75-b629-613d6d19f47e-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8a0962e8-541d-4a75-b629-613d6d19f47e\") " pod="openstack/kube-state-metrics-0" Dec 02 23:19:19 crc kubenswrapper[4903]: I1202 23:19:19.915895 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a0962e8-541d-4a75-b629-613d6d19f47e-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8a0962e8-541d-4a75-b629-613d6d19f47e\") " pod="openstack/kube-state-metrics-0" Dec 02 23:19:19 crc kubenswrapper[4903]: I1202 23:19:19.919426 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95vtk\" (UniqueName: \"kubernetes.io/projected/8a0962e8-541d-4a75-b629-613d6d19f47e-kube-api-access-95vtk\") pod \"kube-state-metrics-0\" (UID: \"8a0962e8-541d-4a75-b629-613d6d19f47e\") " pod="openstack/kube-state-metrics-0" Dec 02 23:19:20 crc kubenswrapper[4903]: I1202 23:19:20.081111 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 23:19:20 crc kubenswrapper[4903]: I1202 23:19:20.553256 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 23:19:20 crc kubenswrapper[4903]: I1202 23:19:20.693265 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8a0962e8-541d-4a75-b629-613d6d19f47e","Type":"ContainerStarted","Data":"6ce767ec05e46b141468386362b167ff4f8341a902433a82db306bb13e310d43"} Dec 02 23:19:20 crc kubenswrapper[4903]: I1202 23:19:20.696805 4903 generic.go:334] "Generic (PLEG): container finished" podID="e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8" containerID="590d8782376ce97e339d90520cb424377ea5bd57dc7dfc1cd8f7735a3223e6e2" exitCode=0 Dec 02 23:19:20 crc kubenswrapper[4903]: I1202 23:19:20.696834 4903 generic.go:334] "Generic (PLEG): container finished" podID="e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8" containerID="120a65b9e41382fbcc4db8172753c708144ddc21d430539683ee2037efc509f0" exitCode=2 Dec 02 23:19:20 crc kubenswrapper[4903]: I1202 23:19:20.696841 4903 generic.go:334] "Generic (PLEG): container finished" podID="e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8" containerID="86f46a583e97f2c638b5770d8ecee1b15f25956b4b4d4f45a9cc3b4494f1f0fb" exitCode=0 Dec 02 23:19:20 crc kubenswrapper[4903]: I1202 23:19:20.696872 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8","Type":"ContainerDied","Data":"590d8782376ce97e339d90520cb424377ea5bd57dc7dfc1cd8f7735a3223e6e2"} Dec 02 23:19:20 crc kubenswrapper[4903]: I1202 23:19:20.696906 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8","Type":"ContainerDied","Data":"120a65b9e41382fbcc4db8172753c708144ddc21d430539683ee2037efc509f0"} Dec 02 23:19:20 crc kubenswrapper[4903]: I1202 23:19:20.696917 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8","Type":"ContainerDied","Data":"86f46a583e97f2c638b5770d8ecee1b15f25956b4b4d4f45a9cc3b4494f1f0fb"} Dec 02 23:19:20 crc kubenswrapper[4903]: I1202 23:19:20.796428 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 23:19:20 crc kubenswrapper[4903]: I1202 23:19:20.796477 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 23:19:20 crc kubenswrapper[4903]: I1202 23:19:20.817261 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 02 23:19:20 crc kubenswrapper[4903]: I1202 23:19:20.848801 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.162726 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74bfcddf47-tkg4k" Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.219484 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6ffc74bd45-2xt7m"] Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.220194 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6ffc74bd45-2xt7m" podUID="c9ae404e-88a5-4e23-a7f8-e2e1198cfc27" containerName="dnsmasq-dns" containerID="cri-o://340273657b53c6b856174a93cb9e8df1e43d62f5d0e0ffc49bf4345b3a134ccc" 
gracePeriod=10 Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.652331 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2372c82e-7656-4307-946c-155ec0d8cb3d" path="/var/lib/kubelet/pods/2372c82e-7656-4307-946c-155ec0d8cb3d/volumes" Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.678452 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ffc74bd45-2xt7m" Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.724771 4903 generic.go:334] "Generic (PLEG): container finished" podID="2607960d-5ee6-4c49-9c3c-3b8083b4bb9e" containerID="0430dcc5aa8a1e41c2dfa5b2c7b415e87fce339154a8388b1654d601156c82f7" exitCode=0 Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.724848 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lltzt" event={"ID":"2607960d-5ee6-4c49-9c3c-3b8083b4bb9e","Type":"ContainerDied","Data":"0430dcc5aa8a1e41c2dfa5b2c7b415e87fce339154a8388b1654d601156c82f7"} Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.728796 4903 generic.go:334] "Generic (PLEG): container finished" podID="c9ae404e-88a5-4e23-a7f8-e2e1198cfc27" containerID="340273657b53c6b856174a93cb9e8df1e43d62f5d0e0ffc49bf4345b3a134ccc" exitCode=0 Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.728925 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ffc74bd45-2xt7m" Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.729334 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffc74bd45-2xt7m" event={"ID":"c9ae404e-88a5-4e23-a7f8-e2e1198cfc27","Type":"ContainerDied","Data":"340273657b53c6b856174a93cb9e8df1e43d62f5d0e0ffc49bf4345b3a134ccc"} Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.729365 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffc74bd45-2xt7m" event={"ID":"c9ae404e-88a5-4e23-a7f8-e2e1198cfc27","Type":"ContainerDied","Data":"9860acf549d083691ebe1ad6e3a08184125470eb8f1ed3335a55b825ddf99c13"} Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.729382 4903 scope.go:117] "RemoveContainer" containerID="340273657b53c6b856174a93cb9e8df1e43d62f5d0e0ffc49bf4345b3a134ccc" Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.739104 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8a0962e8-541d-4a75-b629-613d6d19f47e","Type":"ContainerStarted","Data":"6f404206fcf602d470b50e11157ea98a0140b7617b39b9bda211ad8bc67f4ac2"} Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.739514 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.748523 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9ae404e-88a5-4e23-a7f8-e2e1198cfc27-config\") pod \"c9ae404e-88a5-4e23-a7f8-e2e1198cfc27\" (UID: \"c9ae404e-88a5-4e23-a7f8-e2e1198cfc27\") " Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.748672 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c9ae404e-88a5-4e23-a7f8-e2e1198cfc27-dns-svc\") pod \"c9ae404e-88a5-4e23-a7f8-e2e1198cfc27\" (UID: \"c9ae404e-88a5-4e23-a7f8-e2e1198cfc27\") " Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.748692 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c9ae404e-88a5-4e23-a7f8-e2e1198cfc27-dns-swift-storage-0\") pod \"c9ae404e-88a5-4e23-a7f8-e2e1198cfc27\" (UID: \"c9ae404e-88a5-4e23-a7f8-e2e1198cfc27\") " Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.748770 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c9ae404e-88a5-4e23-a7f8-e2e1198cfc27-ovsdbserver-nb\") pod \"c9ae404e-88a5-4e23-a7f8-e2e1198cfc27\" (UID: \"c9ae404e-88a5-4e23-a7f8-e2e1198cfc27\") " Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.748841 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c9ae404e-88a5-4e23-a7f8-e2e1198cfc27-ovsdbserver-sb\") pod \"c9ae404e-88a5-4e23-a7f8-e2e1198cfc27\" (UID: \"c9ae404e-88a5-4e23-a7f8-e2e1198cfc27\") " Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.748867 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgbpz\" (UniqueName: \"kubernetes.io/projected/c9ae404e-88a5-4e23-a7f8-e2e1198cfc27-kube-api-access-kgbpz\") pod \"c9ae404e-88a5-4e23-a7f8-e2e1198cfc27\" (UID: \"c9ae404e-88a5-4e23-a7f8-e2e1198cfc27\") " Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.761806 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9ae404e-88a5-4e23-a7f8-e2e1198cfc27-kube-api-access-kgbpz" (OuterVolumeSpecName: "kube-api-access-kgbpz") pod "c9ae404e-88a5-4e23-a7f8-e2e1198cfc27" (UID: "c9ae404e-88a5-4e23-a7f8-e2e1198cfc27"). InnerVolumeSpecName "kube-api-access-kgbpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.764783 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.357025264 podStartE2EDuration="2.764758884s" podCreationTimestamp="2025-12-02 23:19:19 +0000 UTC" firstStartedPulling="2025-12-02 23:19:20.614812923 +0000 UTC m=+1299.323367206" lastFinishedPulling="2025-12-02 23:19:21.022546543 +0000 UTC m=+1299.731100826" observedRunningTime="2025-12-02 23:19:21.759560453 +0000 UTC m=+1300.468114746" watchObservedRunningTime="2025-12-02 23:19:21.764758884 +0000 UTC m=+1300.473313167" Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.777323 4903 scope.go:117] "RemoveContainer" containerID="7f69c54f786868ef97487714669bb35609cd8c4bc557c408c66bff746462c71f" Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.792261 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.815113 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9ae404e-88a5-4e23-a7f8-e2e1198cfc27-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c9ae404e-88a5-4e23-a7f8-e2e1198cfc27" (UID: "c9ae404e-88a5-4e23-a7f8-e2e1198cfc27"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.821617 4903 scope.go:117] "RemoveContainer" containerID="340273657b53c6b856174a93cb9e8df1e43d62f5d0e0ffc49bf4345b3a134ccc" Dec 02 23:19:21 crc kubenswrapper[4903]: E1202 23:19:21.822500 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"340273657b53c6b856174a93cb9e8df1e43d62f5d0e0ffc49bf4345b3a134ccc\": container with ID starting with 340273657b53c6b856174a93cb9e8df1e43d62f5d0e0ffc49bf4345b3a134ccc not found: ID does not exist" containerID="340273657b53c6b856174a93cb9e8df1e43d62f5d0e0ffc49bf4345b3a134ccc" Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.822546 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"340273657b53c6b856174a93cb9e8df1e43d62f5d0e0ffc49bf4345b3a134ccc"} err="failed to get container status \"340273657b53c6b856174a93cb9e8df1e43d62f5d0e0ffc49bf4345b3a134ccc\": rpc error: code = NotFound desc = could not find container \"340273657b53c6b856174a93cb9e8df1e43d62f5d0e0ffc49bf4345b3a134ccc\": container with ID starting with 340273657b53c6b856174a93cb9e8df1e43d62f5d0e0ffc49bf4345b3a134ccc not found: ID does not exist" Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.822571 4903 scope.go:117] "RemoveContainer" containerID="7f69c54f786868ef97487714669bb35609cd8c4bc557c408c66bff746462c71f" Dec 02 23:19:21 crc kubenswrapper[4903]: E1202 23:19:21.823399 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f69c54f786868ef97487714669bb35609cd8c4bc557c408c66bff746462c71f\": container with ID starting with 7f69c54f786868ef97487714669bb35609cd8c4bc557c408c66bff746462c71f not found: ID does not exist" containerID="7f69c54f786868ef97487714669bb35609cd8c4bc557c408c66bff746462c71f" Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.823443 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f69c54f786868ef97487714669bb35609cd8c4bc557c408c66bff746462c71f"} err="failed to get container status \"7f69c54f786868ef97487714669bb35609cd8c4bc557c408c66bff746462c71f\": rpc error: code = NotFound desc = could not find container \"7f69c54f786868ef97487714669bb35609cd8c4bc557c408c66bff746462c71f\": container with ID starting with 7f69c54f786868ef97487714669bb35609cd8c4bc557c408c66bff746462c71f not found: ID does not exist" Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.830181 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9ae404e-88a5-4e23-a7f8-e2e1198cfc27-config" (OuterVolumeSpecName: "config") pod "c9ae404e-88a5-4e23-a7f8-e2e1198cfc27" (UID: "c9ae404e-88a5-4e23-a7f8-e2e1198cfc27"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.835404 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9ae404e-88a5-4e23-a7f8-e2e1198cfc27-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c9ae404e-88a5-4e23-a7f8-e2e1198cfc27" (UID: "c9ae404e-88a5-4e23-a7f8-e2e1198cfc27"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.847071 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9ae404e-88a5-4e23-a7f8-e2e1198cfc27-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c9ae404e-88a5-4e23-a7f8-e2e1198cfc27" (UID: "c9ae404e-88a5-4e23-a7f8-e2e1198cfc27"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.850851 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9ae404e-88a5-4e23-a7f8-e2e1198cfc27-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.850883 4903 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c9ae404e-88a5-4e23-a7f8-e2e1198cfc27-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.850895 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c9ae404e-88a5-4e23-a7f8-e2e1198cfc27-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.850907 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c9ae404e-88a5-4e23-a7f8-e2e1198cfc27-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.850919 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgbpz\" (UniqueName: \"kubernetes.io/projected/c9ae404e-88a5-4e23-a7f8-e2e1198cfc27-kube-api-access-kgbpz\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.866205 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9ae404e-88a5-4e23-a7f8-e2e1198cfc27-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c9ae404e-88a5-4e23-a7f8-e2e1198cfc27" (UID: "c9ae404e-88a5-4e23-a7f8-e2e1198cfc27"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.877882 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="43ee4fc2-7757-434d-a22d-8d8e8383e5f2" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.878170 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="43ee4fc2-7757-434d-a22d-8d8e8383e5f2" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 23:19:21 crc kubenswrapper[4903]: I1202 23:19:21.952365 4903 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c9ae404e-88a5-4e23-a7f8-e2e1198cfc27-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:22 crc kubenswrapper[4903]: I1202 23:19:22.068408 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6ffc74bd45-2xt7m"] Dec 02 23:19:22 crc kubenswrapper[4903]: I1202 23:19:22.079985 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6ffc74bd45-2xt7m"] Dec 02 23:19:22 crc kubenswrapper[4903]: I1202 23:19:22.753801 4903 generic.go:334] "Generic (PLEG): container finished" podID="b7768c2b-8cda-4ff7-b845-d2762445cb9e" containerID="af32ab0adc21ad36d3f3f69152af8e2bd8d59cb9db0dc0caff8d04b157f830b4" exitCode=0 Dec 02 23:19:22 crc kubenswrapper[4903]: I1202 23:19:22.753862 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-t5jc4" event={"ID":"b7768c2b-8cda-4ff7-b845-d2762445cb9e","Type":"ContainerDied","Data":"af32ab0adc21ad36d3f3f69152af8e2bd8d59cb9db0dc0caff8d04b157f830b4"} Dec 02 23:19:23 crc kubenswrapper[4903]: I1202 23:19:23.071228 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:19:23 crc kubenswrapper[4903]: I1202 23:19:23.071782 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:19:23 crc kubenswrapper[4903]: I1202 23:19:23.160069 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lltzt" Dec 02 23:19:23 crc kubenswrapper[4903]: I1202 23:19:23.279202 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdb5t\" (UniqueName: \"kubernetes.io/projected/2607960d-5ee6-4c49-9c3c-3b8083b4bb9e-kube-api-access-xdb5t\") pod \"2607960d-5ee6-4c49-9c3c-3b8083b4bb9e\" (UID: \"2607960d-5ee6-4c49-9c3c-3b8083b4bb9e\") " Dec 02 23:19:23 crc kubenswrapper[4903]: I1202 23:19:23.279601 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2607960d-5ee6-4c49-9c3c-3b8083b4bb9e-scripts\") pod \"2607960d-5ee6-4c49-9c3c-3b8083b4bb9e\" (UID: \"2607960d-5ee6-4c49-9c3c-3b8083b4bb9e\") " Dec 02 23:19:23 crc kubenswrapper[4903]: I1202 23:19:23.279626 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2607960d-5ee6-4c49-9c3c-3b8083b4bb9e-config-data\") pod \"2607960d-5ee6-4c49-9c3c-3b8083b4bb9e\" (UID: \"2607960d-5ee6-4c49-9c3c-3b8083b4bb9e\") " Dec 02 23:19:23 crc kubenswrapper[4903]: I1202 23:19:23.279902 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2607960d-5ee6-4c49-9c3c-3b8083b4bb9e-combined-ca-bundle\") pod \"2607960d-5ee6-4c49-9c3c-3b8083b4bb9e\" (UID: \"2607960d-5ee6-4c49-9c3c-3b8083b4bb9e\") " Dec 02 23:19:23 crc kubenswrapper[4903]: I1202 23:19:23.285088 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2607960d-5ee6-4c49-9c3c-3b8083b4bb9e-scripts" (OuterVolumeSpecName: "scripts") pod "2607960d-5ee6-4c49-9c3c-3b8083b4bb9e" (UID: "2607960d-5ee6-4c49-9c3c-3b8083b4bb9e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:19:23 crc kubenswrapper[4903]: I1202 23:19:23.286821 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2607960d-5ee6-4c49-9c3c-3b8083b4bb9e-kube-api-access-xdb5t" (OuterVolumeSpecName: "kube-api-access-xdb5t") pod "2607960d-5ee6-4c49-9c3c-3b8083b4bb9e" (UID: "2607960d-5ee6-4c49-9c3c-3b8083b4bb9e"). InnerVolumeSpecName "kube-api-access-xdb5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:19:23 crc kubenswrapper[4903]: I1202 23:19:23.309153 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2607960d-5ee6-4c49-9c3c-3b8083b4bb9e-config-data" (OuterVolumeSpecName: "config-data") pod "2607960d-5ee6-4c49-9c3c-3b8083b4bb9e" (UID: "2607960d-5ee6-4c49-9c3c-3b8083b4bb9e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:19:23 crc kubenswrapper[4903]: I1202 23:19:23.320798 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2607960d-5ee6-4c49-9c3c-3b8083b4bb9e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2607960d-5ee6-4c49-9c3c-3b8083b4bb9e" (UID: "2607960d-5ee6-4c49-9c3c-3b8083b4bb9e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:19:23 crc kubenswrapper[4903]: I1202 23:19:23.387061 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2607960d-5ee6-4c49-9c3c-3b8083b4bb9e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:23 crc kubenswrapper[4903]: I1202 23:19:23.387446 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdb5t\" (UniqueName: \"kubernetes.io/projected/2607960d-5ee6-4c49-9c3c-3b8083b4bb9e-kube-api-access-xdb5t\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:23 crc kubenswrapper[4903]: I1202 23:19:23.387523 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2607960d-5ee6-4c49-9c3c-3b8083b4bb9e-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:23 crc kubenswrapper[4903]: I1202 23:19:23.387534 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2607960d-5ee6-4c49-9c3c-3b8083b4bb9e-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:23 crc kubenswrapper[4903]: I1202 23:19:23.623873 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9ae404e-88a5-4e23-a7f8-e2e1198cfc27" path="/var/lib/kubelet/pods/c9ae404e-88a5-4e23-a7f8-e2e1198cfc27/volumes" Dec 02 23:19:23 crc kubenswrapper[4903]: I1202 23:19:23.773956 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lltzt" Dec 02 23:19:23 crc kubenswrapper[4903]: I1202 23:19:23.774818 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lltzt" event={"ID":"2607960d-5ee6-4c49-9c3c-3b8083b4bb9e","Type":"ContainerDied","Data":"54bc161326da58f4266e75297bffcbc79172c3bfabab1862939b350f69b44d8b"} Dec 02 23:19:23 crc kubenswrapper[4903]: I1202 23:19:23.774869 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54bc161326da58f4266e75297bffcbc79172c3bfabab1862939b350f69b44d8b" Dec 02 23:19:23 crc kubenswrapper[4903]: I1202 23:19:23.974932 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 23:19:23 crc kubenswrapper[4903]: I1202 23:19:23.975298 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="43ee4fc2-7757-434d-a22d-8d8e8383e5f2" containerName="nova-api-log" containerID="cri-o://b32a589becebcb013dd2c9424e04c3a33d89f1a21bdc9034edc66e8b58da2296" gracePeriod=30 Dec 02 23:19:23 crc kubenswrapper[4903]: I1202 23:19:23.975881 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="43ee4fc2-7757-434d-a22d-8d8e8383e5f2" containerName="nova-api-api" containerID="cri-o://cef480c0b6cceb3880941ee6f799164540ba753607baf7729195d0701edb72b9" gracePeriod=30 Dec 02 23:19:24 crc kubenswrapper[4903]: I1202 23:19:24.010312 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 23:19:24 crc kubenswrapper[4903]: I1202 23:19:24.010493 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="651e93ad-d51d-4bb1-91ef-0784eb952c71" containerName="nova-scheduler-scheduler" containerID="cri-o://1891386b971c2768f3fc1d46ca2d9121897fb2e9195767ba08771730db965725" gracePeriod=30 Dec 02 23:19:24 crc kubenswrapper[4903]: I1202 23:19:24.319913 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-t5jc4" Dec 02 23:19:24 crc kubenswrapper[4903]: I1202 23:19:24.414164 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tz7f8\" (UniqueName: \"kubernetes.io/projected/b7768c2b-8cda-4ff7-b845-d2762445cb9e-kube-api-access-tz7f8\") pod \"b7768c2b-8cda-4ff7-b845-d2762445cb9e\" (UID: \"b7768c2b-8cda-4ff7-b845-d2762445cb9e\") " Dec 02 23:19:24 crc kubenswrapper[4903]: I1202 23:19:24.414343 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7768c2b-8cda-4ff7-b845-d2762445cb9e-combined-ca-bundle\") pod \"b7768c2b-8cda-4ff7-b845-d2762445cb9e\" (UID: \"b7768c2b-8cda-4ff7-b845-d2762445cb9e\") " Dec 02 23:19:24 crc kubenswrapper[4903]: I1202 23:19:24.414421 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7768c2b-8cda-4ff7-b845-d2762445cb9e-scripts\") pod \"b7768c2b-8cda-4ff7-b845-d2762445cb9e\" (UID: \"b7768c2b-8cda-4ff7-b845-d2762445cb9e\") " Dec 02 23:19:24 crc kubenswrapper[4903]: I1202 23:19:24.414527 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7768c2b-8cda-4ff7-b845-d2762445cb9e-config-data\") pod \"b7768c2b-8cda-4ff7-b845-d2762445cb9e\" (UID: \"b7768c2b-8cda-4ff7-b845-d2762445cb9e\") " Dec 02 23:19:24 crc kubenswrapper[4903]: I1202 23:19:24.421765 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7768c2b-8cda-4ff7-b845-d2762445cb9e-scripts" (OuterVolumeSpecName: "scripts") pod "b7768c2b-8cda-4ff7-b845-d2762445cb9e" (UID: "b7768c2b-8cda-4ff7-b845-d2762445cb9e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:19:24 crc kubenswrapper[4903]: I1202 23:19:24.435842 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7768c2b-8cda-4ff7-b845-d2762445cb9e-kube-api-access-tz7f8" (OuterVolumeSpecName: "kube-api-access-tz7f8") pod "b7768c2b-8cda-4ff7-b845-d2762445cb9e" (UID: "b7768c2b-8cda-4ff7-b845-d2762445cb9e"). InnerVolumeSpecName "kube-api-access-tz7f8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:19:24 crc kubenswrapper[4903]: I1202 23:19:24.443429 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7768c2b-8cda-4ff7-b845-d2762445cb9e-config-data" (OuterVolumeSpecName: "config-data") pod "b7768c2b-8cda-4ff7-b845-d2762445cb9e" (UID: "b7768c2b-8cda-4ff7-b845-d2762445cb9e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:19:24 crc kubenswrapper[4903]: I1202 23:19:24.517150 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7768c2b-8cda-4ff7-b845-d2762445cb9e-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:24 crc kubenswrapper[4903]: I1202 23:19:24.517182 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7768c2b-8cda-4ff7-b845-d2762445cb9e-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:24 crc kubenswrapper[4903]: I1202 23:19:24.517191 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tz7f8\" (UniqueName: \"kubernetes.io/projected/b7768c2b-8cda-4ff7-b845-d2762445cb9e-kube-api-access-tz7f8\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:24 crc kubenswrapper[4903]: I1202 23:19:24.518880 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7768c2b-8cda-4ff7-b845-d2762445cb9e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7768c2b-8cda-4ff7-b845-d2762445cb9e" (UID: "b7768c2b-8cda-4ff7-b845-d2762445cb9e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:19:24 crc kubenswrapper[4903]: I1202 23:19:24.618710 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7768c2b-8cda-4ff7-b845-d2762445cb9e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:24 crc kubenswrapper[4903]: I1202 23:19:24.788364 4903 generic.go:334] "Generic (PLEG): container finished" podID="43ee4fc2-7757-434d-a22d-8d8e8383e5f2" containerID="b32a589becebcb013dd2c9424e04c3a33d89f1a21bdc9034edc66e8b58da2296" exitCode=143 Dec 02 23:19:24 crc kubenswrapper[4903]: I1202 23:19:24.788425 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"43ee4fc2-7757-434d-a22d-8d8e8383e5f2","Type":"ContainerDied","Data":"b32a589becebcb013dd2c9424e04c3a33d89f1a21bdc9034edc66e8b58da2296"} Dec 02 23:19:24 crc kubenswrapper[4903]: I1202 23:19:24.789813 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-t5jc4" event={"ID":"b7768c2b-8cda-4ff7-b845-d2762445cb9e","Type":"ContainerDied","Data":"b425fb41f33d997e97a698ddd938c3c9309f6689430dec633d694cf6224326a3"} Dec 02 23:19:24 crc kubenswrapper[4903]: I1202 23:19:24.789836 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b425fb41f33d997e97a698ddd938c3c9309f6689430dec633d694cf6224326a3" Dec 02 23:19:24 crc kubenswrapper[4903]: I1202 23:19:24.789885 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-t5jc4" Dec 02 23:19:24 crc kubenswrapper[4903]: I1202 23:19:24.871379 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 02 23:19:24 crc kubenswrapper[4903]: E1202 23:19:24.872045 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ae404e-88a5-4e23-a7f8-e2e1198cfc27" containerName="dnsmasq-dns" Dec 02 23:19:24 crc kubenswrapper[4903]: I1202 23:19:24.872063 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ae404e-88a5-4e23-a7f8-e2e1198cfc27" containerName="dnsmasq-dns" Dec 02 23:19:24 crc kubenswrapper[4903]: E1202 23:19:24.872080 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2607960d-5ee6-4c49-9c3c-3b8083b4bb9e" containerName="nova-manage" Dec 02 23:19:24 crc kubenswrapper[4903]: I1202 23:19:24.872086 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="2607960d-5ee6-4c49-9c3c-3b8083b4bb9e" containerName="nova-manage" Dec 02 23:19:24 crc kubenswrapper[4903]: E1202 23:19:24.872099 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ae404e-88a5-4e23-a7f8-e2e1198cfc27" containerName="init" Dec 02 23:19:24 crc kubenswrapper[4903]: I1202 23:19:24.872105 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ae404e-88a5-4e23-a7f8-e2e1198cfc27" containerName="init" Dec 02 23:19:24 crc kubenswrapper[4903]: E1202 23:19:24.872124 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7768c2b-8cda-4ff7-b845-d2762445cb9e" containerName="nova-cell1-conductor-db-sync" Dec 02 23:19:24 crc kubenswrapper[4903]: I1202 23:19:24.872130 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7768c2b-8cda-4ff7-b845-d2762445cb9e" containerName="nova-cell1-conductor-db-sync" Dec 02 23:19:24 crc kubenswrapper[4903]: I1202 23:19:24.872289 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9ae404e-88a5-4e23-a7f8-e2e1198cfc27" containerName="dnsmasq-dns" Dec 02 23:19:24 crc kubenswrapper[4903]: I1202 23:19:24.872313 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7768c2b-8cda-4ff7-b845-d2762445cb9e" containerName="nova-cell1-conductor-db-sync" Dec 02 23:19:24 crc kubenswrapper[4903]: I1202 23:19:24.872325 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="2607960d-5ee6-4c49-9c3c-3b8083b4bb9e" containerName="nova-manage" Dec 02 23:19:24 crc kubenswrapper[4903]: I1202 23:19:24.872953 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 02 23:19:24 crc kubenswrapper[4903]: I1202 23:19:24.889262 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 02 23:19:24 crc kubenswrapper[4903]: I1202 23:19:24.898048 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 02 23:19:24 crc kubenswrapper[4903]: I1202 23:19:24.924089 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66f413c7-2056-4a28-bf9f-9606dcaa5f78-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"66f413c7-2056-4a28-bf9f-9606dcaa5f78\") " pod="openstack/nova-cell1-conductor-0" Dec 02 23:19:24 crc kubenswrapper[4903]: I1202 23:19:24.924309 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66f413c7-2056-4a28-bf9f-9606dcaa5f78-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"66f413c7-2056-4a28-bf9f-9606dcaa5f78\") " pod="openstack/nova-cell1-conductor-0" Dec 02 23:19:24 crc kubenswrapper[4903]: I1202 23:19:24.924377 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25cfv\" (UniqueName: \"kubernetes.io/projected/66f413c7-2056-4a28-bf9f-9606dcaa5f78-kube-api-access-25cfv\") pod \"nova-cell1-conductor-0\" (UID: \"66f413c7-2056-4a28-bf9f-9606dcaa5f78\") " pod="openstack/nova-cell1-conductor-0" Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.026854 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66f413c7-2056-4a28-bf9f-9606dcaa5f78-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"66f413c7-2056-4a28-bf9f-9606dcaa5f78\") " pod="openstack/nova-cell1-conductor-0" Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.027004 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25cfv\" (UniqueName: \"kubernetes.io/projected/66f413c7-2056-4a28-bf9f-9606dcaa5f78-kube-api-access-25cfv\") pod \"nova-cell1-conductor-0\" (UID: \"66f413c7-2056-4a28-bf9f-9606dcaa5f78\") " pod="openstack/nova-cell1-conductor-0" Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.027107 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66f413c7-2056-4a28-bf9f-9606dcaa5f78-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"66f413c7-2056-4a28-bf9f-9606dcaa5f78\") " pod="openstack/nova-cell1-conductor-0" Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.033021 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66f413c7-2056-4a28-bf9f-9606dcaa5f78-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"66f413c7-2056-4a28-bf9f-9606dcaa5f78\") " pod="openstack/nova-cell1-conductor-0" Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.035287 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66f413c7-2056-4a28-bf9f-9606dcaa5f78-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"66f413c7-2056-4a28-bf9f-9606dcaa5f78\") " pod="openstack/nova-cell1-conductor-0" Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.048468 4903 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25cfv\" (UniqueName: \"kubernetes.io/projected/66f413c7-2056-4a28-bf9f-9606dcaa5f78-kube-api-access-25cfv\") pod \"nova-cell1-conductor-0\" (UID: \"66f413c7-2056-4a28-bf9f-9606dcaa5f78\") " pod="openstack/nova-cell1-conductor-0" Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.201855 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.714417 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.723812 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.808877 4903 generic.go:334] "Generic (PLEG): container finished" podID="e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8" containerID="56181b13e1274405076b767f3a034e823f204489491952ee0e1036724e22993a" exitCode=0 Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.808930 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8","Type":"ContainerDied","Data":"56181b13e1274405076b767f3a034e823f204489491952ee0e1036724e22993a"} Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.809252 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8","Type":"ContainerDied","Data":"62f5f02487b323e2ca66853dc439d1a3e7d2982857431cb0157a1bacd8f25a93"} Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.809275 4903 scope.go:117] "RemoveContainer" containerID="590d8782376ce97e339d90520cb424377ea5bd57dc7dfc1cd8f7735a3223e6e2" Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.808988 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.818268 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"66f413c7-2056-4a28-bf9f-9606dcaa5f78","Type":"ContainerStarted","Data":"e880a94670216ab570550ae185924dafb328729987b390640660b02046d18c85"} Dec 02 23:19:25 crc kubenswrapper[4903]: E1202 23:19:25.819728 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1891386b971c2768f3fc1d46ca2d9121897fb2e9195767ba08771730db965725" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 23:19:25 crc kubenswrapper[4903]: E1202 23:19:25.821084 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1891386b971c2768f3fc1d46ca2d9121897fb2e9195767ba08771730db965725" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 23:19:25 crc kubenswrapper[4903]: E1202 23:19:25.822178 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1891386b971c2768f3fc1d46ca2d9121897fb2e9195767ba08771730db965725" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 23:19:25 crc kubenswrapper[4903]: E1202 23:19:25.822246 4903 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="651e93ad-d51d-4bb1-91ef-0784eb952c71" containerName="nova-scheduler-scheduler" Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.839919 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdhsn\" (UniqueName: \"kubernetes.io/projected/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-kube-api-access-kdhsn\") pod \"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8\" (UID: \"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8\") " Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.839966 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-scripts\") pod \"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8\" (UID: \"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8\") " Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.839988 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-run-httpd\") pod \"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8\" (UID: \"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8\") " Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.840086 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-sg-core-conf-yaml\") pod \"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8\" (UID: \"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8\") " Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.840156 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-combined-ca-bundle\") pod \"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8\" (UID: \"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8\") " Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.840216 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-log-httpd\") pod \"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8\" (UID: \"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8\") " Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.840254 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-config-data\") pod \"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8\" (UID: \"e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8\") " Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.841407 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8" (UID: "e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.841793 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8" (UID: "e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.847355 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-kube-api-access-kdhsn" (OuterVolumeSpecName: "kube-api-access-kdhsn") pod "e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8" (UID: "e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8"). InnerVolumeSpecName "kube-api-access-kdhsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.847351 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-scripts" (OuterVolumeSpecName: "scripts") pod "e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8" (UID: "e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.856853 4903 scope.go:117] "RemoveContainer" containerID="120a65b9e41382fbcc4db8172753c708144ddc21d430539683ee2037efc509f0" Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.884483 4903 scope.go:117] "RemoveContainer" containerID="56181b13e1274405076b767f3a034e823f204489491952ee0e1036724e22993a" Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.884730 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8" (UID: "e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.907923 4903 scope.go:117] "RemoveContainer" containerID="86f46a583e97f2c638b5770d8ecee1b15f25956b4b4d4f45a9cc3b4494f1f0fb" Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.938524 4903 scope.go:117] "RemoveContainer" containerID="590d8782376ce97e339d90520cb424377ea5bd57dc7dfc1cd8f7735a3223e6e2" Dec 02 23:19:25 crc kubenswrapper[4903]: E1202 23:19:25.939306 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"590d8782376ce97e339d90520cb424377ea5bd57dc7dfc1cd8f7735a3223e6e2\": container with ID starting with 590d8782376ce97e339d90520cb424377ea5bd57dc7dfc1cd8f7735a3223e6e2 not found: ID does not exist" containerID="590d8782376ce97e339d90520cb424377ea5bd57dc7dfc1cd8f7735a3223e6e2" Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.939355 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"590d8782376ce97e339d90520cb424377ea5bd57dc7dfc1cd8f7735a3223e6e2"} err="failed to get container status \"590d8782376ce97e339d90520cb424377ea5bd57dc7dfc1cd8f7735a3223e6e2\": rpc error: code = NotFound desc = could not find container \"590d8782376ce97e339d90520cb424377ea5bd57dc7dfc1cd8f7735a3223e6e2\": container with ID starting with 590d8782376ce97e339d90520cb424377ea5bd57dc7dfc1cd8f7735a3223e6e2 not found: ID does not exist" Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.939382 4903 scope.go:117] "RemoveContainer" containerID="120a65b9e41382fbcc4db8172753c708144ddc21d430539683ee2037efc509f0" Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.939601 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8" (UID: "e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:19:25 crc kubenswrapper[4903]: E1202 23:19:25.939979 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"120a65b9e41382fbcc4db8172753c708144ddc21d430539683ee2037efc509f0\": container with ID starting with 120a65b9e41382fbcc4db8172753c708144ddc21d430539683ee2037efc509f0 not found: ID does not exist" containerID="120a65b9e41382fbcc4db8172753c708144ddc21d430539683ee2037efc509f0" Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.940060 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"120a65b9e41382fbcc4db8172753c708144ddc21d430539683ee2037efc509f0"} err="failed to get container status \"120a65b9e41382fbcc4db8172753c708144ddc21d430539683ee2037efc509f0\": rpc error: code = NotFound desc = could not find container \"120a65b9e41382fbcc4db8172753c708144ddc21d430539683ee2037efc509f0\": container with ID starting with 120a65b9e41382fbcc4db8172753c708144ddc21d430539683ee2037efc509f0 not found: ID does not exist" Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.940100 4903 scope.go:117] "RemoveContainer" containerID="56181b13e1274405076b767f3a034e823f204489491952ee0e1036724e22993a" Dec 02 23:19:25 crc kubenswrapper[4903]: E1202 23:19:25.940552 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56181b13e1274405076b767f3a034e823f204489491952ee0e1036724e22993a\": container with ID starting with 56181b13e1274405076b767f3a034e823f204489491952ee0e1036724e22993a not found: ID does not exist" containerID="56181b13e1274405076b767f3a034e823f204489491952ee0e1036724e22993a" Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.940754 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56181b13e1274405076b767f3a034e823f204489491952ee0e1036724e22993a"} err="failed to get container status \"56181b13e1274405076b767f3a034e823f204489491952ee0e1036724e22993a\": rpc error: code = NotFound desc = could not find container \"56181b13e1274405076b767f3a034e823f204489491952ee0e1036724e22993a\": container with ID starting with 56181b13e1274405076b767f3a034e823f204489491952ee0e1036724e22993a not found: ID does not exist" Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.940879 4903 scope.go:117] "RemoveContainer" containerID="86f46a583e97f2c638b5770d8ecee1b15f25956b4b4d4f45a9cc3b4494f1f0fb" Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.942459 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdhsn\" (UniqueName: \"kubernetes.io/projected/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-kube-api-access-kdhsn\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.942671 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.942774 4903 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.942863 4903 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.942943 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.943026 4903 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:25 crc kubenswrapper[4903]: E1202 23:19:25.947040 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86f46a583e97f2c638b5770d8ecee1b15f25956b4b4d4f45a9cc3b4494f1f0fb\": container with ID starting with 86f46a583e97f2c638b5770d8ecee1b15f25956b4b4d4f45a9cc3b4494f1f0fb not found: ID does not exist" containerID="86f46a583e97f2c638b5770d8ecee1b15f25956b4b4d4f45a9cc3b4494f1f0fb" Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.947109 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86f46a583e97f2c638b5770d8ecee1b15f25956b4b4d4f45a9cc3b4494f1f0fb"} err="failed to get container status \"86f46a583e97f2c638b5770d8ecee1b15f25956b4b4d4f45a9cc3b4494f1f0fb\": rpc error: code = NotFound desc = could not find container \"86f46a583e97f2c638b5770d8ecee1b15f25956b4b4d4f45a9cc3b4494f1f0fb\": container with ID starting with 86f46a583e97f2c638b5770d8ecee1b15f25956b4b4d4f45a9cc3b4494f1f0fb not found: ID does not exist" Dec 02 23:19:25 crc kubenswrapper[4903]: I1202 23:19:25.976797 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-config-data" (OuterVolumeSpecName: "config-data") pod "e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8" (UID: "e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.044412 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.308480 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.316847 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.350103 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:19:26 crc kubenswrapper[4903]: E1202 23:19:26.350972 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8" containerName="ceilometer-central-agent" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.350998 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8" containerName="ceilometer-central-agent" Dec 02 23:19:26 crc kubenswrapper[4903]: E1202 23:19:26.351045 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8" containerName="ceilometer-notification-agent" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.351056 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8" containerName="ceilometer-notification-agent" Dec 02 23:19:26 crc kubenswrapper[4903]: E1202 23:19:26.351177 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8" containerName="proxy-httpd" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.351219 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8" containerName="proxy-httpd" Dec 02 23:19:26 crc kubenswrapper[4903]: E1202 23:19:26.351247 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8" containerName="sg-core" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.351256 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8" containerName="sg-core" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.351492 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8" containerName="sg-core" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.351511 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8" containerName="proxy-httpd" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.351529 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8" containerName="ceilometer-notification-agent" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.351556 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8" containerName="ceilometer-central-agent" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.353910 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.356753 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.356754 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.357199 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.367452 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.452526 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c927ae4-891c-442d-8aab-d04ae025dc57-log-httpd\") pod \"ceilometer-0\" (UID: \"9c927ae4-891c-442d-8aab-d04ae025dc57\") " pod="openstack/ceilometer-0" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.452701 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c927ae4-891c-442d-8aab-d04ae025dc57-run-httpd\") pod \"ceilometer-0\" (UID: \"9c927ae4-891c-442d-8aab-d04ae025dc57\") " pod="openstack/ceilometer-0" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.452827 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c927ae4-891c-442d-8aab-d04ae025dc57-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9c927ae4-891c-442d-8aab-d04ae025dc57\") " pod="openstack/ceilometer-0" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.452857 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9c927ae4-891c-442d-8aab-d04ae025dc57-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9c927ae4-891c-442d-8aab-d04ae025dc57\") " pod="openstack/ceilometer-0" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.452925 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67j7j\" (UniqueName: \"kubernetes.io/projected/9c927ae4-891c-442d-8aab-d04ae025dc57-kube-api-access-67j7j\") pod \"ceilometer-0\" (UID: \"9c927ae4-891c-442d-8aab-d04ae025dc57\") " pod="openstack/ceilometer-0" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.452984 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c927ae4-891c-442d-8aab-d04ae025dc57-config-data\") pod \"ceilometer-0\" (UID: \"9c927ae4-891c-442d-8aab-d04ae025dc57\") " pod="openstack/ceilometer-0" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.453017 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c927ae4-891c-442d-8aab-d04ae025dc57-scripts\") pod \"ceilometer-0\" (UID: \"9c927ae4-891c-442d-8aab-d04ae025dc57\") " pod="openstack/ceilometer-0" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.453122 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9c927ae4-891c-442d-8aab-d04ae025dc57-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9c927ae4-891c-442d-8aab-d04ae025dc57\") " pod="openstack/ceilometer-0" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.556762 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67j7j\" (UniqueName: \"kubernetes.io/projected/9c927ae4-891c-442d-8aab-d04ae025dc57-kube-api-access-67j7j\") pod \"ceilometer-0\" (UID: \"9c927ae4-891c-442d-8aab-d04ae025dc57\") " pod="openstack/ceilometer-0" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.556830 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c927ae4-891c-442d-8aab-d04ae025dc57-config-data\") pod \"ceilometer-0\" (UID: \"9c927ae4-891c-442d-8aab-d04ae025dc57\") " pod="openstack/ceilometer-0" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.556856 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c927ae4-891c-442d-8aab-d04ae025dc57-scripts\") pod \"ceilometer-0\" (UID: \"9c927ae4-891c-442d-8aab-d04ae025dc57\") " pod="openstack/ceilometer-0" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.556894 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c927ae4-891c-442d-8aab-d04ae025dc57-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9c927ae4-891c-442d-8aab-d04ae025dc57\") " pod="openstack/ceilometer-0" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.556947 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c927ae4-891c-442d-8aab-d04ae025dc57-log-httpd\") pod \"ceilometer-0\" (UID: \"9c927ae4-891c-442d-8aab-d04ae025dc57\") " pod="openstack/ceilometer-0" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.556990 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c927ae4-891c-442d-8aab-d04ae025dc57-run-httpd\") pod \"ceilometer-0\" (UID: \"9c927ae4-891c-442d-8aab-d04ae025dc57\") " pod="openstack/ceilometer-0" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.557029 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c927ae4-891c-442d-8aab-d04ae025dc57-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9c927ae4-891c-442d-8aab-d04ae025dc57\") " pod="openstack/ceilometer-0" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.557045 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9c927ae4-891c-442d-8aab-d04ae025dc57-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9c927ae4-891c-442d-8aab-d04ae025dc57\") " pod="openstack/ceilometer-0" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.560126 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c927ae4-891c-442d-8aab-d04ae025dc57-run-httpd\") pod \"ceilometer-0\" (UID: \"9c927ae4-891c-442d-8aab-d04ae025dc57\") " pod="openstack/ceilometer-0" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.560460 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/9c927ae4-891c-442d-8aab-d04ae025dc57-log-httpd\") pod \"ceilometer-0\" (UID: \"9c927ae4-891c-442d-8aab-d04ae025dc57\") " pod="openstack/ceilometer-0" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.566792 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c927ae4-891c-442d-8aab-d04ae025dc57-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9c927ae4-891c-442d-8aab-d04ae025dc57\") " pod="openstack/ceilometer-0" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.567306 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c927ae4-891c-442d-8aab-d04ae025dc57-config-data\") pod \"ceilometer-0\" (UID: \"9c927ae4-891c-442d-8aab-d04ae025dc57\") " pod="openstack/ceilometer-0" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.567462 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c927ae4-891c-442d-8aab-d04ae025dc57-scripts\") pod \"ceilometer-0\" (UID: \"9c927ae4-891c-442d-8aab-d04ae025dc57\") " pod="openstack/ceilometer-0" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.569188 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c927ae4-891c-442d-8aab-d04ae025dc57-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9c927ae4-891c-442d-8aab-d04ae025dc57\") " pod="openstack/ceilometer-0" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.587293 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9c927ae4-891c-442d-8aab-d04ae025dc57-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9c927ae4-891c-442d-8aab-d04ae025dc57\") " pod="openstack/ceilometer-0" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.588713 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67j7j\" (UniqueName: \"kubernetes.io/projected/9c927ae4-891c-442d-8aab-d04ae025dc57-kube-api-access-67j7j\") pod \"ceilometer-0\" (UID: \"9c927ae4-891c-442d-8aab-d04ae025dc57\") " pod="openstack/ceilometer-0" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.688289 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.690276 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.761826 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kxlv\" (UniqueName: \"kubernetes.io/projected/43ee4fc2-7757-434d-a22d-8d8e8383e5f2-kube-api-access-4kxlv\") pod \"43ee4fc2-7757-434d-a22d-8d8e8383e5f2\" (UID: \"43ee4fc2-7757-434d-a22d-8d8e8383e5f2\") " Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.761969 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43ee4fc2-7757-434d-a22d-8d8e8383e5f2-combined-ca-bundle\") pod \"43ee4fc2-7757-434d-a22d-8d8e8383e5f2\" (UID: \"43ee4fc2-7757-434d-a22d-8d8e8383e5f2\") " Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.762071 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43ee4fc2-7757-434d-a22d-8d8e8383e5f2-config-data\") pod \"43ee4fc2-7757-434d-a22d-8d8e8383e5f2\" (UID: \"43ee4fc2-7757-434d-a22d-8d8e8383e5f2\") " Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.762238 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43ee4fc2-7757-434d-a22d-8d8e8383e5f2-logs\") pod \"43ee4fc2-7757-434d-a22d-8d8e8383e5f2\" (UID: \"43ee4fc2-7757-434d-a22d-8d8e8383e5f2\") " Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.763082 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43ee4fc2-7757-434d-a22d-8d8e8383e5f2-logs" (OuterVolumeSpecName: "logs") pod "43ee4fc2-7757-434d-a22d-8d8e8383e5f2" (UID: "43ee4fc2-7757-434d-a22d-8d8e8383e5f2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.766761 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43ee4fc2-7757-434d-a22d-8d8e8383e5f2-kube-api-access-4kxlv" (OuterVolumeSpecName: "kube-api-access-4kxlv") pod "43ee4fc2-7757-434d-a22d-8d8e8383e5f2" (UID: "43ee4fc2-7757-434d-a22d-8d8e8383e5f2"). InnerVolumeSpecName "kube-api-access-4kxlv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.809975 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43ee4fc2-7757-434d-a22d-8d8e8383e5f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43ee4fc2-7757-434d-a22d-8d8e8383e5f2" (UID: "43ee4fc2-7757-434d-a22d-8d8e8383e5f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.810139 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43ee4fc2-7757-434d-a22d-8d8e8383e5f2-config-data" (OuterVolumeSpecName: "config-data") pod "43ee4fc2-7757-434d-a22d-8d8e8383e5f2" (UID: "43ee4fc2-7757-434d-a22d-8d8e8383e5f2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.833223 4903 generic.go:334] "Generic (PLEG): container finished" podID="43ee4fc2-7757-434d-a22d-8d8e8383e5f2" containerID="cef480c0b6cceb3880941ee6f799164540ba753607baf7729195d0701edb72b9" exitCode=0 Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.833278 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.833325 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"43ee4fc2-7757-434d-a22d-8d8e8383e5f2","Type":"ContainerDied","Data":"cef480c0b6cceb3880941ee6f799164540ba753607baf7729195d0701edb72b9"} Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.833449 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"43ee4fc2-7757-434d-a22d-8d8e8383e5f2","Type":"ContainerDied","Data":"cb19031f53652701065bc882593bda1c2750150f733211fce225ad92bd25ff4f"} Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.833480 4903 scope.go:117] "RemoveContainer" containerID="cef480c0b6cceb3880941ee6f799164540ba753607baf7729195d0701edb72b9" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.837214 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"66f413c7-2056-4a28-bf9f-9606dcaa5f78","Type":"ContainerStarted","Data":"a4258e03f2839d9047a3ac85cda3b452deadaf6d5529b170176d8237cece7272"} Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.838321 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.857824 4903 scope.go:117] "RemoveContainer" containerID="b32a589becebcb013dd2c9424e04c3a33d89f1a21bdc9034edc66e8b58da2296" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.872840 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43ee4fc2-7757-434d-a22d-8d8e8383e5f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.873091 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43ee4fc2-7757-434d-a22d-8d8e8383e5f2-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.873110 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43ee4fc2-7757-434d-a22d-8d8e8383e5f2-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.873124 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kxlv\" (UniqueName: \"kubernetes.io/projected/43ee4fc2-7757-434d-a22d-8d8e8383e5f2-kube-api-access-4kxlv\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.877805 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.877786741 podStartE2EDuration="2.877786741s" podCreationTimestamp="2025-12-02 23:19:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:19:26.871036667 +0000 UTC m=+1305.579590950" watchObservedRunningTime="2025-12-02 23:19:26.877786741 +0000 UTC m=+1305.586341024" Dec 02 23:19:26 crc 
kubenswrapper[4903]: I1202 23:19:26.908933 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.911624 4903 scope.go:117] "RemoveContainer" containerID="cef480c0b6cceb3880941ee6f799164540ba753607baf7729195d0701edb72b9" Dec 02 23:19:26 crc kubenswrapper[4903]: E1202 23:19:26.916603 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cef480c0b6cceb3880941ee6f799164540ba753607baf7729195d0701edb72b9\": container with ID starting with cef480c0b6cceb3880941ee6f799164540ba753607baf7729195d0701edb72b9 not found: ID does not exist" containerID="cef480c0b6cceb3880941ee6f799164540ba753607baf7729195d0701edb72b9" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.916671 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cef480c0b6cceb3880941ee6f799164540ba753607baf7729195d0701edb72b9"} err="failed to get container status \"cef480c0b6cceb3880941ee6f799164540ba753607baf7729195d0701edb72b9\": rpc error: code = NotFound desc = could not find container \"cef480c0b6cceb3880941ee6f799164540ba753607baf7729195d0701edb72b9\": container with ID starting with cef480c0b6cceb3880941ee6f799164540ba753607baf7729195d0701edb72b9 not found: ID does not exist" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.916700 4903 scope.go:117] "RemoveContainer" containerID="b32a589becebcb013dd2c9424e04c3a33d89f1a21bdc9034edc66e8b58da2296" Dec 02 23:19:26 crc kubenswrapper[4903]: E1202 23:19:26.917044 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b32a589becebcb013dd2c9424e04c3a33d89f1a21bdc9034edc66e8b58da2296\": container with ID starting with b32a589becebcb013dd2c9424e04c3a33d89f1a21bdc9034edc66e8b58da2296 not found: ID does not exist" containerID="b32a589becebcb013dd2c9424e04c3a33d89f1a21bdc9034edc66e8b58da2296" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.917080 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b32a589becebcb013dd2c9424e04c3a33d89f1a21bdc9034edc66e8b58da2296"} err="failed to get container status \"b32a589becebcb013dd2c9424e04c3a33d89f1a21bdc9034edc66e8b58da2296\": rpc error: code = NotFound desc = could not find container \"b32a589becebcb013dd2c9424e04c3a33d89f1a21bdc9034edc66e8b58da2296\": container with ID starting with b32a589becebcb013dd2c9424e04c3a33d89f1a21bdc9034edc66e8b58da2296 not found: ID does not exist" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.928780 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.946605 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 02 23:19:26 crc kubenswrapper[4903]: E1202 23:19:26.947030 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43ee4fc2-7757-434d-a22d-8d8e8383e5f2" containerName="nova-api-log" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.947041 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="43ee4fc2-7757-434d-a22d-8d8e8383e5f2" containerName="nova-api-log" Dec 02 23:19:26 crc kubenswrapper[4903]: E1202 23:19:26.947069 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43ee4fc2-7757-434d-a22d-8d8e8383e5f2" containerName="nova-api-api" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.947075 4903 
state_mem.go:107] "Deleted CPUSet assignment" podUID="43ee4fc2-7757-434d-a22d-8d8e8383e5f2" containerName="nova-api-api" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.947277 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="43ee4fc2-7757-434d-a22d-8d8e8383e5f2" containerName="nova-api-log" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.947302 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="43ee4fc2-7757-434d-a22d-8d8e8383e5f2" containerName="nova-api-api" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.948386 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.951011 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.976330 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.976937 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9245f806-00cf-493f-a896-9f494b3e27a5-logs\") pod \"nova-api-0\" (UID: \"9245f806-00cf-493f-a896-9f494b3e27a5\") " pod="openstack/nova-api-0" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.977009 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skkjs\" (UniqueName: \"kubernetes.io/projected/9245f806-00cf-493f-a896-9f494b3e27a5-kube-api-access-skkjs\") pod \"nova-api-0\" (UID: \"9245f806-00cf-493f-a896-9f494b3e27a5\") " pod="openstack/nova-api-0" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.977087 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9245f806-00cf-493f-a896-9f494b3e27a5-config-data\") pod \"nova-api-0\" (UID: \"9245f806-00cf-493f-a896-9f494b3e27a5\") " pod="openstack/nova-api-0" Dec 02 23:19:26 crc kubenswrapper[4903]: I1202 23:19:26.977162 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9245f806-00cf-493f-a896-9f494b3e27a5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9245f806-00cf-493f-a896-9f494b3e27a5\") " pod="openstack/nova-api-0" Dec 02 23:19:26 crc kubenswrapper[4903]: W1202 23:19:26.998752 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c927ae4_891c_442d_8aab_d04ae025dc57.slice/crio-f2667cc77c484e8c24c3948c16305b62ca24f7962861a5c8681115251157b908 WatchSource:0}: Error finding container f2667cc77c484e8c24c3948c16305b62ca24f7962861a5c8681115251157b908: Status 404 returned error can't find the container with id f2667cc77c484e8c24c3948c16305b62ca24f7962861a5c8681115251157b908 Dec 02 23:19:27 crc kubenswrapper[4903]: I1202 23:19:27.025625 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:19:27 crc kubenswrapper[4903]: I1202 23:19:27.079753 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9245f806-00cf-493f-a896-9f494b3e27a5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9245f806-00cf-493f-a896-9f494b3e27a5\") " pod="openstack/nova-api-0" Dec 02 23:19:27 crc 
kubenswrapper[4903]: I1202 23:19:27.080176 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9245f806-00cf-493f-a896-9f494b3e27a5-logs\") pod \"nova-api-0\" (UID: \"9245f806-00cf-493f-a896-9f494b3e27a5\") " pod="openstack/nova-api-0" Dec 02 23:19:27 crc kubenswrapper[4903]: I1202 23:19:27.080229 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skkjs\" (UniqueName: \"kubernetes.io/projected/9245f806-00cf-493f-a896-9f494b3e27a5-kube-api-access-skkjs\") pod \"nova-api-0\" (UID: \"9245f806-00cf-493f-a896-9f494b3e27a5\") " pod="openstack/nova-api-0" Dec 02 23:19:27 crc kubenswrapper[4903]: I1202 23:19:27.080298 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9245f806-00cf-493f-a896-9f494b3e27a5-config-data\") pod \"nova-api-0\" (UID: \"9245f806-00cf-493f-a896-9f494b3e27a5\") " pod="openstack/nova-api-0" Dec 02 23:19:27 crc kubenswrapper[4903]: I1202 23:19:27.081840 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9245f806-00cf-493f-a896-9f494b3e27a5-logs\") pod \"nova-api-0\" (UID: \"9245f806-00cf-493f-a896-9f494b3e27a5\") " pod="openstack/nova-api-0" Dec 02 23:19:27 crc kubenswrapper[4903]: I1202 23:19:27.085877 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9245f806-00cf-493f-a896-9f494b3e27a5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9245f806-00cf-493f-a896-9f494b3e27a5\") " pod="openstack/nova-api-0" Dec 02 23:19:27 crc kubenswrapper[4903]: I1202 23:19:27.086735 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9245f806-00cf-493f-a896-9f494b3e27a5-config-data\") pod \"nova-api-0\" (UID: \"9245f806-00cf-493f-a896-9f494b3e27a5\") " pod="openstack/nova-api-0" Dec 02 23:19:27 crc kubenswrapper[4903]: I1202 23:19:27.100504 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skkjs\" (UniqueName: \"kubernetes.io/projected/9245f806-00cf-493f-a896-9f494b3e27a5-kube-api-access-skkjs\") pod \"nova-api-0\" (UID: \"9245f806-00cf-493f-a896-9f494b3e27a5\") " pod="openstack/nova-api-0" Dec 02 23:19:27 crc kubenswrapper[4903]: I1202 23:19:27.287357 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 23:19:27 crc kubenswrapper[4903]: I1202 23:19:27.628041 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43ee4fc2-7757-434d-a22d-8d8e8383e5f2" path="/var/lib/kubelet/pods/43ee4fc2-7757-434d-a22d-8d8e8383e5f2/volumes" Dec 02 23:19:27 crc kubenswrapper[4903]: I1202 23:19:27.629254 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8" path="/var/lib/kubelet/pods/e18d8a09-2ab7-4c7c-a4ef-c9a9b6b13ff8/volumes" Dec 02 23:19:27 crc kubenswrapper[4903]: I1202 23:19:27.768460 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 23:19:27 crc kubenswrapper[4903]: W1202 23:19:27.782309 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9245f806_00cf_493f_a896_9f494b3e27a5.slice/crio-861e78f6bbea14862685faee8c36370b9639fcd2647d1e2d22b6793651ec032b WatchSource:0}: Error finding container 861e78f6bbea14862685faee8c36370b9639fcd2647d1e2d22b6793651ec032b: Status 404 returned error can't find the container with id 861e78f6bbea14862685faee8c36370b9639fcd2647d1e2d22b6793651ec032b Dec 02 23:19:27 crc kubenswrapper[4903]: I1202 23:19:27.856081 4903 generic.go:334] "Generic (PLEG): container finished" podID="651e93ad-d51d-4bb1-91ef-0784eb952c71" containerID="1891386b971c2768f3fc1d46ca2d9121897fb2e9195767ba08771730db965725" exitCode=0 Dec 02 23:19:27 crc kubenswrapper[4903]: I1202 23:19:27.856176 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"651e93ad-d51d-4bb1-91ef-0784eb952c71","Type":"ContainerDied","Data":"1891386b971c2768f3fc1d46ca2d9121897fb2e9195767ba08771730db965725"} Dec 02 23:19:27 crc kubenswrapper[4903]: I1202 23:19:27.863355 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9245f806-00cf-493f-a896-9f494b3e27a5","Type":"ContainerStarted","Data":"861e78f6bbea14862685faee8c36370b9639fcd2647d1e2d22b6793651ec032b"} Dec 02 23:19:27 crc kubenswrapper[4903]: I1202 23:19:27.919214 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c927ae4-891c-442d-8aab-d04ae025dc57","Type":"ContainerStarted","Data":"1c45e13c204868184c4147582782723d0539383f7e1969913b4e4ae26849f05f"} Dec 02 23:19:27 crc kubenswrapper[4903]: I1202 23:19:27.919254 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c927ae4-891c-442d-8aab-d04ae025dc57","Type":"ContainerStarted","Data":"136b4132531bbed69fd27f195113f8c40a08026447ce5fc23cbfa6d7cad84e03"} Dec 02 23:19:27 crc kubenswrapper[4903]: I1202 23:19:27.919264 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c927ae4-891c-442d-8aab-d04ae025dc57","Type":"ContainerStarted","Data":"f2667cc77c484e8c24c3948c16305b62ca24f7962861a5c8681115251157b908"} Dec 02 23:19:28 crc kubenswrapper[4903]: I1202 23:19:28.363336 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 23:19:28 crc kubenswrapper[4903]: I1202 23:19:28.434764 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/651e93ad-d51d-4bb1-91ef-0784eb952c71-combined-ca-bundle\") pod \"651e93ad-d51d-4bb1-91ef-0784eb952c71\" (UID: \"651e93ad-d51d-4bb1-91ef-0784eb952c71\") " Dec 02 23:19:28 crc kubenswrapper[4903]: I1202 23:19:28.434828 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/651e93ad-d51d-4bb1-91ef-0784eb952c71-config-data\") pod \"651e93ad-d51d-4bb1-91ef-0784eb952c71\" (UID: \"651e93ad-d51d-4bb1-91ef-0784eb952c71\") " Dec 02 23:19:28 crc kubenswrapper[4903]: I1202 23:19:28.434859 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfs59\" (UniqueName: \"kubernetes.io/projected/651e93ad-d51d-4bb1-91ef-0784eb952c71-kube-api-access-dfs59\") pod \"651e93ad-d51d-4bb1-91ef-0784eb952c71\" (UID: \"651e93ad-d51d-4bb1-91ef-0784eb952c71\") " Dec 02 23:19:28 crc kubenswrapper[4903]: I1202 23:19:28.444631 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/651e93ad-d51d-4bb1-91ef-0784eb952c71-kube-api-access-dfs59" (OuterVolumeSpecName: "kube-api-access-dfs59") pod "651e93ad-d51d-4bb1-91ef-0784eb952c71" (UID: "651e93ad-d51d-4bb1-91ef-0784eb952c71"). InnerVolumeSpecName "kube-api-access-dfs59". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:19:28 crc kubenswrapper[4903]: I1202 23:19:28.492059 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/651e93ad-d51d-4bb1-91ef-0784eb952c71-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "651e93ad-d51d-4bb1-91ef-0784eb952c71" (UID: "651e93ad-d51d-4bb1-91ef-0784eb952c71"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:19:28 crc kubenswrapper[4903]: I1202 23:19:28.493021 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/651e93ad-d51d-4bb1-91ef-0784eb952c71-config-data" (OuterVolumeSpecName: "config-data") pod "651e93ad-d51d-4bb1-91ef-0784eb952c71" (UID: "651e93ad-d51d-4bb1-91ef-0784eb952c71"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:19:28 crc kubenswrapper[4903]: I1202 23:19:28.537419 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/651e93ad-d51d-4bb1-91ef-0784eb952c71-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:28 crc kubenswrapper[4903]: I1202 23:19:28.537458 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/651e93ad-d51d-4bb1-91ef-0784eb952c71-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:28 crc kubenswrapper[4903]: I1202 23:19:28.537471 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfs59\" (UniqueName: \"kubernetes.io/projected/651e93ad-d51d-4bb1-91ef-0784eb952c71-kube-api-access-dfs59\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:28 crc kubenswrapper[4903]: I1202 23:19:28.929454 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9245f806-00cf-493f-a896-9f494b3e27a5","Type":"ContainerStarted","Data":"f222edebfa344daa86bd8271ba2652c57ea88e910f88deaebda967c057b0df94"} Dec 02 23:19:28 crc kubenswrapper[4903]: I1202 23:19:28.929502 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9245f806-00cf-493f-a896-9f494b3e27a5","Type":"ContainerStarted","Data":"f4231a3a80bdcc8109af38480c1ee25046828e36a9ec49ede0d5a3e380712a8a"} Dec 02 23:19:28 crc kubenswrapper[4903]: I1202 23:19:28.933560 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c927ae4-891c-442d-8aab-d04ae025dc57","Type":"ContainerStarted","Data":"e1ca970452c8cabb1d309d39879d7d7b5e54a2c7bdb1068231e23bf1771b321e"} Dec 02 23:19:28 crc kubenswrapper[4903]: I1202 23:19:28.935230 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"651e93ad-d51d-4bb1-91ef-0784eb952c71","Type":"ContainerDied","Data":"423f3d21ecd7f423782ea6f793fad5a89516db297c94463f4a0999c8c24fa78e"} Dec 02 23:19:28 crc kubenswrapper[4903]: I1202 23:19:28.935282 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 23:19:28 crc kubenswrapper[4903]: I1202 23:19:28.935311 4903 scope.go:117] "RemoveContainer" containerID="1891386b971c2768f3fc1d46ca2d9121897fb2e9195767ba08771730db965725" Dec 02 23:19:28 crc kubenswrapper[4903]: I1202 23:19:28.952438 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.952418802 podStartE2EDuration="2.952418802s" podCreationTimestamp="2025-12-02 23:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:19:28.946390404 +0000 UTC m=+1307.654944717" watchObservedRunningTime="2025-12-02 23:19:28.952418802 +0000 UTC m=+1307.660973085" Dec 02 23:19:28 crc kubenswrapper[4903]: I1202 23:19:28.986028 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 23:19:29 crc kubenswrapper[4903]: I1202 23:19:29.000098 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 23:19:29 crc kubenswrapper[4903]: I1202 23:19:29.026360 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 23:19:29 crc kubenswrapper[4903]: E1202 23:19:29.027367 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="651e93ad-d51d-4bb1-91ef-0784eb952c71" containerName="nova-scheduler-scheduler" Dec 02 23:19:29 crc kubenswrapper[4903]: I1202 23:19:29.027385 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="651e93ad-d51d-4bb1-91ef-0784eb952c71" containerName="nova-scheduler-scheduler" Dec 02 23:19:29 crc kubenswrapper[4903]: I1202 23:19:29.028353 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="651e93ad-d51d-4bb1-91ef-0784eb952c71" containerName="nova-scheduler-scheduler" Dec 02 23:19:29 crc kubenswrapper[4903]: I1202 23:19:29.037224 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 23:19:29 crc kubenswrapper[4903]: I1202 23:19:29.040577 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 02 23:19:29 crc kubenswrapper[4903]: I1202 23:19:29.079636 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 23:19:29 crc kubenswrapper[4903]: I1202 23:19:29.159761 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac51dfe7-7ebc-4296-874e-5669f01d115a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ac51dfe7-7ebc-4296-874e-5669f01d115a\") " pod="openstack/nova-scheduler-0" Dec 02 23:19:29 crc kubenswrapper[4903]: I1202 23:19:29.159861 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac51dfe7-7ebc-4296-874e-5669f01d115a-config-data\") pod \"nova-scheduler-0\" (UID: \"ac51dfe7-7ebc-4296-874e-5669f01d115a\") " pod="openstack/nova-scheduler-0" Dec 02 23:19:29 crc kubenswrapper[4903]: I1202 23:19:29.159915 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8jmx\" (UniqueName: \"kubernetes.io/projected/ac51dfe7-7ebc-4296-874e-5669f01d115a-kube-api-access-x8jmx\") pod \"nova-scheduler-0\" (UID: \"ac51dfe7-7ebc-4296-874e-5669f01d115a\") " pod="openstack/nova-scheduler-0" Dec 02 23:19:29 crc kubenswrapper[4903]: I1202 23:19:29.262729 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac51dfe7-7ebc-4296-874e-5669f01d115a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ac51dfe7-7ebc-4296-874e-5669f01d115a\") " pod="openstack/nova-scheduler-0" Dec 02 23:19:29 crc kubenswrapper[4903]: I1202 23:19:29.263166 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac51dfe7-7ebc-4296-874e-5669f01d115a-config-data\") pod \"nova-scheduler-0\" (UID: \"ac51dfe7-7ebc-4296-874e-5669f01d115a\") " pod="openstack/nova-scheduler-0" Dec 02 23:19:29 crc kubenswrapper[4903]: I1202 23:19:29.263306 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8jmx\" (UniqueName: \"kubernetes.io/projected/ac51dfe7-7ebc-4296-874e-5669f01d115a-kube-api-access-x8jmx\") pod \"nova-scheduler-0\" (UID: \"ac51dfe7-7ebc-4296-874e-5669f01d115a\") " pod="openstack/nova-scheduler-0" Dec 02 23:19:29 crc kubenswrapper[4903]: I1202 23:19:29.269844 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac51dfe7-7ebc-4296-874e-5669f01d115a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ac51dfe7-7ebc-4296-874e-5669f01d115a\") " pod="openstack/nova-scheduler-0" Dec 02 23:19:29 crc kubenswrapper[4903]: I1202 23:19:29.271389 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac51dfe7-7ebc-4296-874e-5669f01d115a-config-data\") pod \"nova-scheduler-0\" (UID: \"ac51dfe7-7ebc-4296-874e-5669f01d115a\") " pod="openstack/nova-scheduler-0" Dec 02 23:19:29 crc kubenswrapper[4903]: I1202 23:19:29.280634 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8jmx\" (UniqueName: 
\"kubernetes.io/projected/ac51dfe7-7ebc-4296-874e-5669f01d115a-kube-api-access-x8jmx\") pod \"nova-scheduler-0\" (UID: \"ac51dfe7-7ebc-4296-874e-5669f01d115a\") " pod="openstack/nova-scheduler-0" Dec 02 23:19:29 crc kubenswrapper[4903]: I1202 23:19:29.362905 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 23:19:29 crc kubenswrapper[4903]: I1202 23:19:29.636897 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="651e93ad-d51d-4bb1-91ef-0784eb952c71" path="/var/lib/kubelet/pods/651e93ad-d51d-4bb1-91ef-0784eb952c71/volumes" Dec 02 23:19:29 crc kubenswrapper[4903]: I1202 23:19:29.881812 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 23:19:29 crc kubenswrapper[4903]: I1202 23:19:29.946672 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c927ae4-891c-442d-8aab-d04ae025dc57","Type":"ContainerStarted","Data":"39799a47c43f342a80ba54c49df78b1316f477952a28bcb11161c5ecb996c541"} Dec 02 23:19:29 crc kubenswrapper[4903]: I1202 23:19:29.947840 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 23:19:29 crc kubenswrapper[4903]: I1202 23:19:29.949925 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ac51dfe7-7ebc-4296-874e-5669f01d115a","Type":"ContainerStarted","Data":"f8614f1aab9662a1e08b6ac2131c4bc0c27234b946da5bc5d184a0b7b3228223"} Dec 02 23:19:29 crc kubenswrapper[4903]: I1202 23:19:29.974296 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.317528581 podStartE2EDuration="3.97427749s" podCreationTimestamp="2025-12-02 23:19:26 +0000 UTC" firstStartedPulling="2025-12-02 23:19:27.001159364 +0000 UTC m=+1305.709713647" lastFinishedPulling="2025-12-02 23:19:29.657908273 +0000 UTC m=+1308.366462556" observedRunningTime="2025-12-02 23:19:29.970757646 +0000 UTC m=+1308.679311929" watchObservedRunningTime="2025-12-02 23:19:29.97427749 +0000 UTC m=+1308.682831773" Dec 02 23:19:30 crc kubenswrapper[4903]: I1202 23:19:30.114908 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 02 23:19:30 crc kubenswrapper[4903]: I1202 23:19:30.231916 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 02 23:19:30 crc kubenswrapper[4903]: I1202 23:19:30.968046 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ac51dfe7-7ebc-4296-874e-5669f01d115a","Type":"ContainerStarted","Data":"c3997569e826d69f5e39884db1394ef2592a998b8a9cc79407dc081c1ef0b1d0"} Dec 02 23:19:30 crc kubenswrapper[4903]: I1202 23:19:30.985251 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.985220055 podStartE2EDuration="2.985220055s" podCreationTimestamp="2025-12-02 23:19:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:19:30.980012374 +0000 UTC m=+1309.688566667" watchObservedRunningTime="2025-12-02 23:19:30.985220055 +0000 UTC m=+1309.693774358" Dec 02 23:19:31 crc kubenswrapper[4903]: I1202 23:19:31.978410 4903 generic.go:334] "Generic (PLEG): container finished" podID="6de4a117-0c91-47f4-a80d-278debb3ea60" 
containerID="869337a1d3594ce55a381d0bedbd91c0bd6e1cde2a42f0d7048097a4d90eb4d5" exitCode=137 Dec 02 23:19:31 crc kubenswrapper[4903]: I1202 23:19:31.978480 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"6de4a117-0c91-47f4-a80d-278debb3ea60","Type":"ContainerDied","Data":"869337a1d3594ce55a381d0bedbd91c0bd6e1cde2a42f0d7048097a4d90eb4d5"} Dec 02 23:19:31 crc kubenswrapper[4903]: I1202 23:19:31.978830 4903 scope.go:117] "RemoveContainer" containerID="5dee57eff07796a288e3f2bb7e5636eef4388a613cdaaa20bd3ad6d3bb553f8b" Dec 02 23:19:32 crc kubenswrapper[4903]: I1202 23:19:32.353631 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 02 23:19:32 crc kubenswrapper[4903]: I1202 23:19:32.436436 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6de4a117-0c91-47f4-a80d-278debb3ea60-logs\") pod \"6de4a117-0c91-47f4-a80d-278debb3ea60\" (UID: \"6de4a117-0c91-47f4-a80d-278debb3ea60\") " Dec 02 23:19:32 crc kubenswrapper[4903]: I1202 23:19:32.436594 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6de4a117-0c91-47f4-a80d-278debb3ea60-combined-ca-bundle\") pod \"6de4a117-0c91-47f4-a80d-278debb3ea60\" (UID: \"6de4a117-0c91-47f4-a80d-278debb3ea60\") " Dec 02 23:19:32 crc kubenswrapper[4903]: I1202 23:19:32.436618 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6de4a117-0c91-47f4-a80d-278debb3ea60-custom-prometheus-ca\") pod \"6de4a117-0c91-47f4-a80d-278debb3ea60\" (UID: \"6de4a117-0c91-47f4-a80d-278debb3ea60\") " Dec 02 23:19:32 crc kubenswrapper[4903]: I1202 23:19:32.436791 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6de4a117-0c91-47f4-a80d-278debb3ea60-config-data\") pod \"6de4a117-0c91-47f4-a80d-278debb3ea60\" (UID: \"6de4a117-0c91-47f4-a80d-278debb3ea60\") " Dec 02 23:19:32 crc kubenswrapper[4903]: I1202 23:19:32.436836 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px9sh\" (UniqueName: \"kubernetes.io/projected/6de4a117-0c91-47f4-a80d-278debb3ea60-kube-api-access-px9sh\") pod \"6de4a117-0c91-47f4-a80d-278debb3ea60\" (UID: \"6de4a117-0c91-47f4-a80d-278debb3ea60\") " Dec 02 23:19:32 crc kubenswrapper[4903]: I1202 23:19:32.436903 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6de4a117-0c91-47f4-a80d-278debb3ea60-logs" (OuterVolumeSpecName: "logs") pod "6de4a117-0c91-47f4-a80d-278debb3ea60" (UID: "6de4a117-0c91-47f4-a80d-278debb3ea60"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:19:32 crc kubenswrapper[4903]: I1202 23:19:32.437553 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6de4a117-0c91-47f4-a80d-278debb3ea60-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:32 crc kubenswrapper[4903]: I1202 23:19:32.447824 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6de4a117-0c91-47f4-a80d-278debb3ea60-kube-api-access-px9sh" (OuterVolumeSpecName: "kube-api-access-px9sh") pod "6de4a117-0c91-47f4-a80d-278debb3ea60" (UID: "6de4a117-0c91-47f4-a80d-278debb3ea60"). InnerVolumeSpecName "kube-api-access-px9sh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:19:32 crc kubenswrapper[4903]: I1202 23:19:32.474770 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6de4a117-0c91-47f4-a80d-278debb3ea60-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "6de4a117-0c91-47f4-a80d-278debb3ea60" (UID: "6de4a117-0c91-47f4-a80d-278debb3ea60"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:19:32 crc kubenswrapper[4903]: I1202 23:19:32.474936 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6de4a117-0c91-47f4-a80d-278debb3ea60-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6de4a117-0c91-47f4-a80d-278debb3ea60" (UID: "6de4a117-0c91-47f4-a80d-278debb3ea60"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:19:32 crc kubenswrapper[4903]: I1202 23:19:32.502529 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6de4a117-0c91-47f4-a80d-278debb3ea60-config-data" (OuterVolumeSpecName: "config-data") pod "6de4a117-0c91-47f4-a80d-278debb3ea60" (UID: "6de4a117-0c91-47f4-a80d-278debb3ea60"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:19:32 crc kubenswrapper[4903]: I1202 23:19:32.539325 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-px9sh\" (UniqueName: \"kubernetes.io/projected/6de4a117-0c91-47f4-a80d-278debb3ea60-kube-api-access-px9sh\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:32 crc kubenswrapper[4903]: I1202 23:19:32.539362 4903 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6de4a117-0c91-47f4-a80d-278debb3ea60-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:32 crc kubenswrapper[4903]: I1202 23:19:32.539371 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6de4a117-0c91-47f4-a80d-278debb3ea60-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:32 crc kubenswrapper[4903]: I1202 23:19:32.539379 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6de4a117-0c91-47f4-a80d-278debb3ea60-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:32 crc kubenswrapper[4903]: I1202 23:19:32.987411 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"6de4a117-0c91-47f4-a80d-278debb3ea60","Type":"ContainerDied","Data":"9642229d3deab857dde02054ecb841a3af38cf34195214e12a248261a22ea20b"} Dec 02 23:19:32 crc kubenswrapper[4903]: I1202 23:19:32.987729 4903 scope.go:117] "RemoveContainer" containerID="869337a1d3594ce55a381d0bedbd91c0bd6e1cde2a42f0d7048097a4d90eb4d5" Dec 02 23:19:32 crc kubenswrapper[4903]: I1202 23:19:32.987838 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 02 23:19:33 crc kubenswrapper[4903]: I1202 23:19:33.021742 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 02 23:19:33 crc kubenswrapper[4903]: I1202 23:19:33.034476 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 02 23:19:33 crc kubenswrapper[4903]: I1202 23:19:33.045398 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 02 23:19:33 crc kubenswrapper[4903]: E1202 23:19:33.045831 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6de4a117-0c91-47f4-a80d-278debb3ea60" containerName="watcher-decision-engine" Dec 02 23:19:33 crc kubenswrapper[4903]: I1202 23:19:33.045848 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="6de4a117-0c91-47f4-a80d-278debb3ea60" containerName="watcher-decision-engine" Dec 02 23:19:33 crc kubenswrapper[4903]: E1202 23:19:33.045871 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6de4a117-0c91-47f4-a80d-278debb3ea60" containerName="watcher-decision-engine" Dec 02 23:19:33 crc kubenswrapper[4903]: I1202 23:19:33.045878 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="6de4a117-0c91-47f4-a80d-278debb3ea60" containerName="watcher-decision-engine" Dec 02 23:19:33 crc kubenswrapper[4903]: E1202 23:19:33.045888 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6de4a117-0c91-47f4-a80d-278debb3ea60" containerName="watcher-decision-engine" Dec 02 23:19:33 crc kubenswrapper[4903]: I1202 23:19:33.045894 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="6de4a117-0c91-47f4-a80d-278debb3ea60" containerName="watcher-decision-engine" Dec 02 23:19:33 
crc kubenswrapper[4903]: I1202 23:19:33.046094 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="6de4a117-0c91-47f4-a80d-278debb3ea60" containerName="watcher-decision-engine" Dec 02 23:19:33 crc kubenswrapper[4903]: I1202 23:19:33.046113 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="6de4a117-0c91-47f4-a80d-278debb3ea60" containerName="watcher-decision-engine" Dec 02 23:19:33 crc kubenswrapper[4903]: I1202 23:19:33.046122 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="6de4a117-0c91-47f4-a80d-278debb3ea60" containerName="watcher-decision-engine" Dec 02 23:19:33 crc kubenswrapper[4903]: I1202 23:19:33.046136 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="6de4a117-0c91-47f4-a80d-278debb3ea60" containerName="watcher-decision-engine" Dec 02 23:19:33 crc kubenswrapper[4903]: I1202 23:19:33.046820 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 02 23:19:33 crc kubenswrapper[4903]: I1202 23:19:33.049993 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Dec 02 23:19:33 crc kubenswrapper[4903]: I1202 23:19:33.078736 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 02 23:19:33 crc kubenswrapper[4903]: I1202 23:19:33.156051 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48r5m\" (UniqueName: \"kubernetes.io/projected/d92eb92f-06d0-4676-9c0f-9f3e427ae019-kube-api-access-48r5m\") pod \"watcher-decision-engine-0\" (UID: \"d92eb92f-06d0-4676-9c0f-9f3e427ae019\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:19:33 crc kubenswrapper[4903]: I1202 23:19:33.156248 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d92eb92f-06d0-4676-9c0f-9f3e427ae019-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"d92eb92f-06d0-4676-9c0f-9f3e427ae019\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:19:33 crc kubenswrapper[4903]: I1202 23:19:33.156279 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d92eb92f-06d0-4676-9c0f-9f3e427ae019-logs\") pod \"watcher-decision-engine-0\" (UID: \"d92eb92f-06d0-4676-9c0f-9f3e427ae019\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:19:33 crc kubenswrapper[4903]: I1202 23:19:33.156315 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d92eb92f-06d0-4676-9c0f-9f3e427ae019-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"d92eb92f-06d0-4676-9c0f-9f3e427ae019\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:19:33 crc kubenswrapper[4903]: I1202 23:19:33.156338 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d92eb92f-06d0-4676-9c0f-9f3e427ae019-config-data\") pod \"watcher-decision-engine-0\" (UID: \"d92eb92f-06d0-4676-9c0f-9f3e427ae019\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:19:33 crc kubenswrapper[4903]: I1202 23:19:33.257756 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d92eb92f-06d0-4676-9c0f-9f3e427ae019-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"d92eb92f-06d0-4676-9c0f-9f3e427ae019\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:19:33 crc kubenswrapper[4903]: I1202 23:19:33.258033 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d92eb92f-06d0-4676-9c0f-9f3e427ae019-logs\") pod \"watcher-decision-engine-0\" (UID: \"d92eb92f-06d0-4676-9c0f-9f3e427ae019\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:19:33 crc kubenswrapper[4903]: I1202 23:19:33.258234 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d92eb92f-06d0-4676-9c0f-9f3e427ae019-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"d92eb92f-06d0-4676-9c0f-9f3e427ae019\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:19:33 crc kubenswrapper[4903]: I1202 23:19:33.258396 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d92eb92f-06d0-4676-9c0f-9f3e427ae019-config-data\") pod \"watcher-decision-engine-0\" (UID: \"d92eb92f-06d0-4676-9c0f-9f3e427ae019\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:19:33 crc kubenswrapper[4903]: I1202 23:19:33.259118 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48r5m\" (UniqueName: \"kubernetes.io/projected/d92eb92f-06d0-4676-9c0f-9f3e427ae019-kube-api-access-48r5m\") pod \"watcher-decision-engine-0\" (UID: \"d92eb92f-06d0-4676-9c0f-9f3e427ae019\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:19:33 crc kubenswrapper[4903]: I1202 23:19:33.258879 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d92eb92f-06d0-4676-9c0f-9f3e427ae019-logs\") pod \"watcher-decision-engine-0\" (UID: \"d92eb92f-06d0-4676-9c0f-9f3e427ae019\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:19:33 crc kubenswrapper[4903]: I1202 23:19:33.262633 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d92eb92f-06d0-4676-9c0f-9f3e427ae019-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"d92eb92f-06d0-4676-9c0f-9f3e427ae019\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:19:33 crc kubenswrapper[4903]: I1202 23:19:33.270546 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d92eb92f-06d0-4676-9c0f-9f3e427ae019-config-data\") pod \"watcher-decision-engine-0\" (UID: \"d92eb92f-06d0-4676-9c0f-9f3e427ae019\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:19:33 crc kubenswrapper[4903]: I1202 23:19:33.279398 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d92eb92f-06d0-4676-9c0f-9f3e427ae019-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"d92eb92f-06d0-4676-9c0f-9f3e427ae019\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:19:33 crc kubenswrapper[4903]: I1202 23:19:33.299199 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48r5m\" (UniqueName: \"kubernetes.io/projected/d92eb92f-06d0-4676-9c0f-9f3e427ae019-kube-api-access-48r5m\") pod \"watcher-decision-engine-0\" (UID: 
\"d92eb92f-06d0-4676-9c0f-9f3e427ae019\") " pod="openstack/watcher-decision-engine-0" Dec 02 23:19:33 crc kubenswrapper[4903]: I1202 23:19:33.377098 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 02 23:19:33 crc kubenswrapper[4903]: I1202 23:19:33.628871 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6de4a117-0c91-47f4-a80d-278debb3ea60" path="/var/lib/kubelet/pods/6de4a117-0c91-47f4-a80d-278debb3ea60/volumes" Dec 02 23:19:33 crc kubenswrapper[4903]: I1202 23:19:33.868188 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 02 23:19:33 crc kubenswrapper[4903]: W1202 23:19:33.873853 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd92eb92f_06d0_4676_9c0f_9f3e427ae019.slice/crio-13d7f8123b63d9ba0961cb516c74397f1ce443967225acd9c9e9b520c337e847 WatchSource:0}: Error finding container 13d7f8123b63d9ba0961cb516c74397f1ce443967225acd9c9e9b520c337e847: Status 404 returned error can't find the container with id 13d7f8123b63d9ba0961cb516c74397f1ce443967225acd9c9e9b520c337e847 Dec 02 23:19:33 crc kubenswrapper[4903]: I1202 23:19:33.997028 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"d92eb92f-06d0-4676-9c0f-9f3e427ae019","Type":"ContainerStarted","Data":"13d7f8123b63d9ba0961cb516c74397f1ce443967225acd9c9e9b520c337e847"} Dec 02 23:19:34 crc kubenswrapper[4903]: I1202 23:19:34.363502 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 02 23:19:35 crc kubenswrapper[4903]: I1202 23:19:35.034790 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"d92eb92f-06d0-4676-9c0f-9f3e427ae019","Type":"ContainerStarted","Data":"92366212d7aa30b8f4f6439785a245e8a7e92d8de947b6379060f39b9f96bd46"} Dec 02 23:19:35 crc kubenswrapper[4903]: I1202 23:19:35.057719 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.057695017 podStartE2EDuration="2.057695017s" podCreationTimestamp="2025-12-02 23:19:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:19:35.050706139 +0000 UTC m=+1313.759260452" watchObservedRunningTime="2025-12-02 23:19:35.057695017 +0000 UTC m=+1313.766249310" Dec 02 23:19:37 crc kubenswrapper[4903]: I1202 23:19:37.288483 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 23:19:37 crc kubenswrapper[4903]: I1202 23:19:37.289220 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 23:19:38 crc kubenswrapper[4903]: I1202 23:19:38.334230 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9245f806-00cf-493f-a896-9f494b3e27a5" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.211:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 23:19:38 crc kubenswrapper[4903]: I1202 23:19:38.374902 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9245f806-00cf-493f-a896-9f494b3e27a5" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.211:8774/\": 
context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 23:19:39 crc kubenswrapper[4903]: I1202 23:19:39.364598 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 02 23:19:39 crc kubenswrapper[4903]: I1202 23:19:39.410888 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 02 23:19:40 crc kubenswrapper[4903]: I1202 23:19:40.140202 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 02 23:19:43 crc kubenswrapper[4903]: I1202 23:19:43.377376 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 02 23:19:43 crc kubenswrapper[4903]: I1202 23:19:43.410071 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Dec 02 23:19:44 crc kubenswrapper[4903]: I1202 23:19:44.156774 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Dec 02 23:19:44 crc kubenswrapper[4903]: I1202 23:19:44.198173 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.095803 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.103226 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.125109 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/028f476a-ab2c-424d-aeda-70903e28d17d-combined-ca-bundle\") pod \"028f476a-ab2c-424d-aeda-70903e28d17d\" (UID: \"028f476a-ab2c-424d-aeda-70903e28d17d\") " Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.125154 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tznrm\" (UniqueName: \"kubernetes.io/projected/028f476a-ab2c-424d-aeda-70903e28d17d-kube-api-access-tznrm\") pod \"028f476a-ab2c-424d-aeda-70903e28d17d\" (UID: \"028f476a-ab2c-424d-aeda-70903e28d17d\") " Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.125185 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpfst\" (UniqueName: \"kubernetes.io/projected/2d896453-a1aa-4cfe-aa9b-03de9d6eb83e-kube-api-access-bpfst\") pod \"2d896453-a1aa-4cfe-aa9b-03de9d6eb83e\" (UID: \"2d896453-a1aa-4cfe-aa9b-03de9d6eb83e\") " Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.125260 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d896453-a1aa-4cfe-aa9b-03de9d6eb83e-combined-ca-bundle\") pod \"2d896453-a1aa-4cfe-aa9b-03de9d6eb83e\" (UID: \"2d896453-a1aa-4cfe-aa9b-03de9d6eb83e\") " Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.125284 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/028f476a-ab2c-424d-aeda-70903e28d17d-config-data\") pod \"028f476a-ab2c-424d-aeda-70903e28d17d\" (UID: \"028f476a-ab2c-424d-aeda-70903e28d17d\") " Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.125344 4903 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d896453-a1aa-4cfe-aa9b-03de9d6eb83e-config-data\") pod \"2d896453-a1aa-4cfe-aa9b-03de9d6eb83e\" (UID: \"2d896453-a1aa-4cfe-aa9b-03de9d6eb83e\") " Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.125399 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/028f476a-ab2c-424d-aeda-70903e28d17d-logs\") pod \"028f476a-ab2c-424d-aeda-70903e28d17d\" (UID: \"028f476a-ab2c-424d-aeda-70903e28d17d\") " Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.126117 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/028f476a-ab2c-424d-aeda-70903e28d17d-logs" (OuterVolumeSpecName: "logs") pod "028f476a-ab2c-424d-aeda-70903e28d17d" (UID: "028f476a-ab2c-424d-aeda-70903e28d17d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.132772 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/028f476a-ab2c-424d-aeda-70903e28d17d-kube-api-access-tznrm" (OuterVolumeSpecName: "kube-api-access-tznrm") pod "028f476a-ab2c-424d-aeda-70903e28d17d" (UID: "028f476a-ab2c-424d-aeda-70903e28d17d"). InnerVolumeSpecName "kube-api-access-tznrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.132957 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d896453-a1aa-4cfe-aa9b-03de9d6eb83e-kube-api-access-bpfst" (OuterVolumeSpecName: "kube-api-access-bpfst") pod "2d896453-a1aa-4cfe-aa9b-03de9d6eb83e" (UID: "2d896453-a1aa-4cfe-aa9b-03de9d6eb83e"). InnerVolumeSpecName "kube-api-access-bpfst". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.163974 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d896453-a1aa-4cfe-aa9b-03de9d6eb83e-config-data" (OuterVolumeSpecName: "config-data") pod "2d896453-a1aa-4cfe-aa9b-03de9d6eb83e" (UID: "2d896453-a1aa-4cfe-aa9b-03de9d6eb83e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.169321 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/028f476a-ab2c-424d-aeda-70903e28d17d-config-data" (OuterVolumeSpecName: "config-data") pod "028f476a-ab2c-424d-aeda-70903e28d17d" (UID: "028f476a-ab2c-424d-aeda-70903e28d17d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.181796 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d896453-a1aa-4cfe-aa9b-03de9d6eb83e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d896453-a1aa-4cfe-aa9b-03de9d6eb83e" (UID: "2d896453-a1aa-4cfe-aa9b-03de9d6eb83e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.186514 4903 generic.go:334] "Generic (PLEG): container finished" podID="2d896453-a1aa-4cfe-aa9b-03de9d6eb83e" containerID="0bce510514508ad67c4c18a6ae59b1bbcd1e17dfd434d1615362db805334bbfb" exitCode=137 Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.186576 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2d896453-a1aa-4cfe-aa9b-03de9d6eb83e","Type":"ContainerDied","Data":"0bce510514508ad67c4c18a6ae59b1bbcd1e17dfd434d1615362db805334bbfb"} Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.186833 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2d896453-a1aa-4cfe-aa9b-03de9d6eb83e","Type":"ContainerDied","Data":"faa5154357b84ca6a22cd957e36a12ce0a1063537081786e1da99abef2a2d194"} Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.186874 4903 scope.go:117] "RemoveContainer" containerID="0bce510514508ad67c4c18a6ae59b1bbcd1e17dfd434d1615362db805334bbfb" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.188793 4903 generic.go:334] "Generic (PLEG): container finished" podID="028f476a-ab2c-424d-aeda-70903e28d17d" containerID="0967568d5753efb623bb71f13ba3aee464ab907cb7257bf8ae9e3adc0817d9ce" exitCode=137 Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.188812 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"028f476a-ab2c-424d-aeda-70903e28d17d","Type":"ContainerDied","Data":"0967568d5753efb623bb71f13ba3aee464ab907cb7257bf8ae9e3adc0817d9ce"} Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.188975 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"028f476a-ab2c-424d-aeda-70903e28d17d","Type":"ContainerDied","Data":"bfb1fccc4c1997e4133fb4d81f80146a4dd14cab6ab0180b81ae5a7d8423f0fa"} Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.188914 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.189153 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.223810 4903 scope.go:117] "RemoveContainer" containerID="0bce510514508ad67c4c18a6ae59b1bbcd1e17dfd434d1615362db805334bbfb" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.225105 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/028f476a-ab2c-424d-aeda-70903e28d17d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "028f476a-ab2c-424d-aeda-70903e28d17d" (UID: "028f476a-ab2c-424d-aeda-70903e28d17d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:19:46 crc kubenswrapper[4903]: E1202 23:19:46.229581 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bce510514508ad67c4c18a6ae59b1bbcd1e17dfd434d1615362db805334bbfb\": container with ID starting with 0bce510514508ad67c4c18a6ae59b1bbcd1e17dfd434d1615362db805334bbfb not found: ID does not exist" containerID="0bce510514508ad67c4c18a6ae59b1bbcd1e17dfd434d1615362db805334bbfb" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.229635 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bce510514508ad67c4c18a6ae59b1bbcd1e17dfd434d1615362db805334bbfb"} err="failed to get container status \"0bce510514508ad67c4c18a6ae59b1bbcd1e17dfd434d1615362db805334bbfb\": rpc error: code = NotFound desc = could not find container \"0bce510514508ad67c4c18a6ae59b1bbcd1e17dfd434d1615362db805334bbfb\": container with ID starting with 0bce510514508ad67c4c18a6ae59b1bbcd1e17dfd434d1615362db805334bbfb not found: ID does not exist" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.229695 4903 scope.go:117] "RemoveContainer" containerID="0967568d5753efb623bb71f13ba3aee464ab907cb7257bf8ae9e3adc0817d9ce" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.230151 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/028f476a-ab2c-424d-aeda-70903e28d17d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.230181 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tznrm\" (UniqueName: \"kubernetes.io/projected/028f476a-ab2c-424d-aeda-70903e28d17d-kube-api-access-tznrm\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.230190 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpfst\" (UniqueName: \"kubernetes.io/projected/2d896453-a1aa-4cfe-aa9b-03de9d6eb83e-kube-api-access-bpfst\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.230205 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d896453-a1aa-4cfe-aa9b-03de9d6eb83e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.230219 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/028f476a-ab2c-424d-aeda-70903e28d17d-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.230231 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d896453-a1aa-4cfe-aa9b-03de9d6eb83e-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.230240 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/028f476a-ab2c-424d-aeda-70903e28d17d-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.250964 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.268549 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.277136 4903 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 23:19:46 crc kubenswrapper[4903]: E1202 23:19:46.277544 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="028f476a-ab2c-424d-aeda-70903e28d17d" containerName="nova-metadata-metadata" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.277558 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="028f476a-ab2c-424d-aeda-70903e28d17d" containerName="nova-metadata-metadata" Dec 02 23:19:46 crc kubenswrapper[4903]: E1202 23:19:46.277576 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6de4a117-0c91-47f4-a80d-278debb3ea60" containerName="watcher-decision-engine" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.277582 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="6de4a117-0c91-47f4-a80d-278debb3ea60" containerName="watcher-decision-engine" Dec 02 23:19:46 crc kubenswrapper[4903]: E1202 23:19:46.277592 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d896453-a1aa-4cfe-aa9b-03de9d6eb83e" containerName="nova-cell1-novncproxy-novncproxy" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.277598 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d896453-a1aa-4cfe-aa9b-03de9d6eb83e" containerName="nova-cell1-novncproxy-novncproxy" Dec 02 23:19:46 crc kubenswrapper[4903]: E1202 23:19:46.277609 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="028f476a-ab2c-424d-aeda-70903e28d17d" containerName="nova-metadata-log" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.277615 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="028f476a-ab2c-424d-aeda-70903e28d17d" containerName="nova-metadata-log" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.277850 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d896453-a1aa-4cfe-aa9b-03de9d6eb83e" containerName="nova-cell1-novncproxy-novncproxy" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.277866 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="6de4a117-0c91-47f4-a80d-278debb3ea60" containerName="watcher-decision-engine" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.277882 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="028f476a-ab2c-424d-aeda-70903e28d17d" containerName="nova-metadata-log" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.277897 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="028f476a-ab2c-424d-aeda-70903e28d17d" containerName="nova-metadata-metadata" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.278541 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.280822 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.281069 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.281183 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.284165 4903 scope.go:117] "RemoveContainer" containerID="03630a66672c5422a3192b948a9c7bca483ca08da446a0a7fb3b3cd00ce20e80" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.324380 4903 scope.go:117] "RemoveContainer" containerID="0967568d5753efb623bb71f13ba3aee464ab907cb7257bf8ae9e3adc0817d9ce" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.324550 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 23:19:46 crc kubenswrapper[4903]: E1202 23:19:46.324956 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0967568d5753efb623bb71f13ba3aee464ab907cb7257bf8ae9e3adc0817d9ce\": container with ID starting with 0967568d5753efb623bb71f13ba3aee464ab907cb7257bf8ae9e3adc0817d9ce not found: ID does not exist" containerID="0967568d5753efb623bb71f13ba3aee464ab907cb7257bf8ae9e3adc0817d9ce" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.325024 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0967568d5753efb623bb71f13ba3aee464ab907cb7257bf8ae9e3adc0817d9ce"} err="failed to get container status \"0967568d5753efb623bb71f13ba3aee464ab907cb7257bf8ae9e3adc0817d9ce\": rpc error: code = NotFound desc = could not find container \"0967568d5753efb623bb71f13ba3aee464ab907cb7257bf8ae9e3adc0817d9ce\": container with ID starting with 0967568d5753efb623bb71f13ba3aee464ab907cb7257bf8ae9e3adc0817d9ce not found: ID does not exist" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.325057 4903 scope.go:117] "RemoveContainer" containerID="03630a66672c5422a3192b948a9c7bca483ca08da446a0a7fb3b3cd00ce20e80" Dec 02 23:19:46 crc kubenswrapper[4903]: E1202 23:19:46.325514 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03630a66672c5422a3192b948a9c7bca483ca08da446a0a7fb3b3cd00ce20e80\": container with ID starting with 03630a66672c5422a3192b948a9c7bca483ca08da446a0a7fb3b3cd00ce20e80 not found: ID does not exist" containerID="03630a66672c5422a3192b948a9c7bca483ca08da446a0a7fb3b3cd00ce20e80" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.325611 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03630a66672c5422a3192b948a9c7bca483ca08da446a0a7fb3b3cd00ce20e80"} err="failed to get container status \"03630a66672c5422a3192b948a9c7bca483ca08da446a0a7fb3b3cd00ce20e80\": rpc error: code = NotFound desc = could not find container \"03630a66672c5422a3192b948a9c7bca483ca08da446a0a7fb3b3cd00ce20e80\": container with ID starting with 03630a66672c5422a3192b948a9c7bca483ca08da446a0a7fb3b3cd00ce20e80 not found: ID does not exist" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.336399 4903 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lbl7\" (UniqueName: \"kubernetes.io/projected/59776a3d-ba94-467b-9b25-2391269821e3-kube-api-access-7lbl7\") pod \"nova-cell1-novncproxy-0\" (UID: \"59776a3d-ba94-467b-9b25-2391269821e3\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.337017 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59776a3d-ba94-467b-9b25-2391269821e3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"59776a3d-ba94-467b-9b25-2391269821e3\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.337103 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/59776a3d-ba94-467b-9b25-2391269821e3-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"59776a3d-ba94-467b-9b25-2391269821e3\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.337438 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59776a3d-ba94-467b-9b25-2391269821e3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"59776a3d-ba94-467b-9b25-2391269821e3\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.337531 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/59776a3d-ba94-467b-9b25-2391269821e3-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"59776a3d-ba94-467b-9b25-2391269821e3\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.438923 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/59776a3d-ba94-467b-9b25-2391269821e3-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"59776a3d-ba94-467b-9b25-2391269821e3\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.438993 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lbl7\" (UniqueName: \"kubernetes.io/projected/59776a3d-ba94-467b-9b25-2391269821e3-kube-api-access-7lbl7\") pod \"nova-cell1-novncproxy-0\" (UID: \"59776a3d-ba94-467b-9b25-2391269821e3\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.439072 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59776a3d-ba94-467b-9b25-2391269821e3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"59776a3d-ba94-467b-9b25-2391269821e3\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.439092 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/59776a3d-ba94-467b-9b25-2391269821e3-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"59776a3d-ba94-467b-9b25-2391269821e3\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.439139 4903 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59776a3d-ba94-467b-9b25-2391269821e3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"59776a3d-ba94-467b-9b25-2391269821e3\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.442897 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59776a3d-ba94-467b-9b25-2391269821e3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"59776a3d-ba94-467b-9b25-2391269821e3\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.443690 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59776a3d-ba94-467b-9b25-2391269821e3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"59776a3d-ba94-467b-9b25-2391269821e3\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.443914 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/59776a3d-ba94-467b-9b25-2391269821e3-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"59776a3d-ba94-467b-9b25-2391269821e3\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.444257 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/59776a3d-ba94-467b-9b25-2391269821e3-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"59776a3d-ba94-467b-9b25-2391269821e3\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.455619 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lbl7\" (UniqueName: \"kubernetes.io/projected/59776a3d-ba94-467b-9b25-2391269821e3-kube-api-access-7lbl7\") pod \"nova-cell1-novncproxy-0\" (UID: \"59776a3d-ba94-467b-9b25-2391269821e3\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.525571 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.541781 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.556706 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 02 23:19:46 crc kubenswrapper[4903]: E1202 23:19:46.557209 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6de4a117-0c91-47f4-a80d-278debb3ea60" containerName="watcher-decision-engine" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.557235 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="6de4a117-0c91-47f4-a80d-278debb3ea60" containerName="watcher-decision-engine" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.558910 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.566511 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.566768 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.569236 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.614130 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.643991 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5c694f3-bd40-44e8-b076-e0d20b840329-logs\") pod \"nova-metadata-0\" (UID: \"e5c694f3-bd40-44e8-b076-e0d20b840329\") " pod="openstack/nova-metadata-0" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.644082 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5c694f3-bd40-44e8-b076-e0d20b840329-config-data\") pod \"nova-metadata-0\" (UID: \"e5c694f3-bd40-44e8-b076-e0d20b840329\") " pod="openstack/nova-metadata-0" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.644180 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5c694f3-bd40-44e8-b076-e0d20b840329-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e5c694f3-bd40-44e8-b076-e0d20b840329\") " pod="openstack/nova-metadata-0" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.644270 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5c694f3-bd40-44e8-b076-e0d20b840329-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e5c694f3-bd40-44e8-b076-e0d20b840329\") " pod="openstack/nova-metadata-0" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.644295 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xcqr\" (UniqueName: \"kubernetes.io/projected/e5c694f3-bd40-44e8-b076-e0d20b840329-kube-api-access-4xcqr\") pod \"nova-metadata-0\" (UID: \"e5c694f3-bd40-44e8-b076-e0d20b840329\") " pod="openstack/nova-metadata-0" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.745433 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5c694f3-bd40-44e8-b076-e0d20b840329-config-data\") pod \"nova-metadata-0\" (UID: \"e5c694f3-bd40-44e8-b076-e0d20b840329\") " pod="openstack/nova-metadata-0" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.745893 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5c694f3-bd40-44e8-b076-e0d20b840329-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e5c694f3-bd40-44e8-b076-e0d20b840329\") " pod="openstack/nova-metadata-0" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.745952 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e5c694f3-bd40-44e8-b076-e0d20b840329-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e5c694f3-bd40-44e8-b076-e0d20b840329\") " pod="openstack/nova-metadata-0" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.745976 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xcqr\" (UniqueName: \"kubernetes.io/projected/e5c694f3-bd40-44e8-b076-e0d20b840329-kube-api-access-4xcqr\") pod \"nova-metadata-0\" (UID: \"e5c694f3-bd40-44e8-b076-e0d20b840329\") " pod="openstack/nova-metadata-0" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.746027 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5c694f3-bd40-44e8-b076-e0d20b840329-logs\") pod \"nova-metadata-0\" (UID: \"e5c694f3-bd40-44e8-b076-e0d20b840329\") " pod="openstack/nova-metadata-0" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.746559 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5c694f3-bd40-44e8-b076-e0d20b840329-logs\") pod \"nova-metadata-0\" (UID: \"e5c694f3-bd40-44e8-b076-e0d20b840329\") " pod="openstack/nova-metadata-0" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.752186 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5c694f3-bd40-44e8-b076-e0d20b840329-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e5c694f3-bd40-44e8-b076-e0d20b840329\") " pod="openstack/nova-metadata-0" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.754857 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5c694f3-bd40-44e8-b076-e0d20b840329-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e5c694f3-bd40-44e8-b076-e0d20b840329\") " pod="openstack/nova-metadata-0" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.755382 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5c694f3-bd40-44e8-b076-e0d20b840329-config-data\") pod \"nova-metadata-0\" (UID: \"e5c694f3-bd40-44e8-b076-e0d20b840329\") " pod="openstack/nova-metadata-0" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.772504 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xcqr\" (UniqueName: \"kubernetes.io/projected/e5c694f3-bd40-44e8-b076-e0d20b840329-kube-api-access-4xcqr\") pod \"nova-metadata-0\" (UID: \"e5c694f3-bd40-44e8-b076-e0d20b840329\") " pod="openstack/nova-metadata-0" Dec 02 23:19:46 crc kubenswrapper[4903]: I1202 23:19:46.882991 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 23:19:47 crc kubenswrapper[4903]: I1202 23:19:47.064703 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 23:19:47 crc kubenswrapper[4903]: W1202 23:19:47.067314 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59776a3d_ba94_467b_9b25_2391269821e3.slice/crio-ef0c5578632414492a659a12149d71cf111becab6789c1624067e523cb0a5277 WatchSource:0}: Error finding container ef0c5578632414492a659a12149d71cf111becab6789c1624067e523cb0a5277: Status 404 returned error can't find the container with id ef0c5578632414492a659a12149d71cf111becab6789c1624067e523cb0a5277 Dec 02 23:19:47 crc kubenswrapper[4903]: I1202 23:19:47.201474 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"59776a3d-ba94-467b-9b25-2391269821e3","Type":"ContainerStarted","Data":"ef0c5578632414492a659a12149d71cf111becab6789c1624067e523cb0a5277"} Dec 02 23:19:47 crc kubenswrapper[4903]: I1202 23:19:47.294801 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 02 23:19:47 crc kubenswrapper[4903]: I1202 23:19:47.295488 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 02 23:19:47 crc kubenswrapper[4903]: I1202 23:19:47.297303 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 02 23:19:47 crc kubenswrapper[4903]: I1202 23:19:47.304167 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 02 23:19:47 crc kubenswrapper[4903]: W1202 23:19:47.345688 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5c694f3_bd40_44e8_b076_e0d20b840329.slice/crio-9df882503105b9bbe47404bef3fe35b1b88889529fbf655f359cef12b4d38f32 WatchSource:0}: Error finding container 9df882503105b9bbe47404bef3fe35b1b88889529fbf655f359cef12b4d38f32: Status 404 returned error can't find the container with id 9df882503105b9bbe47404bef3fe35b1b88889529fbf655f359cef12b4d38f32 Dec 02 23:19:47 crc kubenswrapper[4903]: I1202 23:19:47.347424 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 23:19:47 crc kubenswrapper[4903]: I1202 23:19:47.623032 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="028f476a-ab2c-424d-aeda-70903e28d17d" path="/var/lib/kubelet/pods/028f476a-ab2c-424d-aeda-70903e28d17d/volumes" Dec 02 23:19:47 crc kubenswrapper[4903]: I1202 23:19:47.623601 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d896453-a1aa-4cfe-aa9b-03de9d6eb83e" path="/var/lib/kubelet/pods/2d896453-a1aa-4cfe-aa9b-03de9d6eb83e/volumes" Dec 02 23:19:48 crc kubenswrapper[4903]: I1202 23:19:48.219338 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e5c694f3-bd40-44e8-b076-e0d20b840329","Type":"ContainerStarted","Data":"f5774cd37f0d7205107b096b229ca80e9183435ddb77d3f1dc25f6fb16805bbc"} Dec 02 23:19:48 crc kubenswrapper[4903]: I1202 23:19:48.219821 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e5c694f3-bd40-44e8-b076-e0d20b840329","Type":"ContainerStarted","Data":"0ef0e434eba62ba0e8bd98a8131d401532fe92d72365db80170265e16912a287"} Dec 02 23:19:48 crc kubenswrapper[4903]: 
I1202 23:19:48.219861 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e5c694f3-bd40-44e8-b076-e0d20b840329","Type":"ContainerStarted","Data":"9df882503105b9bbe47404bef3fe35b1b88889529fbf655f359cef12b4d38f32"} Dec 02 23:19:48 crc kubenswrapper[4903]: I1202 23:19:48.224029 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"59776a3d-ba94-467b-9b25-2391269821e3","Type":"ContainerStarted","Data":"732c3fc4bf4422ef19f4764d11e75a66fa2b7edc85c305ba5c9d2f914b754fda"} Dec 02 23:19:48 crc kubenswrapper[4903]: I1202 23:19:48.224300 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 02 23:19:48 crc kubenswrapper[4903]: I1202 23:19:48.231297 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 02 23:19:48 crc kubenswrapper[4903]: I1202 23:19:48.252284 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.252259869 podStartE2EDuration="2.252259869s" podCreationTimestamp="2025-12-02 23:19:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:19:48.247817624 +0000 UTC m=+1326.956371937" watchObservedRunningTime="2025-12-02 23:19:48.252259869 +0000 UTC m=+1326.960814182" Dec 02 23:19:48 crc kubenswrapper[4903]: I1202 23:19:48.273553 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.273528621 podStartE2EDuration="2.273528621s" podCreationTimestamp="2025-12-02 23:19:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:19:48.265414088 +0000 UTC m=+1326.973968381" watchObservedRunningTime="2025-12-02 23:19:48.273528621 +0000 UTC m=+1326.982082904" Dec 02 23:19:48 crc kubenswrapper[4903]: I1202 23:19:48.492577 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54f6dc465-fs5zz"] Dec 02 23:19:48 crc kubenswrapper[4903]: I1202 23:19:48.494293 4903 util.go:30] "No sandbox for pod can be found. 
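
The pod_startup_latency_tracker entries above record podStartE2EDuration as a quoted Go duration string, so startup latencies can be pulled straight out of the journal and aggregated. A minimal sketch, assuming the key=value layout shown in these lines:

package main

import (
	"fmt"
	"regexp"
	"time"
)

func main() {
	line := `podStartSLOduration=2.252259869 podStartE2EDuration="2.252259869s"`
	re := regexp.MustCompile(`podStartE2EDuration="([^"]+)"`)
	m := re.FindStringSubmatch(line)
	if m == nil {
		return
	}
	d, err := time.ParseDuration(m[1]) // "2.252259869s" parses as a Go duration
	if err != nil {
		fmt.Println(err)
		return
	}
	fmt.Printf("pod became ready %v after creation\n", d.Round(time.Millisecond))
}
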
Need to start a new one" pod="openstack/dnsmasq-dns-54f6dc465-fs5zz" Dec 02 23:19:48 crc kubenswrapper[4903]: I1202 23:19:48.506447 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54f6dc465-fs5zz"] Dec 02 23:19:48 crc kubenswrapper[4903]: I1202 23:19:48.695278 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/066404b9-1803-4886-95b3-b5d9f850f388-ovsdbserver-sb\") pod \"dnsmasq-dns-54f6dc465-fs5zz\" (UID: \"066404b9-1803-4886-95b3-b5d9f850f388\") " pod="openstack/dnsmasq-dns-54f6dc465-fs5zz" Dec 02 23:19:48 crc kubenswrapper[4903]: I1202 23:19:48.695511 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/066404b9-1803-4886-95b3-b5d9f850f388-dns-svc\") pod \"dnsmasq-dns-54f6dc465-fs5zz\" (UID: \"066404b9-1803-4886-95b3-b5d9f850f388\") " pod="openstack/dnsmasq-dns-54f6dc465-fs5zz" Dec 02 23:19:48 crc kubenswrapper[4903]: I1202 23:19:48.695811 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/066404b9-1803-4886-95b3-b5d9f850f388-dns-swift-storage-0\") pod \"dnsmasq-dns-54f6dc465-fs5zz\" (UID: \"066404b9-1803-4886-95b3-b5d9f850f388\") " pod="openstack/dnsmasq-dns-54f6dc465-fs5zz" Dec 02 23:19:48 crc kubenswrapper[4903]: I1202 23:19:48.695876 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/066404b9-1803-4886-95b3-b5d9f850f388-ovsdbserver-nb\") pod \"dnsmasq-dns-54f6dc465-fs5zz\" (UID: \"066404b9-1803-4886-95b3-b5d9f850f388\") " pod="openstack/dnsmasq-dns-54f6dc465-fs5zz" Dec 02 23:19:48 crc kubenswrapper[4903]: I1202 23:19:48.695940 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/066404b9-1803-4886-95b3-b5d9f850f388-config\") pod \"dnsmasq-dns-54f6dc465-fs5zz\" (UID: \"066404b9-1803-4886-95b3-b5d9f850f388\") " pod="openstack/dnsmasq-dns-54f6dc465-fs5zz" Dec 02 23:19:48 crc kubenswrapper[4903]: I1202 23:19:48.696046 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrv2r\" (UniqueName: \"kubernetes.io/projected/066404b9-1803-4886-95b3-b5d9f850f388-kube-api-access-nrv2r\") pod \"dnsmasq-dns-54f6dc465-fs5zz\" (UID: \"066404b9-1803-4886-95b3-b5d9f850f388\") " pod="openstack/dnsmasq-dns-54f6dc465-fs5zz" Dec 02 23:19:48 crc kubenswrapper[4903]: I1202 23:19:48.798021 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrv2r\" (UniqueName: \"kubernetes.io/projected/066404b9-1803-4886-95b3-b5d9f850f388-kube-api-access-nrv2r\") pod \"dnsmasq-dns-54f6dc465-fs5zz\" (UID: \"066404b9-1803-4886-95b3-b5d9f850f388\") " pod="openstack/dnsmasq-dns-54f6dc465-fs5zz" Dec 02 23:19:48 crc kubenswrapper[4903]: I1202 23:19:48.799125 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/066404b9-1803-4886-95b3-b5d9f850f388-ovsdbserver-sb\") pod \"dnsmasq-dns-54f6dc465-fs5zz\" (UID: \"066404b9-1803-4886-95b3-b5d9f850f388\") " pod="openstack/dnsmasq-dns-54f6dc465-fs5zz" Dec 02 23:19:48 crc kubenswrapper[4903]: I1202 23:19:48.800193 4903 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/066404b9-1803-4886-95b3-b5d9f850f388-dns-svc\") pod \"dnsmasq-dns-54f6dc465-fs5zz\" (UID: \"066404b9-1803-4886-95b3-b5d9f850f388\") " pod="openstack/dnsmasq-dns-54f6dc465-fs5zz" Dec 02 23:19:48 crc kubenswrapper[4903]: I1202 23:19:48.800803 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/066404b9-1803-4886-95b3-b5d9f850f388-dns-swift-storage-0\") pod \"dnsmasq-dns-54f6dc465-fs5zz\" (UID: \"066404b9-1803-4886-95b3-b5d9f850f388\") " pod="openstack/dnsmasq-dns-54f6dc465-fs5zz" Dec 02 23:19:48 crc kubenswrapper[4903]: I1202 23:19:48.800140 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/066404b9-1803-4886-95b3-b5d9f850f388-ovsdbserver-sb\") pod \"dnsmasq-dns-54f6dc465-fs5zz\" (UID: \"066404b9-1803-4886-95b3-b5d9f850f388\") " pod="openstack/dnsmasq-dns-54f6dc465-fs5zz" Dec 02 23:19:48 crc kubenswrapper[4903]: I1202 23:19:48.800950 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/066404b9-1803-4886-95b3-b5d9f850f388-ovsdbserver-nb\") pod \"dnsmasq-dns-54f6dc465-fs5zz\" (UID: \"066404b9-1803-4886-95b3-b5d9f850f388\") " pod="openstack/dnsmasq-dns-54f6dc465-fs5zz" Dec 02 23:19:48 crc kubenswrapper[4903]: I1202 23:19:48.801044 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/066404b9-1803-4886-95b3-b5d9f850f388-config\") pod \"dnsmasq-dns-54f6dc465-fs5zz\" (UID: \"066404b9-1803-4886-95b3-b5d9f850f388\") " pod="openstack/dnsmasq-dns-54f6dc465-fs5zz" Dec 02 23:19:48 crc kubenswrapper[4903]: I1202 23:19:48.800813 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/066404b9-1803-4886-95b3-b5d9f850f388-dns-svc\") pod \"dnsmasq-dns-54f6dc465-fs5zz\" (UID: \"066404b9-1803-4886-95b3-b5d9f850f388\") " pod="openstack/dnsmasq-dns-54f6dc465-fs5zz" Dec 02 23:19:48 crc kubenswrapper[4903]: I1202 23:19:48.801479 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/066404b9-1803-4886-95b3-b5d9f850f388-dns-swift-storage-0\") pod \"dnsmasq-dns-54f6dc465-fs5zz\" (UID: \"066404b9-1803-4886-95b3-b5d9f850f388\") " pod="openstack/dnsmasq-dns-54f6dc465-fs5zz" Dec 02 23:19:48 crc kubenswrapper[4903]: I1202 23:19:48.801739 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/066404b9-1803-4886-95b3-b5d9f850f388-ovsdbserver-nb\") pod \"dnsmasq-dns-54f6dc465-fs5zz\" (UID: \"066404b9-1803-4886-95b3-b5d9f850f388\") " pod="openstack/dnsmasq-dns-54f6dc465-fs5zz" Dec 02 23:19:48 crc kubenswrapper[4903]: I1202 23:19:48.802275 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/066404b9-1803-4886-95b3-b5d9f850f388-config\") pod \"dnsmasq-dns-54f6dc465-fs5zz\" (UID: \"066404b9-1803-4886-95b3-b5d9f850f388\") " pod="openstack/dnsmasq-dns-54f6dc465-fs5zz" Dec 02 23:19:48 crc kubenswrapper[4903]: I1202 23:19:48.819402 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrv2r\" (UniqueName: 
\"kubernetes.io/projected/066404b9-1803-4886-95b3-b5d9f850f388-kube-api-access-nrv2r\") pod \"dnsmasq-dns-54f6dc465-fs5zz\" (UID: \"066404b9-1803-4886-95b3-b5d9f850f388\") " pod="openstack/dnsmasq-dns-54f6dc465-fs5zz" Dec 02 23:19:49 crc kubenswrapper[4903]: I1202 23:19:49.115722 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f6dc465-fs5zz" Dec 02 23:19:49 crc kubenswrapper[4903]: I1202 23:19:49.648301 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54f6dc465-fs5zz"] Dec 02 23:19:50 crc kubenswrapper[4903]: I1202 23:19:50.283364 4903 generic.go:334] "Generic (PLEG): container finished" podID="066404b9-1803-4886-95b3-b5d9f850f388" containerID="1a7034f44732d5fd714f74f5f9296c58735a0e9b57bdbda254649a10545aea56" exitCode=0 Dec 02 23:19:50 crc kubenswrapper[4903]: I1202 23:19:50.284583 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f6dc465-fs5zz" event={"ID":"066404b9-1803-4886-95b3-b5d9f850f388","Type":"ContainerDied","Data":"1a7034f44732d5fd714f74f5f9296c58735a0e9b57bdbda254649a10545aea56"} Dec 02 23:19:50 crc kubenswrapper[4903]: I1202 23:19:50.284622 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f6dc465-fs5zz" event={"ID":"066404b9-1803-4886-95b3-b5d9f850f388","Type":"ContainerStarted","Data":"8aef4671ff4233acaf75334c1a83a05c577874afa973c3e0c78e4eb1f177dafe"} Dec 02 23:19:50 crc kubenswrapper[4903]: I1202 23:19:50.836475 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 23:19:50 crc kubenswrapper[4903]: I1202 23:19:50.971993 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:19:50 crc kubenswrapper[4903]: I1202 23:19:50.972287 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9c927ae4-891c-442d-8aab-d04ae025dc57" containerName="ceilometer-central-agent" containerID="cri-o://136b4132531bbed69fd27f195113f8c40a08026447ce5fc23cbfa6d7cad84e03" gracePeriod=30 Dec 02 23:19:50 crc kubenswrapper[4903]: I1202 23:19:50.972402 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9c927ae4-891c-442d-8aab-d04ae025dc57" containerName="proxy-httpd" containerID="cri-o://39799a47c43f342a80ba54c49df78b1316f477952a28bcb11161c5ecb996c541" gracePeriod=30 Dec 02 23:19:50 crc kubenswrapper[4903]: I1202 23:19:50.972450 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9c927ae4-891c-442d-8aab-d04ae025dc57" containerName="sg-core" containerID="cri-o://e1ca970452c8cabb1d309d39879d7d7b5e54a2c7bdb1068231e23bf1771b321e" gracePeriod=30 Dec 02 23:19:50 crc kubenswrapper[4903]: I1202 23:19:50.972484 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9c927ae4-891c-442d-8aab-d04ae025dc57" containerName="ceilometer-notification-agent" containerID="cri-o://1c45e13c204868184c4147582782723d0539383f7e1969913b4e4ae26849f05f" gracePeriod=30 Dec 02 23:19:50 crc kubenswrapper[4903]: I1202 23:19:50.982792 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="9c927ae4-891c-442d-8aab-d04ae025dc57" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.210:3000/\": EOF" Dec 02 23:19:51 crc kubenswrapper[4903]: I1202 23:19:51.295797 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-54f6dc465-fs5zz" event={"ID":"066404b9-1803-4886-95b3-b5d9f850f388","Type":"ContainerStarted","Data":"5755bfa8a05b4b545eff64c9ccb2cb6e7aac7554045573e02022585dd9b31957"} Dec 02 23:19:51 crc kubenswrapper[4903]: I1202 23:19:51.295944 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54f6dc465-fs5zz" Dec 02 23:19:51 crc kubenswrapper[4903]: I1202 23:19:51.299736 4903 generic.go:334] "Generic (PLEG): container finished" podID="9c927ae4-891c-442d-8aab-d04ae025dc57" containerID="39799a47c43f342a80ba54c49df78b1316f477952a28bcb11161c5ecb996c541" exitCode=0 Dec 02 23:19:51 crc kubenswrapper[4903]: I1202 23:19:51.299794 4903 generic.go:334] "Generic (PLEG): container finished" podID="9c927ae4-891c-442d-8aab-d04ae025dc57" containerID="e1ca970452c8cabb1d309d39879d7d7b5e54a2c7bdb1068231e23bf1771b321e" exitCode=2 Dec 02 23:19:51 crc kubenswrapper[4903]: I1202 23:19:51.299829 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c927ae4-891c-442d-8aab-d04ae025dc57","Type":"ContainerDied","Data":"39799a47c43f342a80ba54c49df78b1316f477952a28bcb11161c5ecb996c541"} Dec 02 23:19:51 crc kubenswrapper[4903]: I1202 23:19:51.299871 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c927ae4-891c-442d-8aab-d04ae025dc57","Type":"ContainerDied","Data":"e1ca970452c8cabb1d309d39879d7d7b5e54a2c7bdb1068231e23bf1771b321e"} Dec 02 23:19:51 crc kubenswrapper[4903]: I1202 23:19:51.300056 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9245f806-00cf-493f-a896-9f494b3e27a5" containerName="nova-api-log" containerID="cri-o://f4231a3a80bdcc8109af38480c1ee25046828e36a9ec49ede0d5a3e380712a8a" gracePeriod=30 Dec 02 23:19:51 crc kubenswrapper[4903]: I1202 23:19:51.300082 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9245f806-00cf-493f-a896-9f494b3e27a5" containerName="nova-api-api" containerID="cri-o://f222edebfa344daa86bd8271ba2652c57ea88e910f88deaebda967c057b0df94" gracePeriod=30 Dec 02 23:19:51 crc kubenswrapper[4903]: I1202 23:19:51.331308 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54f6dc465-fs5zz" podStartSLOduration=3.331284437 podStartE2EDuration="3.331284437s" podCreationTimestamp="2025-12-02 23:19:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:19:51.314337827 +0000 UTC m=+1330.022892140" watchObservedRunningTime="2025-12-02 23:19:51.331284437 +0000 UTC m=+1330.039838740" Dec 02 23:19:51 crc kubenswrapper[4903]: I1202 23:19:51.626178 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:19:51 crc kubenswrapper[4903]: I1202 23:19:51.883513 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 23:19:51 crc kubenswrapper[4903]: I1202 23:19:51.883569 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 23:19:52 crc kubenswrapper[4903]: I1202 23:19:52.311745 4903 generic.go:334] "Generic (PLEG): container finished" podID="9245f806-00cf-493f-a896-9f494b3e27a5" containerID="f222edebfa344daa86bd8271ba2652c57ea88e910f88deaebda967c057b0df94" exitCode=0 Dec 02 23:19:52 crc kubenswrapper[4903]: I1202 23:19:52.311783 4903 
generic.go:334] "Generic (PLEG): container finished" podID="9245f806-00cf-493f-a896-9f494b3e27a5" containerID="f4231a3a80bdcc8109af38480c1ee25046828e36a9ec49ede0d5a3e380712a8a" exitCode=143 Dec 02 23:19:52 crc kubenswrapper[4903]: I1202 23:19:52.311848 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9245f806-00cf-493f-a896-9f494b3e27a5","Type":"ContainerDied","Data":"f222edebfa344daa86bd8271ba2652c57ea88e910f88deaebda967c057b0df94"} Dec 02 23:19:52 crc kubenswrapper[4903]: I1202 23:19:52.311877 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9245f806-00cf-493f-a896-9f494b3e27a5","Type":"ContainerDied","Data":"f4231a3a80bdcc8109af38480c1ee25046828e36a9ec49ede0d5a3e380712a8a"} Dec 02 23:19:52 crc kubenswrapper[4903]: I1202 23:19:52.314929 4903 generic.go:334] "Generic (PLEG): container finished" podID="9c927ae4-891c-442d-8aab-d04ae025dc57" containerID="136b4132531bbed69fd27f195113f8c40a08026447ce5fc23cbfa6d7cad84e03" exitCode=0 Dec 02 23:19:52 crc kubenswrapper[4903]: I1202 23:19:52.315894 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c927ae4-891c-442d-8aab-d04ae025dc57","Type":"ContainerDied","Data":"136b4132531bbed69fd27f195113f8c40a08026447ce5fc23cbfa6d7cad84e03"} Dec 02 23:19:52 crc kubenswrapper[4903]: I1202 23:19:52.651638 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 23:19:52 crc kubenswrapper[4903]: I1202 23:19:52.682453 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skkjs\" (UniqueName: \"kubernetes.io/projected/9245f806-00cf-493f-a896-9f494b3e27a5-kube-api-access-skkjs\") pod \"9245f806-00cf-493f-a896-9f494b3e27a5\" (UID: \"9245f806-00cf-493f-a896-9f494b3e27a5\") " Dec 02 23:19:52 crc kubenswrapper[4903]: I1202 23:19:52.682503 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9245f806-00cf-493f-a896-9f494b3e27a5-logs\") pod \"9245f806-00cf-493f-a896-9f494b3e27a5\" (UID: \"9245f806-00cf-493f-a896-9f494b3e27a5\") " Dec 02 23:19:52 crc kubenswrapper[4903]: I1202 23:19:52.682522 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9245f806-00cf-493f-a896-9f494b3e27a5-combined-ca-bundle\") pod \"9245f806-00cf-493f-a896-9f494b3e27a5\" (UID: \"9245f806-00cf-493f-a896-9f494b3e27a5\") " Dec 02 23:19:52 crc kubenswrapper[4903]: I1202 23:19:52.682549 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9245f806-00cf-493f-a896-9f494b3e27a5-config-data\") pod \"9245f806-00cf-493f-a896-9f494b3e27a5\" (UID: \"9245f806-00cf-493f-a896-9f494b3e27a5\") " Dec 02 23:19:52 crc kubenswrapper[4903]: I1202 23:19:52.685063 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9245f806-00cf-493f-a896-9f494b3e27a5-logs" (OuterVolumeSpecName: "logs") pod "9245f806-00cf-493f-a896-9f494b3e27a5" (UID: "9245f806-00cf-493f-a896-9f494b3e27a5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:19:52 crc kubenswrapper[4903]: I1202 23:19:52.690181 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9245f806-00cf-493f-a896-9f494b3e27a5-kube-api-access-skkjs" (OuterVolumeSpecName: "kube-api-access-skkjs") pod "9245f806-00cf-493f-a896-9f494b3e27a5" (UID: "9245f806-00cf-493f-a896-9f494b3e27a5"). InnerVolumeSpecName "kube-api-access-skkjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:19:52 crc kubenswrapper[4903]: I1202 23:19:52.727811 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9245f806-00cf-493f-a896-9f494b3e27a5-config-data" (OuterVolumeSpecName: "config-data") pod "9245f806-00cf-493f-a896-9f494b3e27a5" (UID: "9245f806-00cf-493f-a896-9f494b3e27a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:19:52 crc kubenswrapper[4903]: I1202 23:19:52.759821 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9245f806-00cf-493f-a896-9f494b3e27a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9245f806-00cf-493f-a896-9f494b3e27a5" (UID: "9245f806-00cf-493f-a896-9f494b3e27a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:19:52 crc kubenswrapper[4903]: I1202 23:19:52.785364 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9245f806-00cf-493f-a896-9f494b3e27a5-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:52 crc kubenswrapper[4903]: I1202 23:19:52.785413 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skkjs\" (UniqueName: \"kubernetes.io/projected/9245f806-00cf-493f-a896-9f494b3e27a5-kube-api-access-skkjs\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:52 crc kubenswrapper[4903]: I1202 23:19:52.785425 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9245f806-00cf-493f-a896-9f494b3e27a5-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:52 crc kubenswrapper[4903]: I1202 23:19:52.785434 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9245f806-00cf-493f-a896-9f494b3e27a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:53 crc kubenswrapper[4903]: I1202 23:19:53.070442 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:19:53 crc kubenswrapper[4903]: I1202 23:19:53.070554 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:19:53 crc kubenswrapper[4903]: I1202 23:19:53.330669 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9245f806-00cf-493f-a896-9f494b3e27a5","Type":"ContainerDied","Data":"861e78f6bbea14862685faee8c36370b9639fcd2647d1e2d22b6793651ec032b"} Dec 02 23:19:53 crc kubenswrapper[4903]: I1202 
23:19:53.331159 4903 scope.go:117] "RemoveContainer" containerID="f222edebfa344daa86bd8271ba2652c57ea88e910f88deaebda967c057b0df94" Dec 02 23:19:53 crc kubenswrapper[4903]: I1202 23:19:53.330813 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 23:19:53 crc kubenswrapper[4903]: I1202 23:19:53.365976 4903 scope.go:117] "RemoveContainer" containerID="f4231a3a80bdcc8109af38480c1ee25046828e36a9ec49ede0d5a3e380712a8a" Dec 02 23:19:53 crc kubenswrapper[4903]: I1202 23:19:53.386779 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 23:19:53 crc kubenswrapper[4903]: I1202 23:19:53.400206 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 02 23:19:53 crc kubenswrapper[4903]: I1202 23:19:53.423234 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 02 23:19:53 crc kubenswrapper[4903]: E1202 23:19:53.424407 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9245f806-00cf-493f-a896-9f494b3e27a5" containerName="nova-api-log" Dec 02 23:19:53 crc kubenswrapper[4903]: I1202 23:19:53.424429 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="9245f806-00cf-493f-a896-9f494b3e27a5" containerName="nova-api-log" Dec 02 23:19:53 crc kubenswrapper[4903]: E1202 23:19:53.424495 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9245f806-00cf-493f-a896-9f494b3e27a5" containerName="nova-api-api" Dec 02 23:19:53 crc kubenswrapper[4903]: I1202 23:19:53.424502 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="9245f806-00cf-493f-a896-9f494b3e27a5" containerName="nova-api-api" Dec 02 23:19:53 crc kubenswrapper[4903]: I1202 23:19:53.424782 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="9245f806-00cf-493f-a896-9f494b3e27a5" containerName="nova-api-api" Dec 02 23:19:53 crc kubenswrapper[4903]: I1202 23:19:53.424803 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="9245f806-00cf-493f-a896-9f494b3e27a5" containerName="nova-api-log" Dec 02 23:19:53 crc kubenswrapper[4903]: I1202 23:19:53.426098 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 23:19:53 crc kubenswrapper[4903]: I1202 23:19:53.428074 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 02 23:19:53 crc kubenswrapper[4903]: I1202 23:19:53.428581 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 02 23:19:53 crc kubenswrapper[4903]: I1202 23:19:53.433453 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 23:19:53 crc kubenswrapper[4903]: I1202 23:19:53.433675 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 02 23:19:53 crc kubenswrapper[4903]: I1202 23:19:53.599029 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c14f1d8c-e6c3-450a-a1ad-d580a33db0a4-public-tls-certs\") pod \"nova-api-0\" (UID: \"c14f1d8c-e6c3-450a-a1ad-d580a33db0a4\") " pod="openstack/nova-api-0" Dec 02 23:19:53 crc kubenswrapper[4903]: I1202 23:19:53.599124 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c14f1d8c-e6c3-450a-a1ad-d580a33db0a4-config-data\") pod \"nova-api-0\" (UID: \"c14f1d8c-e6c3-450a-a1ad-d580a33db0a4\") " pod="openstack/nova-api-0" Dec 02 23:19:53 crc kubenswrapper[4903]: I1202 23:19:53.599151 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c14f1d8c-e6c3-450a-a1ad-d580a33db0a4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c14f1d8c-e6c3-450a-a1ad-d580a33db0a4\") " pod="openstack/nova-api-0" Dec 02 23:19:53 crc kubenswrapper[4903]: I1202 23:19:53.599181 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c14f1d8c-e6c3-450a-a1ad-d580a33db0a4-logs\") pod \"nova-api-0\" (UID: \"c14f1d8c-e6c3-450a-a1ad-d580a33db0a4\") " pod="openstack/nova-api-0" Dec 02 23:19:53 crc kubenswrapper[4903]: I1202 23:19:53.599309 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c14f1d8c-e6c3-450a-a1ad-d580a33db0a4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c14f1d8c-e6c3-450a-a1ad-d580a33db0a4\") " pod="openstack/nova-api-0" Dec 02 23:19:53 crc kubenswrapper[4903]: I1202 23:19:53.599351 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgbfn\" (UniqueName: \"kubernetes.io/projected/c14f1d8c-e6c3-450a-a1ad-d580a33db0a4-kube-api-access-qgbfn\") pod \"nova-api-0\" (UID: \"c14f1d8c-e6c3-450a-a1ad-d580a33db0a4\") " pod="openstack/nova-api-0" Dec 02 23:19:53 crc kubenswrapper[4903]: I1202 23:19:53.622789 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9245f806-00cf-493f-a896-9f494b3e27a5" path="/var/lib/kubelet/pods/9245f806-00cf-493f-a896-9f494b3e27a5/volumes" Dec 02 23:19:53 crc kubenswrapper[4903]: I1202 23:19:53.701089 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c14f1d8c-e6c3-450a-a1ad-d580a33db0a4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c14f1d8c-e6c3-450a-a1ad-d580a33db0a4\") " pod="openstack/nova-api-0" Dec 02 23:19:53 crc 
kubenswrapper[4903]: I1202 23:19:53.701149 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgbfn\" (UniqueName: \"kubernetes.io/projected/c14f1d8c-e6c3-450a-a1ad-d580a33db0a4-kube-api-access-qgbfn\") pod \"nova-api-0\" (UID: \"c14f1d8c-e6c3-450a-a1ad-d580a33db0a4\") " pod="openstack/nova-api-0" Dec 02 23:19:53 crc kubenswrapper[4903]: I1202 23:19:53.701209 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c14f1d8c-e6c3-450a-a1ad-d580a33db0a4-public-tls-certs\") pod \"nova-api-0\" (UID: \"c14f1d8c-e6c3-450a-a1ad-d580a33db0a4\") " pod="openstack/nova-api-0" Dec 02 23:19:53 crc kubenswrapper[4903]: I1202 23:19:53.701271 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c14f1d8c-e6c3-450a-a1ad-d580a33db0a4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c14f1d8c-e6c3-450a-a1ad-d580a33db0a4\") " pod="openstack/nova-api-0" Dec 02 23:19:53 crc kubenswrapper[4903]: I1202 23:19:53.701287 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c14f1d8c-e6c3-450a-a1ad-d580a33db0a4-config-data\") pod \"nova-api-0\" (UID: \"c14f1d8c-e6c3-450a-a1ad-d580a33db0a4\") " pod="openstack/nova-api-0" Dec 02 23:19:53 crc kubenswrapper[4903]: I1202 23:19:53.701309 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c14f1d8c-e6c3-450a-a1ad-d580a33db0a4-logs\") pod \"nova-api-0\" (UID: \"c14f1d8c-e6c3-450a-a1ad-d580a33db0a4\") " pod="openstack/nova-api-0" Dec 02 23:19:53 crc kubenswrapper[4903]: I1202 23:19:53.701729 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c14f1d8c-e6c3-450a-a1ad-d580a33db0a4-logs\") pod \"nova-api-0\" (UID: \"c14f1d8c-e6c3-450a-a1ad-d580a33db0a4\") " pod="openstack/nova-api-0" Dec 02 23:19:53 crc kubenswrapper[4903]: I1202 23:19:53.710198 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c14f1d8c-e6c3-450a-a1ad-d580a33db0a4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c14f1d8c-e6c3-450a-a1ad-d580a33db0a4\") " pod="openstack/nova-api-0" Dec 02 23:19:53 crc kubenswrapper[4903]: I1202 23:19:53.710336 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c14f1d8c-e6c3-450a-a1ad-d580a33db0a4-public-tls-certs\") pod \"nova-api-0\" (UID: \"c14f1d8c-e6c3-450a-a1ad-d580a33db0a4\") " pod="openstack/nova-api-0" Dec 02 23:19:53 crc kubenswrapper[4903]: I1202 23:19:53.710398 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c14f1d8c-e6c3-450a-a1ad-d580a33db0a4-config-data\") pod \"nova-api-0\" (UID: \"c14f1d8c-e6c3-450a-a1ad-d580a33db0a4\") " pod="openstack/nova-api-0" Dec 02 23:19:53 crc kubenswrapper[4903]: I1202 23:19:53.710414 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c14f1d8c-e6c3-450a-a1ad-d580a33db0a4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c14f1d8c-e6c3-450a-a1ad-d580a33db0a4\") " pod="openstack/nova-api-0" Dec 02 23:19:53 crc kubenswrapper[4903]: I1202 23:19:53.722411 4903 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qgbfn\" (UniqueName: \"kubernetes.io/projected/c14f1d8c-e6c3-450a-a1ad-d580a33db0a4-kube-api-access-qgbfn\") pod \"nova-api-0\" (UID: \"c14f1d8c-e6c3-450a-a1ad-d580a33db0a4\") " pod="openstack/nova-api-0" Dec 02 23:19:53 crc kubenswrapper[4903]: I1202 23:19:53.745379 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 23:19:54 crc kubenswrapper[4903]: I1202 23:19:54.178468 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 23:19:54 crc kubenswrapper[4903]: W1202 23:19:54.192056 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc14f1d8c_e6c3_450a_a1ad_d580a33db0a4.slice/crio-23339711baba396955d36d2d7cbfc23a2eab0441d71c927971c4c44965bcac43 WatchSource:0}: Error finding container 23339711baba396955d36d2d7cbfc23a2eab0441d71c927971c4c44965bcac43: Status 404 returned error can't find the container with id 23339711baba396955d36d2d7cbfc23a2eab0441d71c927971c4c44965bcac43 Dec 02 23:19:54 crc kubenswrapper[4903]: I1202 23:19:54.346527 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c14f1d8c-e6c3-450a-a1ad-d580a33db0a4","Type":"ContainerStarted","Data":"23339711baba396955d36d2d7cbfc23a2eab0441d71c927971c4c44965bcac43"} Dec 02 23:19:55 crc kubenswrapper[4903]: I1202 23:19:55.361136 4903 generic.go:334] "Generic (PLEG): container finished" podID="9c927ae4-891c-442d-8aab-d04ae025dc57" containerID="1c45e13c204868184c4147582782723d0539383f7e1969913b4e4ae26849f05f" exitCode=0 Dec 02 23:19:55 crc kubenswrapper[4903]: I1202 23:19:55.361193 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c927ae4-891c-442d-8aab-d04ae025dc57","Type":"ContainerDied","Data":"1c45e13c204868184c4147582782723d0539383f7e1969913b4e4ae26849f05f"} Dec 02 23:19:55 crc kubenswrapper[4903]: I1202 23:19:55.361222 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c927ae4-891c-442d-8aab-d04ae025dc57","Type":"ContainerDied","Data":"f2667cc77c484e8c24c3948c16305b62ca24f7962861a5c8681115251157b908"} Dec 02 23:19:55 crc kubenswrapper[4903]: I1202 23:19:55.361233 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2667cc77c484e8c24c3948c16305b62ca24f7962861a5c8681115251157b908" Dec 02 23:19:55 crc kubenswrapper[4903]: I1202 23:19:55.363000 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c14f1d8c-e6c3-450a-a1ad-d580a33db0a4","Type":"ContainerStarted","Data":"dde4200447ec38fad92fcb8b6bd9f1da6d111f7c4ad10fa6d73081e01ebe24d7"} Dec 02 23:19:55 crc kubenswrapper[4903]: I1202 23:19:55.363023 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c14f1d8c-e6c3-450a-a1ad-d580a33db0a4","Type":"ContainerStarted","Data":"16b461160d699337fe396bf0c0cce3050739fc9f4dc4325cfe971d624cde8c7d"} Dec 02 23:19:55 crc kubenswrapper[4903]: I1202 23:19:55.381233 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:19:55 crc kubenswrapper[4903]: I1202 23:19:55.388473 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.388454233 podStartE2EDuration="2.388454233s" podCreationTimestamp="2025-12-02 23:19:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:19:55.380523135 +0000 UTC m=+1334.089077418" watchObservedRunningTime="2025-12-02 23:19:55.388454233 +0000 UTC m=+1334.097008516" Dec 02 23:19:55 crc kubenswrapper[4903]: I1202 23:19:55.551803 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c927ae4-891c-442d-8aab-d04ae025dc57-log-httpd\") pod \"9c927ae4-891c-442d-8aab-d04ae025dc57\" (UID: \"9c927ae4-891c-442d-8aab-d04ae025dc57\") " Dec 02 23:19:55 crc kubenswrapper[4903]: I1202 23:19:55.552172 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9c927ae4-891c-442d-8aab-d04ae025dc57-sg-core-conf-yaml\") pod \"9c927ae4-891c-442d-8aab-d04ae025dc57\" (UID: \"9c927ae4-891c-442d-8aab-d04ae025dc57\") " Dec 02 23:19:55 crc kubenswrapper[4903]: I1202 23:19:55.552205 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c927ae4-891c-442d-8aab-d04ae025dc57-combined-ca-bundle\") pod \"9c927ae4-891c-442d-8aab-d04ae025dc57\" (UID: \"9c927ae4-891c-442d-8aab-d04ae025dc57\") " Dec 02 23:19:55 crc kubenswrapper[4903]: I1202 23:19:55.552305 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c927ae4-891c-442d-8aab-d04ae025dc57-config-data\") pod \"9c927ae4-891c-442d-8aab-d04ae025dc57\" (UID: \"9c927ae4-891c-442d-8aab-d04ae025dc57\") " Dec 02 23:19:55 crc kubenswrapper[4903]: I1202 23:19:55.552425 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c927ae4-891c-442d-8aab-d04ae025dc57-ceilometer-tls-certs\") pod \"9c927ae4-891c-442d-8aab-d04ae025dc57\" (UID: \"9c927ae4-891c-442d-8aab-d04ae025dc57\") " Dec 02 23:19:55 crc kubenswrapper[4903]: I1202 23:19:55.552505 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c927ae4-891c-442d-8aab-d04ae025dc57-run-httpd\") pod \"9c927ae4-891c-442d-8aab-d04ae025dc57\" (UID: \"9c927ae4-891c-442d-8aab-d04ae025dc57\") " Dec 02 23:19:55 crc kubenswrapper[4903]: I1202 23:19:55.552528 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67j7j\" (UniqueName: \"kubernetes.io/projected/9c927ae4-891c-442d-8aab-d04ae025dc57-kube-api-access-67j7j\") pod \"9c927ae4-891c-442d-8aab-d04ae025dc57\" (UID: \"9c927ae4-891c-442d-8aab-d04ae025dc57\") " Dec 02 23:19:55 crc kubenswrapper[4903]: I1202 23:19:55.552559 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c927ae4-891c-442d-8aab-d04ae025dc57-scripts\") pod \"9c927ae4-891c-442d-8aab-d04ae025dc57\" (UID: \"9c927ae4-891c-442d-8aab-d04ae025dc57\") " Dec 02 23:19:55 crc kubenswrapper[4903]: I1202 23:19:55.554223 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/empty-dir/9c927ae4-891c-442d-8aab-d04ae025dc57-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9c927ae4-891c-442d-8aab-d04ae025dc57" (UID: "9c927ae4-891c-442d-8aab-d04ae025dc57"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:19:55 crc kubenswrapper[4903]: I1202 23:19:55.554756 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c927ae4-891c-442d-8aab-d04ae025dc57-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9c927ae4-891c-442d-8aab-d04ae025dc57" (UID: "9c927ae4-891c-442d-8aab-d04ae025dc57"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:19:55 crc kubenswrapper[4903]: I1202 23:19:55.560798 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c927ae4-891c-442d-8aab-d04ae025dc57-scripts" (OuterVolumeSpecName: "scripts") pod "9c927ae4-891c-442d-8aab-d04ae025dc57" (UID: "9c927ae4-891c-442d-8aab-d04ae025dc57"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:19:55 crc kubenswrapper[4903]: I1202 23:19:55.574993 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c927ae4-891c-442d-8aab-d04ae025dc57-kube-api-access-67j7j" (OuterVolumeSpecName: "kube-api-access-67j7j") pod "9c927ae4-891c-442d-8aab-d04ae025dc57" (UID: "9c927ae4-891c-442d-8aab-d04ae025dc57"). InnerVolumeSpecName "kube-api-access-67j7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:19:55 crc kubenswrapper[4903]: I1202 23:19:55.589966 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c927ae4-891c-442d-8aab-d04ae025dc57-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9c927ae4-891c-442d-8aab-d04ae025dc57" (UID: "9c927ae4-891c-442d-8aab-d04ae025dc57"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:19:55 crc kubenswrapper[4903]: I1202 23:19:55.656212 4903 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c927ae4-891c-442d-8aab-d04ae025dc57-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:55 crc kubenswrapper[4903]: I1202 23:19:55.656273 4903 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9c927ae4-891c-442d-8aab-d04ae025dc57-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:55 crc kubenswrapper[4903]: I1202 23:19:55.656289 4903 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c927ae4-891c-442d-8aab-d04ae025dc57-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:55 crc kubenswrapper[4903]: I1202 23:19:55.656302 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67j7j\" (UniqueName: \"kubernetes.io/projected/9c927ae4-891c-442d-8aab-d04ae025dc57-kube-api-access-67j7j\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:55 crc kubenswrapper[4903]: I1202 23:19:55.656319 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c927ae4-891c-442d-8aab-d04ae025dc57-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:55 crc kubenswrapper[4903]: I1202 23:19:55.672619 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c927ae4-891c-442d-8aab-d04ae025dc57-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "9c927ae4-891c-442d-8aab-d04ae025dc57" (UID: "9c927ae4-891c-442d-8aab-d04ae025dc57"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:19:55 crc kubenswrapper[4903]: I1202 23:19:55.699370 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c927ae4-891c-442d-8aab-d04ae025dc57-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c927ae4-891c-442d-8aab-d04ae025dc57" (UID: "9c927ae4-891c-442d-8aab-d04ae025dc57"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:19:55 crc kubenswrapper[4903]: I1202 23:19:55.700320 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c927ae4-891c-442d-8aab-d04ae025dc57-config-data" (OuterVolumeSpecName: "config-data") pod "9c927ae4-891c-442d-8aab-d04ae025dc57" (UID: "9c927ae4-891c-442d-8aab-d04ae025dc57"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:19:55 crc kubenswrapper[4903]: I1202 23:19:55.758305 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c927ae4-891c-442d-8aab-d04ae025dc57-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:55 crc kubenswrapper[4903]: I1202 23:19:55.758345 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c927ae4-891c-442d-8aab-d04ae025dc57-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:55 crc kubenswrapper[4903]: I1202 23:19:55.758357 4903 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c927ae4-891c-442d-8aab-d04ae025dc57-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.376235 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.444084 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.477989 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.494069 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:19:56 crc kubenswrapper[4903]: E1202 23:19:56.494815 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c927ae4-891c-442d-8aab-d04ae025dc57" containerName="ceilometer-notification-agent" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.494853 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c927ae4-891c-442d-8aab-d04ae025dc57" containerName="ceilometer-notification-agent" Dec 02 23:19:56 crc kubenswrapper[4903]: E1202 23:19:56.494877 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c927ae4-891c-442d-8aab-d04ae025dc57" containerName="proxy-httpd" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.494890 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c927ae4-891c-442d-8aab-d04ae025dc57" containerName="proxy-httpd" Dec 02 23:19:56 crc kubenswrapper[4903]: E1202 23:19:56.494930 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c927ae4-891c-442d-8aab-d04ae025dc57" containerName="sg-core" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.494943 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c927ae4-891c-442d-8aab-d04ae025dc57" containerName="sg-core" Dec 02 23:19:56 crc kubenswrapper[4903]: E1202 23:19:56.494985 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c927ae4-891c-442d-8aab-d04ae025dc57" containerName="ceilometer-central-agent" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.495033 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c927ae4-891c-442d-8aab-d04ae025dc57" containerName="ceilometer-central-agent" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.495491 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c927ae4-891c-442d-8aab-d04ae025dc57" containerName="proxy-httpd" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.495543 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c927ae4-891c-442d-8aab-d04ae025dc57" containerName="ceilometer-central-agent" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 
23:19:56.495571 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c927ae4-891c-442d-8aab-d04ae025dc57" containerName="ceilometer-notification-agent" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.495595 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c927ae4-891c-442d-8aab-d04ae025dc57" containerName="sg-core" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.500832 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.503537 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.504445 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.504706 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.506273 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.614415 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.642000 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.677691 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab-run-httpd\") pod \"ceilometer-0\" (UID: \"ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab\") " pod="openstack/ceilometer-0" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.677793 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab\") " pod="openstack/ceilometer-0" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.678077 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sksv\" (UniqueName: \"kubernetes.io/projected/ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab-kube-api-access-8sksv\") pod \"ceilometer-0\" (UID: \"ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab\") " pod="openstack/ceilometer-0" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.678225 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab-scripts\") pod \"ceilometer-0\" (UID: \"ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab\") " pod="openstack/ceilometer-0" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.678332 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab-log-httpd\") pod \"ceilometer-0\" (UID: \"ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab\") " pod="openstack/ceilometer-0" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.678507 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab-config-data\") pod \"ceilometer-0\" (UID: \"ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab\") " pod="openstack/ceilometer-0" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.678576 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab\") " pod="openstack/ceilometer-0" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.678734 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab\") " pod="openstack/ceilometer-0" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.780775 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab-config-data\") pod \"ceilometer-0\" (UID: \"ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab\") " pod="openstack/ceilometer-0" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.780882 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab\") " pod="openstack/ceilometer-0" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.780980 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab\") " pod="openstack/ceilometer-0" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.781183 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab-run-httpd\") pod \"ceilometer-0\" (UID: \"ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab\") " pod="openstack/ceilometer-0" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.781230 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab\") " pod="openstack/ceilometer-0" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.781389 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sksv\" (UniqueName: \"kubernetes.io/projected/ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab-kube-api-access-8sksv\") pod \"ceilometer-0\" (UID: \"ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab\") " pod="openstack/ceilometer-0" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.781452 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab-scripts\") pod \"ceilometer-0\" (UID: \"ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab\") " pod="openstack/ceilometer-0" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.781507 4903 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab-log-httpd\") pod \"ceilometer-0\" (UID: \"ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab\") " pod="openstack/ceilometer-0" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.782288 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab-log-httpd\") pod \"ceilometer-0\" (UID: \"ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab\") " pod="openstack/ceilometer-0" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.783391 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab-run-httpd\") pod \"ceilometer-0\" (UID: \"ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab\") " pod="openstack/ceilometer-0" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.789144 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab-config-data\") pod \"ceilometer-0\" (UID: \"ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab\") " pod="openstack/ceilometer-0" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.791394 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab\") " pod="openstack/ceilometer-0" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.791812 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab\") " pod="openstack/ceilometer-0" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.800342 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab-scripts\") pod \"ceilometer-0\" (UID: \"ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab\") " pod="openstack/ceilometer-0" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.804117 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab\") " pod="openstack/ceilometer-0" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.826688 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sksv\" (UniqueName: \"kubernetes.io/projected/ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab-kube-api-access-8sksv\") pod \"ceilometer-0\" (UID: \"ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab\") " pod="openstack/ceilometer-0" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.835894 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.883672 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 02 23:19:56 crc kubenswrapper[4903]: I1202 23:19:56.883949 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 02 23:19:57 crc kubenswrapper[4903]: I1202 23:19:57.349648 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 23:19:57 crc kubenswrapper[4903]: I1202 23:19:57.386408 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab","Type":"ContainerStarted","Data":"19c975427fb83bbf2bc78139b149558a646526d320537783e279fa9f0d71edae"} Dec 02 23:19:57 crc kubenswrapper[4903]: I1202 23:19:57.412568 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 02 23:19:57 crc kubenswrapper[4903]: I1202 23:19:57.560915 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-prs2t"] Dec 02 23:19:57 crc kubenswrapper[4903]: I1202 23:19:57.562435 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-prs2t" Dec 02 23:19:57 crc kubenswrapper[4903]: I1202 23:19:57.566755 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 02 23:19:57 crc kubenswrapper[4903]: I1202 23:19:57.566829 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 02 23:19:57 crc kubenswrapper[4903]: I1202 23:19:57.567734 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-prs2t"] Dec 02 23:19:57 crc kubenswrapper[4903]: I1202 23:19:57.633507 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c927ae4-891c-442d-8aab-d04ae025dc57" path="/var/lib/kubelet/pods/9c927ae4-891c-442d-8aab-d04ae025dc57/volumes" Dec 02 23:19:57 crc kubenswrapper[4903]: I1202 23:19:57.701831 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwskl\" (UniqueName: \"kubernetes.io/projected/b3ad3f2c-68c0-426e-94b6-999ea0629dcd-kube-api-access-rwskl\") pod \"nova-cell1-cell-mapping-prs2t\" (UID: \"b3ad3f2c-68c0-426e-94b6-999ea0629dcd\") " pod="openstack/nova-cell1-cell-mapping-prs2t" Dec 02 23:19:57 crc kubenswrapper[4903]: I1202 23:19:57.702066 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ad3f2c-68c0-426e-94b6-999ea0629dcd-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-prs2t\" (UID: \"b3ad3f2c-68c0-426e-94b6-999ea0629dcd\") " pod="openstack/nova-cell1-cell-mapping-prs2t" Dec 02 23:19:57 crc kubenswrapper[4903]: I1202 23:19:57.702251 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ad3f2c-68c0-426e-94b6-999ea0629dcd-config-data\") pod \"nova-cell1-cell-mapping-prs2t\" (UID: \"b3ad3f2c-68c0-426e-94b6-999ea0629dcd\") " pod="openstack/nova-cell1-cell-mapping-prs2t" Dec 02 23:19:57 crc kubenswrapper[4903]: I1202 23:19:57.702343 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b3ad3f2c-68c0-426e-94b6-999ea0629dcd-scripts\") pod \"nova-cell1-cell-mapping-prs2t\" (UID: \"b3ad3f2c-68c0-426e-94b6-999ea0629dcd\") " pod="openstack/nova-cell1-cell-mapping-prs2t" Dec 02 23:19:57 crc kubenswrapper[4903]: I1202 23:19:57.805075 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ad3f2c-68c0-426e-94b6-999ea0629dcd-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-prs2t\" (UID: \"b3ad3f2c-68c0-426e-94b6-999ea0629dcd\") " pod="openstack/nova-cell1-cell-mapping-prs2t" Dec 02 23:19:57 crc kubenswrapper[4903]: I1202 23:19:57.805111 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwskl\" (UniqueName: \"kubernetes.io/projected/b3ad3f2c-68c0-426e-94b6-999ea0629dcd-kube-api-access-rwskl\") pod \"nova-cell1-cell-mapping-prs2t\" (UID: \"b3ad3f2c-68c0-426e-94b6-999ea0629dcd\") " pod="openstack/nova-cell1-cell-mapping-prs2t" Dec 02 23:19:57 crc kubenswrapper[4903]: I1202 23:19:57.805213 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ad3f2c-68c0-426e-94b6-999ea0629dcd-config-data\") pod \"nova-cell1-cell-mapping-prs2t\" (UID: \"b3ad3f2c-68c0-426e-94b6-999ea0629dcd\") " pod="openstack/nova-cell1-cell-mapping-prs2t" Dec 02 23:19:57 crc kubenswrapper[4903]: I1202 23:19:57.805240 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3ad3f2c-68c0-426e-94b6-999ea0629dcd-scripts\") pod \"nova-cell1-cell-mapping-prs2t\" (UID: \"b3ad3f2c-68c0-426e-94b6-999ea0629dcd\") " pod="openstack/nova-cell1-cell-mapping-prs2t" Dec 02 23:19:57 crc kubenswrapper[4903]: I1202 23:19:57.813412 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3ad3f2c-68c0-426e-94b6-999ea0629dcd-scripts\") pod \"nova-cell1-cell-mapping-prs2t\" (UID: \"b3ad3f2c-68c0-426e-94b6-999ea0629dcd\") " pod="openstack/nova-cell1-cell-mapping-prs2t" Dec 02 23:19:57 crc kubenswrapper[4903]: I1202 23:19:57.817221 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ad3f2c-68c0-426e-94b6-999ea0629dcd-config-data\") pod \"nova-cell1-cell-mapping-prs2t\" (UID: \"b3ad3f2c-68c0-426e-94b6-999ea0629dcd\") " pod="openstack/nova-cell1-cell-mapping-prs2t" Dec 02 23:19:57 crc kubenswrapper[4903]: I1202 23:19:57.817378 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ad3f2c-68c0-426e-94b6-999ea0629dcd-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-prs2t\" (UID: \"b3ad3f2c-68c0-426e-94b6-999ea0629dcd\") " pod="openstack/nova-cell1-cell-mapping-prs2t" Dec 02 23:19:57 crc kubenswrapper[4903]: I1202 23:19:57.830419 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwskl\" (UniqueName: \"kubernetes.io/projected/b3ad3f2c-68c0-426e-94b6-999ea0629dcd-kube-api-access-rwskl\") pod \"nova-cell1-cell-mapping-prs2t\" (UID: \"b3ad3f2c-68c0-426e-94b6-999ea0629dcd\") " pod="openstack/nova-cell1-cell-mapping-prs2t" Dec 02 23:19:57 crc kubenswrapper[4903]: I1202 23:19:57.897982 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e5c694f3-bd40-44e8-b076-e0d20b840329" containerName="nova-metadata-log" probeResult="failure" output="Get 
\"https://10.217.0.215:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 23:19:57 crc kubenswrapper[4903]: I1202 23:19:57.898001 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e5c694f3-bd40-44e8-b076-e0d20b840329" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.215:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 23:19:57 crc kubenswrapper[4903]: I1202 23:19:57.905933 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-prs2t" Dec 02 23:19:58 crc kubenswrapper[4903]: I1202 23:19:58.401134 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab","Type":"ContainerStarted","Data":"ba2bc39a835a3be52b510bdc2510126693f676e780a244de965ac937cb2e7818"} Dec 02 23:19:58 crc kubenswrapper[4903]: I1202 23:19:58.401546 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab","Type":"ContainerStarted","Data":"aafd772da26ef9ec6a3012144980439cc8d90619f62527a2446b67a14675d110"} Dec 02 23:19:58 crc kubenswrapper[4903]: I1202 23:19:58.501474 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-prs2t"] Dec 02 23:19:59 crc kubenswrapper[4903]: I1202 23:19:59.117854 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54f6dc465-fs5zz" Dec 02 23:19:59 crc kubenswrapper[4903]: I1202 23:19:59.188322 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74bfcddf47-tkg4k"] Dec 02 23:19:59 crc kubenswrapper[4903]: I1202 23:19:59.188754 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74bfcddf47-tkg4k" podUID="aa730e48-0dda-48df-9675-d7b3fa3358d1" containerName="dnsmasq-dns" containerID="cri-o://158115072b95e7b226b9d6babda6213ce44b89d3090d21a48e08d8217cdfdbc0" gracePeriod=10 Dec 02 23:19:59 crc kubenswrapper[4903]: I1202 23:19:59.430096 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-prs2t" event={"ID":"b3ad3f2c-68c0-426e-94b6-999ea0629dcd","Type":"ContainerStarted","Data":"a8eef5c453b8d96bd3fb8a2612d283b49ee55cd45a6259efe3efae85c2dc3ab4"} Dec 02 23:19:59 crc kubenswrapper[4903]: I1202 23:19:59.430136 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-prs2t" event={"ID":"b3ad3f2c-68c0-426e-94b6-999ea0629dcd","Type":"ContainerStarted","Data":"ef61c88ff62e54e0b568162efee67ce260dc509f5ec50596faa51abaa02f4d5d"} Dec 02 23:19:59 crc kubenswrapper[4903]: I1202 23:19:59.433988 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab","Type":"ContainerStarted","Data":"8c81f363a9f09a4971ebd3c942cb19816e052def118231a1999a667835e0c0e2"} Dec 02 23:19:59 crc kubenswrapper[4903]: I1202 23:19:59.438077 4903 generic.go:334] "Generic (PLEG): container finished" podID="aa730e48-0dda-48df-9675-d7b3fa3358d1" containerID="158115072b95e7b226b9d6babda6213ce44b89d3090d21a48e08d8217cdfdbc0" exitCode=0 Dec 02 23:19:59 crc kubenswrapper[4903]: I1202 23:19:59.438106 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74bfcddf47-tkg4k" 
event={"ID":"aa730e48-0dda-48df-9675-d7b3fa3358d1","Type":"ContainerDied","Data":"158115072b95e7b226b9d6babda6213ce44b89d3090d21a48e08d8217cdfdbc0"} Dec 02 23:19:59 crc kubenswrapper[4903]: I1202 23:19:59.456961 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-prs2t" podStartSLOduration=2.45694267 podStartE2EDuration="2.45694267s" podCreationTimestamp="2025-12-02 23:19:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:19:59.448723555 +0000 UTC m=+1338.157277838" watchObservedRunningTime="2025-12-02 23:19:59.45694267 +0000 UTC m=+1338.165496953" Dec 02 23:19:59 crc kubenswrapper[4903]: I1202 23:19:59.770629 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74bfcddf47-tkg4k" Dec 02 23:19:59 crc kubenswrapper[4903]: I1202 23:19:59.855430 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa730e48-0dda-48df-9675-d7b3fa3358d1-ovsdbserver-sb\") pod \"aa730e48-0dda-48df-9675-d7b3fa3358d1\" (UID: \"aa730e48-0dda-48df-9675-d7b3fa3358d1\") " Dec 02 23:19:59 crc kubenswrapper[4903]: I1202 23:19:59.855533 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa730e48-0dda-48df-9675-d7b3fa3358d1-ovsdbserver-nb\") pod \"aa730e48-0dda-48df-9675-d7b3fa3358d1\" (UID: \"aa730e48-0dda-48df-9675-d7b3fa3358d1\") " Dec 02 23:19:59 crc kubenswrapper[4903]: I1202 23:19:59.855599 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa730e48-0dda-48df-9675-d7b3fa3358d1-config\") pod \"aa730e48-0dda-48df-9675-d7b3fa3358d1\" (UID: \"aa730e48-0dda-48df-9675-d7b3fa3358d1\") " Dec 02 23:19:59 crc kubenswrapper[4903]: I1202 23:19:59.855625 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa730e48-0dda-48df-9675-d7b3fa3358d1-dns-svc\") pod \"aa730e48-0dda-48df-9675-d7b3fa3358d1\" (UID: \"aa730e48-0dda-48df-9675-d7b3fa3358d1\") " Dec 02 23:19:59 crc kubenswrapper[4903]: I1202 23:19:59.855685 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa730e48-0dda-48df-9675-d7b3fa3358d1-dns-swift-storage-0\") pod \"aa730e48-0dda-48df-9675-d7b3fa3358d1\" (UID: \"aa730e48-0dda-48df-9675-d7b3fa3358d1\") " Dec 02 23:19:59 crc kubenswrapper[4903]: I1202 23:19:59.855794 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tnzd\" (UniqueName: \"kubernetes.io/projected/aa730e48-0dda-48df-9675-d7b3fa3358d1-kube-api-access-2tnzd\") pod \"aa730e48-0dda-48df-9675-d7b3fa3358d1\" (UID: \"aa730e48-0dda-48df-9675-d7b3fa3358d1\") " Dec 02 23:19:59 crc kubenswrapper[4903]: I1202 23:19:59.877009 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa730e48-0dda-48df-9675-d7b3fa3358d1-kube-api-access-2tnzd" (OuterVolumeSpecName: "kube-api-access-2tnzd") pod "aa730e48-0dda-48df-9675-d7b3fa3358d1" (UID: "aa730e48-0dda-48df-9675-d7b3fa3358d1"). InnerVolumeSpecName "kube-api-access-2tnzd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:19:59 crc kubenswrapper[4903]: I1202 23:19:59.942532 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa730e48-0dda-48df-9675-d7b3fa3358d1-config" (OuterVolumeSpecName: "config") pod "aa730e48-0dda-48df-9675-d7b3fa3358d1" (UID: "aa730e48-0dda-48df-9675-d7b3fa3358d1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:19:59 crc kubenswrapper[4903]: I1202 23:19:59.967483 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa730e48-0dda-48df-9675-d7b3fa3358d1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "aa730e48-0dda-48df-9675-d7b3fa3358d1" (UID: "aa730e48-0dda-48df-9675-d7b3fa3358d1"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:19:59 crc kubenswrapper[4903]: I1202 23:19:59.970842 4903 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa730e48-0dda-48df-9675-d7b3fa3358d1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:59 crc kubenswrapper[4903]: I1202 23:19:59.970867 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tnzd\" (UniqueName: \"kubernetes.io/projected/aa730e48-0dda-48df-9675-d7b3fa3358d1-kube-api-access-2tnzd\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:59 crc kubenswrapper[4903]: I1202 23:19:59.970879 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa730e48-0dda-48df-9675-d7b3fa3358d1-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:19:59 crc kubenswrapper[4903]: I1202 23:19:59.990112 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa730e48-0dda-48df-9675-d7b3fa3358d1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aa730e48-0dda-48df-9675-d7b3fa3358d1" (UID: "aa730e48-0dda-48df-9675-d7b3fa3358d1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:19:59 crc kubenswrapper[4903]: I1202 23:19:59.998124 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa730e48-0dda-48df-9675-d7b3fa3358d1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aa730e48-0dda-48df-9675-d7b3fa3358d1" (UID: "aa730e48-0dda-48df-9675-d7b3fa3358d1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:20:00 crc kubenswrapper[4903]: I1202 23:19:59.999957 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa730e48-0dda-48df-9675-d7b3fa3358d1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aa730e48-0dda-48df-9675-d7b3fa3358d1" (UID: "aa730e48-0dda-48df-9675-d7b3fa3358d1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:20:00 crc kubenswrapper[4903]: I1202 23:20:00.073210 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa730e48-0dda-48df-9675-d7b3fa3358d1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:00 crc kubenswrapper[4903]: I1202 23:20:00.073256 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa730e48-0dda-48df-9675-d7b3fa3358d1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:00 crc kubenswrapper[4903]: I1202 23:20:00.073269 4903 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa730e48-0dda-48df-9675-d7b3fa3358d1-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:00 crc kubenswrapper[4903]: I1202 23:20:00.455546 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74bfcddf47-tkg4k" event={"ID":"aa730e48-0dda-48df-9675-d7b3fa3358d1","Type":"ContainerDied","Data":"536cdde2bb74c9269fd3e87e1dc2956139024939ad82b13dfedcc8a67458c549"} Dec 02 23:20:00 crc kubenswrapper[4903]: I1202 23:20:00.456038 4903 scope.go:117] "RemoveContainer" containerID="158115072b95e7b226b9d6babda6213ce44b89d3090d21a48e08d8217cdfdbc0" Dec 02 23:20:00 crc kubenswrapper[4903]: I1202 23:20:00.455613 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74bfcddf47-tkg4k" Dec 02 23:20:00 crc kubenswrapper[4903]: I1202 23:20:00.461273 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab","Type":"ContainerStarted","Data":"f538041567c43b00ae4a341ae1643a89335075b904bb2c1772cfa6785ba986c8"} Dec 02 23:20:00 crc kubenswrapper[4903]: I1202 23:20:00.495773 4903 scope.go:117] "RemoveContainer" containerID="a716027066d338e2e3b6141e352f3e9824b2ee70d8d13230e5432be890cf11f2" Dec 02 23:20:00 crc kubenswrapper[4903]: I1202 23:20:00.518749 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.068024287 podStartE2EDuration="4.518728186s" podCreationTimestamp="2025-12-02 23:19:56 +0000 UTC" firstStartedPulling="2025-12-02 23:19:57.340805245 +0000 UTC m=+1336.049359528" lastFinishedPulling="2025-12-02 23:19:59.791509144 +0000 UTC m=+1338.500063427" observedRunningTime="2025-12-02 23:20:00.50809042 +0000 UTC m=+1339.216644713" watchObservedRunningTime="2025-12-02 23:20:00.518728186 +0000 UTC m=+1339.227282469" Dec 02 23:20:00 crc kubenswrapper[4903]: I1202 23:20:00.545852 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74bfcddf47-tkg4k"] Dec 02 23:20:00 crc kubenswrapper[4903]: I1202 23:20:00.556262 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74bfcddf47-tkg4k"] Dec 02 23:20:01 crc kubenswrapper[4903]: I1202 23:20:01.473627 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 23:20:01 crc kubenswrapper[4903]: I1202 23:20:01.632257 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa730e48-0dda-48df-9675-d7b3fa3358d1" path="/var/lib/kubelet/pods/aa730e48-0dda-48df-9675-d7b3fa3358d1/volumes" Dec 02 23:20:03 crc kubenswrapper[4903]: I1202 23:20:03.746102 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 23:20:03 crc 
kubenswrapper[4903]: I1202 23:20:03.747533 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 23:20:04 crc kubenswrapper[4903]: I1202 23:20:04.516011 4903 generic.go:334] "Generic (PLEG): container finished" podID="b3ad3f2c-68c0-426e-94b6-999ea0629dcd" containerID="a8eef5c453b8d96bd3fb8a2612d283b49ee55cd45a6259efe3efae85c2dc3ab4" exitCode=0 Dec 02 23:20:04 crc kubenswrapper[4903]: I1202 23:20:04.516478 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-prs2t" event={"ID":"b3ad3f2c-68c0-426e-94b6-999ea0629dcd","Type":"ContainerDied","Data":"a8eef5c453b8d96bd3fb8a2612d283b49ee55cd45a6259efe3efae85c2dc3ab4"} Dec 02 23:20:04 crc kubenswrapper[4903]: I1202 23:20:04.764068 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c14f1d8c-e6c3-450a-a1ad-d580a33db0a4" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.217:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 23:20:04 crc kubenswrapper[4903]: I1202 23:20:04.764527 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c14f1d8c-e6c3-450a-a1ad-d580a33db0a4" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.217:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 23:20:05 crc kubenswrapper[4903]: I1202 23:20:05.914798 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-prs2t" Dec 02 23:20:06 crc kubenswrapper[4903]: I1202 23:20:06.005242 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3ad3f2c-68c0-426e-94b6-999ea0629dcd-scripts\") pod \"b3ad3f2c-68c0-426e-94b6-999ea0629dcd\" (UID: \"b3ad3f2c-68c0-426e-94b6-999ea0629dcd\") " Dec 02 23:20:06 crc kubenswrapper[4903]: I1202 23:20:06.005306 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwskl\" (UniqueName: \"kubernetes.io/projected/b3ad3f2c-68c0-426e-94b6-999ea0629dcd-kube-api-access-rwskl\") pod \"b3ad3f2c-68c0-426e-94b6-999ea0629dcd\" (UID: \"b3ad3f2c-68c0-426e-94b6-999ea0629dcd\") " Dec 02 23:20:06 crc kubenswrapper[4903]: I1202 23:20:06.005444 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ad3f2c-68c0-426e-94b6-999ea0629dcd-config-data\") pod \"b3ad3f2c-68c0-426e-94b6-999ea0629dcd\" (UID: \"b3ad3f2c-68c0-426e-94b6-999ea0629dcd\") " Dec 02 23:20:06 crc kubenswrapper[4903]: I1202 23:20:06.005617 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ad3f2c-68c0-426e-94b6-999ea0629dcd-combined-ca-bundle\") pod \"b3ad3f2c-68c0-426e-94b6-999ea0629dcd\" (UID: \"b3ad3f2c-68c0-426e-94b6-999ea0629dcd\") " Dec 02 23:20:06 crc kubenswrapper[4903]: I1202 23:20:06.010974 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3ad3f2c-68c0-426e-94b6-999ea0629dcd-kube-api-access-rwskl" (OuterVolumeSpecName: "kube-api-access-rwskl") pod "b3ad3f2c-68c0-426e-94b6-999ea0629dcd" (UID: "b3ad3f2c-68c0-426e-94b6-999ea0629dcd"). InnerVolumeSpecName "kube-api-access-rwskl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:20:06 crc kubenswrapper[4903]: I1202 23:20:06.017894 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3ad3f2c-68c0-426e-94b6-999ea0629dcd-scripts" (OuterVolumeSpecName: "scripts") pod "b3ad3f2c-68c0-426e-94b6-999ea0629dcd" (UID: "b3ad3f2c-68c0-426e-94b6-999ea0629dcd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:20:06 crc kubenswrapper[4903]: I1202 23:20:06.037499 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3ad3f2c-68c0-426e-94b6-999ea0629dcd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b3ad3f2c-68c0-426e-94b6-999ea0629dcd" (UID: "b3ad3f2c-68c0-426e-94b6-999ea0629dcd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:20:06 crc kubenswrapper[4903]: I1202 23:20:06.044857 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3ad3f2c-68c0-426e-94b6-999ea0629dcd-config-data" (OuterVolumeSpecName: "config-data") pod "b3ad3f2c-68c0-426e-94b6-999ea0629dcd" (UID: "b3ad3f2c-68c0-426e-94b6-999ea0629dcd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:20:06 crc kubenswrapper[4903]: I1202 23:20:06.120433 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ad3f2c-68c0-426e-94b6-999ea0629dcd-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:06 crc kubenswrapper[4903]: I1202 23:20:06.120488 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ad3f2c-68c0-426e-94b6-999ea0629dcd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:06 crc kubenswrapper[4903]: I1202 23:20:06.120503 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3ad3f2c-68c0-426e-94b6-999ea0629dcd-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:06 crc kubenswrapper[4903]: I1202 23:20:06.120513 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwskl\" (UniqueName: \"kubernetes.io/projected/b3ad3f2c-68c0-426e-94b6-999ea0629dcd-kube-api-access-rwskl\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:06 crc kubenswrapper[4903]: I1202 23:20:06.542598 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-prs2t" event={"ID":"b3ad3f2c-68c0-426e-94b6-999ea0629dcd","Type":"ContainerDied","Data":"ef61c88ff62e54e0b568162efee67ce260dc509f5ec50596faa51abaa02f4d5d"} Dec 02 23:20:06 crc kubenswrapper[4903]: I1202 23:20:06.542728 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef61c88ff62e54e0b568162efee67ce260dc509f5ec50596faa51abaa02f4d5d" Dec 02 23:20:06 crc kubenswrapper[4903]: I1202 23:20:06.542818 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-prs2t" Dec 02 23:20:06 crc kubenswrapper[4903]: I1202 23:20:06.726325 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 23:20:06 crc kubenswrapper[4903]: I1202 23:20:06.726891 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c14f1d8c-e6c3-450a-a1ad-d580a33db0a4" containerName="nova-api-log" containerID="cri-o://16b461160d699337fe396bf0c0cce3050739fc9f4dc4325cfe971d624cde8c7d" gracePeriod=30 Dec 02 23:20:06 crc kubenswrapper[4903]: I1202 23:20:06.727340 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c14f1d8c-e6c3-450a-a1ad-d580a33db0a4" containerName="nova-api-api" containerID="cri-o://dde4200447ec38fad92fcb8b6bd9f1da6d111f7c4ad10fa6d73081e01ebe24d7" gracePeriod=30 Dec 02 23:20:06 crc kubenswrapper[4903]: I1202 23:20:06.737768 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 23:20:06 crc kubenswrapper[4903]: I1202 23:20:06.737991 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="ac51dfe7-7ebc-4296-874e-5669f01d115a" containerName="nova-scheduler-scheduler" containerID="cri-o://c3997569e826d69f5e39884db1394ef2592a998b8a9cc79407dc081c1ef0b1d0" gracePeriod=30 Dec 02 23:20:06 crc kubenswrapper[4903]: I1202 23:20:06.753899 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 23:20:06 crc kubenswrapper[4903]: I1202 23:20:06.754146 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e5c694f3-bd40-44e8-b076-e0d20b840329" containerName="nova-metadata-log" containerID="cri-o://0ef0e434eba62ba0e8bd98a8131d401532fe92d72365db80170265e16912a287" gracePeriod=30 Dec 02 23:20:06 crc kubenswrapper[4903]: I1202 23:20:06.754572 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e5c694f3-bd40-44e8-b076-e0d20b840329" containerName="nova-metadata-metadata" containerID="cri-o://f5774cd37f0d7205107b096b229ca80e9183435ddb77d3f1dc25f6fb16805bbc" gracePeriod=30 Dec 02 23:20:07 crc kubenswrapper[4903]: I1202 23:20:07.551374 4903 generic.go:334] "Generic (PLEG): container finished" podID="c14f1d8c-e6c3-450a-a1ad-d580a33db0a4" containerID="16b461160d699337fe396bf0c0cce3050739fc9f4dc4325cfe971d624cde8c7d" exitCode=143 Dec 02 23:20:07 crc kubenswrapper[4903]: I1202 23:20:07.551432 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c14f1d8c-e6c3-450a-a1ad-d580a33db0a4","Type":"ContainerDied","Data":"16b461160d699337fe396bf0c0cce3050739fc9f4dc4325cfe971d624cde8c7d"} Dec 02 23:20:07 crc kubenswrapper[4903]: I1202 23:20:07.553000 4903 generic.go:334] "Generic (PLEG): container finished" podID="e5c694f3-bd40-44e8-b076-e0d20b840329" containerID="0ef0e434eba62ba0e8bd98a8131d401532fe92d72365db80170265e16912a287" exitCode=143 Dec 02 23:20:07 crc kubenswrapper[4903]: I1202 23:20:07.553036 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e5c694f3-bd40-44e8-b076-e0d20b840329","Type":"ContainerDied","Data":"0ef0e434eba62ba0e8bd98a8131d401532fe92d72365db80170265e16912a287"} Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.178174 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.183291 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.264217 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c14f1d8c-e6c3-450a-a1ad-d580a33db0a4-config-data\") pod \"c14f1d8c-e6c3-450a-a1ad-d580a33db0a4\" (UID: \"c14f1d8c-e6c3-450a-a1ad-d580a33db0a4\") " Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.264376 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgbfn\" (UniqueName: \"kubernetes.io/projected/c14f1d8c-e6c3-450a-a1ad-d580a33db0a4-kube-api-access-qgbfn\") pod \"c14f1d8c-e6c3-450a-a1ad-d580a33db0a4\" (UID: \"c14f1d8c-e6c3-450a-a1ad-d580a33db0a4\") " Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.264396 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c14f1d8c-e6c3-450a-a1ad-d580a33db0a4-internal-tls-certs\") pod \"c14f1d8c-e6c3-450a-a1ad-d580a33db0a4\" (UID: \"c14f1d8c-e6c3-450a-a1ad-d580a33db0a4\") " Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.264570 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c14f1d8c-e6c3-450a-a1ad-d580a33db0a4-logs\") pod \"c14f1d8c-e6c3-450a-a1ad-d580a33db0a4\" (UID: \"c14f1d8c-e6c3-450a-a1ad-d580a33db0a4\") " Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.264632 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c14f1d8c-e6c3-450a-a1ad-d580a33db0a4-combined-ca-bundle\") pod \"c14f1d8c-e6c3-450a-a1ad-d580a33db0a4\" (UID: \"c14f1d8c-e6c3-450a-a1ad-d580a33db0a4\") " Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.264683 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c14f1d8c-e6c3-450a-a1ad-d580a33db0a4-public-tls-certs\") pod \"c14f1d8c-e6c3-450a-a1ad-d580a33db0a4\" (UID: \"c14f1d8c-e6c3-450a-a1ad-d580a33db0a4\") " Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.265508 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c14f1d8c-e6c3-450a-a1ad-d580a33db0a4-logs" (OuterVolumeSpecName: "logs") pod "c14f1d8c-e6c3-450a-a1ad-d580a33db0a4" (UID: "c14f1d8c-e6c3-450a-a1ad-d580a33db0a4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.265734 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c14f1d8c-e6c3-450a-a1ad-d580a33db0a4-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.291774 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c14f1d8c-e6c3-450a-a1ad-d580a33db0a4-kube-api-access-qgbfn" (OuterVolumeSpecName: "kube-api-access-qgbfn") pod "c14f1d8c-e6c3-450a-a1ad-d580a33db0a4" (UID: "c14f1d8c-e6c3-450a-a1ad-d580a33db0a4"). InnerVolumeSpecName "kube-api-access-qgbfn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.299100 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c14f1d8c-e6c3-450a-a1ad-d580a33db0a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c14f1d8c-e6c3-450a-a1ad-d580a33db0a4" (UID: "c14f1d8c-e6c3-450a-a1ad-d580a33db0a4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.302867 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c14f1d8c-e6c3-450a-a1ad-d580a33db0a4-config-data" (OuterVolumeSpecName: "config-data") pod "c14f1d8c-e6c3-450a-a1ad-d580a33db0a4" (UID: "c14f1d8c-e6c3-450a-a1ad-d580a33db0a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.326391 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c14f1d8c-e6c3-450a-a1ad-d580a33db0a4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c14f1d8c-e6c3-450a-a1ad-d580a33db0a4" (UID: "c14f1d8c-e6c3-450a-a1ad-d580a33db0a4"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.327536 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c14f1d8c-e6c3-450a-a1ad-d580a33db0a4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c14f1d8c-e6c3-450a-a1ad-d580a33db0a4" (UID: "c14f1d8c-e6c3-450a-a1ad-d580a33db0a4"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.366593 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5c694f3-bd40-44e8-b076-e0d20b840329-combined-ca-bundle\") pod \"e5c694f3-bd40-44e8-b076-e0d20b840329\" (UID: \"e5c694f3-bd40-44e8-b076-e0d20b840329\") " Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.366686 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5c694f3-bd40-44e8-b076-e0d20b840329-logs\") pod \"e5c694f3-bd40-44e8-b076-e0d20b840329\" (UID: \"e5c694f3-bd40-44e8-b076-e0d20b840329\") " Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.366734 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xcqr\" (UniqueName: \"kubernetes.io/projected/e5c694f3-bd40-44e8-b076-e0d20b840329-kube-api-access-4xcqr\") pod \"e5c694f3-bd40-44e8-b076-e0d20b840329\" (UID: \"e5c694f3-bd40-44e8-b076-e0d20b840329\") " Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.366815 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5c694f3-bd40-44e8-b076-e0d20b840329-config-data\") pod \"e5c694f3-bd40-44e8-b076-e0d20b840329\" (UID: \"e5c694f3-bd40-44e8-b076-e0d20b840329\") " Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.366842 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5c694f3-bd40-44e8-b076-e0d20b840329-nova-metadata-tls-certs\") pod \"e5c694f3-bd40-44e8-b076-e0d20b840329\" (UID: 
\"e5c694f3-bd40-44e8-b076-e0d20b840329\") " Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.367127 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5c694f3-bd40-44e8-b076-e0d20b840329-logs" (OuterVolumeSpecName: "logs") pod "e5c694f3-bd40-44e8-b076-e0d20b840329" (UID: "e5c694f3-bd40-44e8-b076-e0d20b840329"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.367586 4903 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c14f1d8c-e6c3-450a-a1ad-d580a33db0a4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.367605 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c14f1d8c-e6c3-450a-a1ad-d580a33db0a4-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.367615 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgbfn\" (UniqueName: \"kubernetes.io/projected/c14f1d8c-e6c3-450a-a1ad-d580a33db0a4-kube-api-access-qgbfn\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.367626 4903 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c14f1d8c-e6c3-450a-a1ad-d580a33db0a4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.367637 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5c694f3-bd40-44e8-b076-e0d20b840329-logs\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.367645 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c14f1d8c-e6c3-450a-a1ad-d580a33db0a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.370843 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5c694f3-bd40-44e8-b076-e0d20b840329-kube-api-access-4xcqr" (OuterVolumeSpecName: "kube-api-access-4xcqr") pod "e5c694f3-bd40-44e8-b076-e0d20b840329" (UID: "e5c694f3-bd40-44e8-b076-e0d20b840329"). InnerVolumeSpecName "kube-api-access-4xcqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.394340 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5c694f3-bd40-44e8-b076-e0d20b840329-config-data" (OuterVolumeSpecName: "config-data") pod "e5c694f3-bd40-44e8-b076-e0d20b840329" (UID: "e5c694f3-bd40-44e8-b076-e0d20b840329"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.408394 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5c694f3-bd40-44e8-b076-e0d20b840329-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5c694f3-bd40-44e8-b076-e0d20b840329" (UID: "e5c694f3-bd40-44e8-b076-e0d20b840329"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.426347 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5c694f3-bd40-44e8-b076-e0d20b840329-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "e5c694f3-bd40-44e8-b076-e0d20b840329" (UID: "e5c694f3-bd40-44e8-b076-e0d20b840329"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.469069 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5c694f3-bd40-44e8-b076-e0d20b840329-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.469096 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xcqr\" (UniqueName: \"kubernetes.io/projected/e5c694f3-bd40-44e8-b076-e0d20b840329-kube-api-access-4xcqr\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.469107 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5c694f3-bd40-44e8-b076-e0d20b840329-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.469115 4903 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5c694f3-bd40-44e8-b076-e0d20b840329-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.568381 4903 generic.go:334] "Generic (PLEG): container finished" podID="ac51dfe7-7ebc-4296-874e-5669f01d115a" containerID="c3997569e826d69f5e39884db1394ef2592a998b8a9cc79407dc081c1ef0b1d0" exitCode=0 Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.568515 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ac51dfe7-7ebc-4296-874e-5669f01d115a","Type":"ContainerDied","Data":"c3997569e826d69f5e39884db1394ef2592a998b8a9cc79407dc081c1ef0b1d0"} Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.570952 4903 generic.go:334] "Generic (PLEG): container finished" podID="c14f1d8c-e6c3-450a-a1ad-d580a33db0a4" containerID="dde4200447ec38fad92fcb8b6bd9f1da6d111f7c4ad10fa6d73081e01ebe24d7" exitCode=0 Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.571033 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c14f1d8c-e6c3-450a-a1ad-d580a33db0a4","Type":"ContainerDied","Data":"dde4200447ec38fad92fcb8b6bd9f1da6d111f7c4ad10fa6d73081e01ebe24d7"} Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.571046 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.571072 4903 scope.go:117] "RemoveContainer" containerID="dde4200447ec38fad92fcb8b6bd9f1da6d111f7c4ad10fa6d73081e01ebe24d7" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.571061 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c14f1d8c-e6c3-450a-a1ad-d580a33db0a4","Type":"ContainerDied","Data":"23339711baba396955d36d2d7cbfc23a2eab0441d71c927971c4c44965bcac43"} Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.577055 4903 generic.go:334] "Generic (PLEG): container finished" podID="e5c694f3-bd40-44e8-b076-e0d20b840329" containerID="f5774cd37f0d7205107b096b229ca80e9183435ddb77d3f1dc25f6fb16805bbc" exitCode=0 Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.577090 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e5c694f3-bd40-44e8-b076-e0d20b840329","Type":"ContainerDied","Data":"f5774cd37f0d7205107b096b229ca80e9183435ddb77d3f1dc25f6fb16805bbc"} Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.577114 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e5c694f3-bd40-44e8-b076-e0d20b840329","Type":"ContainerDied","Data":"9df882503105b9bbe47404bef3fe35b1b88889529fbf655f359cef12b4d38f32"} Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.577179 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.650844 4903 scope.go:117] "RemoveContainer" containerID="16b461160d699337fe396bf0c0cce3050739fc9f4dc4325cfe971d624cde8c7d" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.655444 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.668694 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.681615 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.682543 4903 scope.go:117] "RemoveContainer" containerID="dde4200447ec38fad92fcb8b6bd9f1da6d111f7c4ad10fa6d73081e01ebe24d7" Dec 02 23:20:08 crc kubenswrapper[4903]: E1202 23:20:08.686027 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dde4200447ec38fad92fcb8b6bd9f1da6d111f7c4ad10fa6d73081e01ebe24d7\": container with ID starting with dde4200447ec38fad92fcb8b6bd9f1da6d111f7c4ad10fa6d73081e01ebe24d7 not found: ID does not exist" containerID="dde4200447ec38fad92fcb8b6bd9f1da6d111f7c4ad10fa6d73081e01ebe24d7" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.686068 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dde4200447ec38fad92fcb8b6bd9f1da6d111f7c4ad10fa6d73081e01ebe24d7"} err="failed to get container status \"dde4200447ec38fad92fcb8b6bd9f1da6d111f7c4ad10fa6d73081e01ebe24d7\": rpc error: code = NotFound desc = could not find container \"dde4200447ec38fad92fcb8b6bd9f1da6d111f7c4ad10fa6d73081e01ebe24d7\": container with ID starting with dde4200447ec38fad92fcb8b6bd9f1da6d111f7c4ad10fa6d73081e01ebe24d7 not found: ID does not exist" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.686099 4903 scope.go:117] "RemoveContainer" 
containerID="16b461160d699337fe396bf0c0cce3050739fc9f4dc4325cfe971d624cde8c7d" Dec 02 23:20:08 crc kubenswrapper[4903]: E1202 23:20:08.688638 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16b461160d699337fe396bf0c0cce3050739fc9f4dc4325cfe971d624cde8c7d\": container with ID starting with 16b461160d699337fe396bf0c0cce3050739fc9f4dc4325cfe971d624cde8c7d not found: ID does not exist" containerID="16b461160d699337fe396bf0c0cce3050739fc9f4dc4325cfe971d624cde8c7d" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.688695 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16b461160d699337fe396bf0c0cce3050739fc9f4dc4325cfe971d624cde8c7d"} err="failed to get container status \"16b461160d699337fe396bf0c0cce3050739fc9f4dc4325cfe971d624cde8c7d\": rpc error: code = NotFound desc = could not find container \"16b461160d699337fe396bf0c0cce3050739fc9f4dc4325cfe971d624cde8c7d\": container with ID starting with 16b461160d699337fe396bf0c0cce3050739fc9f4dc4325cfe971d624cde8c7d not found: ID does not exist" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.688718 4903 scope.go:117] "RemoveContainer" containerID="f5774cd37f0d7205107b096b229ca80e9183435ddb77d3f1dc25f6fb16805bbc" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.697702 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.706440 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 02 23:20:08 crc kubenswrapper[4903]: E1202 23:20:08.707016 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3ad3f2c-68c0-426e-94b6-999ea0629dcd" containerName="nova-manage" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.707038 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3ad3f2c-68c0-426e-94b6-999ea0629dcd" containerName="nova-manage" Dec 02 23:20:08 crc kubenswrapper[4903]: E1202 23:20:08.707059 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c14f1d8c-e6c3-450a-a1ad-d580a33db0a4" containerName="nova-api-log" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.707070 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="c14f1d8c-e6c3-450a-a1ad-d580a33db0a4" containerName="nova-api-log" Dec 02 23:20:08 crc kubenswrapper[4903]: E1202 23:20:08.707103 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa730e48-0dda-48df-9675-d7b3fa3358d1" containerName="dnsmasq-dns" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.707112 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa730e48-0dda-48df-9675-d7b3fa3358d1" containerName="dnsmasq-dns" Dec 02 23:20:08 crc kubenswrapper[4903]: E1202 23:20:08.707133 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa730e48-0dda-48df-9675-d7b3fa3358d1" containerName="init" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.707141 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa730e48-0dda-48df-9675-d7b3fa3358d1" containerName="init" Dec 02 23:20:08 crc kubenswrapper[4903]: E1202 23:20:08.707169 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5c694f3-bd40-44e8-b076-e0d20b840329" containerName="nova-metadata-metadata" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.707177 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5c694f3-bd40-44e8-b076-e0d20b840329" 
containerName="nova-metadata-metadata" Dec 02 23:20:08 crc kubenswrapper[4903]: E1202 23:20:08.707189 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5c694f3-bd40-44e8-b076-e0d20b840329" containerName="nova-metadata-log" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.707197 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5c694f3-bd40-44e8-b076-e0d20b840329" containerName="nova-metadata-log" Dec 02 23:20:08 crc kubenswrapper[4903]: E1202 23:20:08.707211 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c14f1d8c-e6c3-450a-a1ad-d580a33db0a4" containerName="nova-api-api" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.707219 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="c14f1d8c-e6c3-450a-a1ad-d580a33db0a4" containerName="nova-api-api" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.707450 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5c694f3-bd40-44e8-b076-e0d20b840329" containerName="nova-metadata-log" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.707473 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="c14f1d8c-e6c3-450a-a1ad-d580a33db0a4" containerName="nova-api-log" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.707486 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5c694f3-bd40-44e8-b076-e0d20b840329" containerName="nova-metadata-metadata" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.707499 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="c14f1d8c-e6c3-450a-a1ad-d580a33db0a4" containerName="nova-api-api" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.707517 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa730e48-0dda-48df-9675-d7b3fa3358d1" containerName="dnsmasq-dns" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.707528 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3ad3f2c-68c0-426e-94b6-999ea0629dcd" containerName="nova-manage" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.708905 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.712514 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.713425 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.713789 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.724067 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.730108 4903 scope.go:117] "RemoveContainer" containerID="0ef0e434eba62ba0e8bd98a8131d401532fe92d72365db80170265e16912a287" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.732532 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.734583 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.737893 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.741519 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.743640 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.775858 4903 scope.go:117] "RemoveContainer" containerID="f5774cd37f0d7205107b096b229ca80e9183435ddb77d3f1dc25f6fb16805bbc" Dec 02 23:20:08 crc kubenswrapper[4903]: E1202 23:20:08.776380 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5774cd37f0d7205107b096b229ca80e9183435ddb77d3f1dc25f6fb16805bbc\": container with ID starting with f5774cd37f0d7205107b096b229ca80e9183435ddb77d3f1dc25f6fb16805bbc not found: ID does not exist" containerID="f5774cd37f0d7205107b096b229ca80e9183435ddb77d3f1dc25f6fb16805bbc" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.776427 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5774cd37f0d7205107b096b229ca80e9183435ddb77d3f1dc25f6fb16805bbc"} err="failed to get container status \"f5774cd37f0d7205107b096b229ca80e9183435ddb77d3f1dc25f6fb16805bbc\": rpc error: code = NotFound desc = could not find container \"f5774cd37f0d7205107b096b229ca80e9183435ddb77d3f1dc25f6fb16805bbc\": container with ID starting with f5774cd37f0d7205107b096b229ca80e9183435ddb77d3f1dc25f6fb16805bbc not found: ID does not exist" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.776453 4903 scope.go:117] "RemoveContainer" containerID="0ef0e434eba62ba0e8bd98a8131d401532fe92d72365db80170265e16912a287" Dec 02 23:20:08 crc kubenswrapper[4903]: E1202 23:20:08.776843 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ef0e434eba62ba0e8bd98a8131d401532fe92d72365db80170265e16912a287\": container with ID starting with 0ef0e434eba62ba0e8bd98a8131d401532fe92d72365db80170265e16912a287 not found: ID does not exist" containerID="0ef0e434eba62ba0e8bd98a8131d401532fe92d72365db80170265e16912a287" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.776874 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ef0e434eba62ba0e8bd98a8131d401532fe92d72365db80170265e16912a287"} err="failed to get container status \"0ef0e434eba62ba0e8bd98a8131d401532fe92d72365db80170265e16912a287\": rpc error: code = NotFound desc = could not find container \"0ef0e434eba62ba0e8bd98a8131d401532fe92d72365db80170265e16912a287\": container with ID starting with 0ef0e434eba62ba0e8bd98a8131d401532fe92d72365db80170265e16912a287 not found: ID does not exist" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.880026 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbwkb\" (UniqueName: \"kubernetes.io/projected/7235be43-b81b-4894-a75b-4c8444482eba-kube-api-access-fbwkb\") pod \"nova-api-0\" (UID: \"7235be43-b81b-4894-a75b-4c8444482eba\") " pod="openstack/nova-api-0" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.880142 4903 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftp6g\" (UniqueName: \"kubernetes.io/projected/a0018a95-dc74-4511-ade4-c77e4846f0a0-kube-api-access-ftp6g\") pod \"nova-metadata-0\" (UID: \"a0018a95-dc74-4511-ade4-c77e4846f0a0\") " pod="openstack/nova-metadata-0" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.880191 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0018a95-dc74-4511-ade4-c77e4846f0a0-config-data\") pod \"nova-metadata-0\" (UID: \"a0018a95-dc74-4511-ade4-c77e4846f0a0\") " pod="openstack/nova-metadata-0" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.880235 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0018a95-dc74-4511-ade4-c77e4846f0a0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a0018a95-dc74-4511-ade4-c77e4846f0a0\") " pod="openstack/nova-metadata-0" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.880290 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7235be43-b81b-4894-a75b-4c8444482eba-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7235be43-b81b-4894-a75b-4c8444482eba\") " pod="openstack/nova-api-0" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.880327 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0018a95-dc74-4511-ade4-c77e4846f0a0-logs\") pod \"nova-metadata-0\" (UID: \"a0018a95-dc74-4511-ade4-c77e4846f0a0\") " pod="openstack/nova-metadata-0" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.880353 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7235be43-b81b-4894-a75b-4c8444482eba-public-tls-certs\") pod \"nova-api-0\" (UID: \"7235be43-b81b-4894-a75b-4c8444482eba\") " pod="openstack/nova-api-0" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.880389 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7235be43-b81b-4894-a75b-4c8444482eba-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7235be43-b81b-4894-a75b-4c8444482eba\") " pod="openstack/nova-api-0" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.880423 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7235be43-b81b-4894-a75b-4c8444482eba-logs\") pod \"nova-api-0\" (UID: \"7235be43-b81b-4894-a75b-4c8444482eba\") " pod="openstack/nova-api-0" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.880452 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0018a95-dc74-4511-ade4-c77e4846f0a0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a0018a95-dc74-4511-ade4-c77e4846f0a0\") " pod="openstack/nova-metadata-0" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.880477 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7235be43-b81b-4894-a75b-4c8444482eba-config-data\") pod \"nova-api-0\" (UID: \"7235be43-b81b-4894-a75b-4c8444482eba\") " pod="openstack/nova-api-0" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.958398 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.982841 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftp6g\" (UniqueName: \"kubernetes.io/projected/a0018a95-dc74-4511-ade4-c77e4846f0a0-kube-api-access-ftp6g\") pod \"nova-metadata-0\" (UID: \"a0018a95-dc74-4511-ade4-c77e4846f0a0\") " pod="openstack/nova-metadata-0" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.982928 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0018a95-dc74-4511-ade4-c77e4846f0a0-config-data\") pod \"nova-metadata-0\" (UID: \"a0018a95-dc74-4511-ade4-c77e4846f0a0\") " pod="openstack/nova-metadata-0" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.982985 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0018a95-dc74-4511-ade4-c77e4846f0a0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a0018a95-dc74-4511-ade4-c77e4846f0a0\") " pod="openstack/nova-metadata-0" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.983050 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7235be43-b81b-4894-a75b-4c8444482eba-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7235be43-b81b-4894-a75b-4c8444482eba\") " pod="openstack/nova-api-0" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.983092 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0018a95-dc74-4511-ade4-c77e4846f0a0-logs\") pod \"nova-metadata-0\" (UID: \"a0018a95-dc74-4511-ade4-c77e4846f0a0\") " pod="openstack/nova-metadata-0" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.983121 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7235be43-b81b-4894-a75b-4c8444482eba-public-tls-certs\") pod \"nova-api-0\" (UID: \"7235be43-b81b-4894-a75b-4c8444482eba\") " pod="openstack/nova-api-0" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.983156 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7235be43-b81b-4894-a75b-4c8444482eba-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7235be43-b81b-4894-a75b-4c8444482eba\") " pod="openstack/nova-api-0" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.983192 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7235be43-b81b-4894-a75b-4c8444482eba-logs\") pod \"nova-api-0\" (UID: \"7235be43-b81b-4894-a75b-4c8444482eba\") " pod="openstack/nova-api-0" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.983222 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0018a95-dc74-4511-ade4-c77e4846f0a0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a0018a95-dc74-4511-ade4-c77e4846f0a0\") " 
pod="openstack/nova-metadata-0" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.983251 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7235be43-b81b-4894-a75b-4c8444482eba-config-data\") pod \"nova-api-0\" (UID: \"7235be43-b81b-4894-a75b-4c8444482eba\") " pod="openstack/nova-api-0" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.983306 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbwkb\" (UniqueName: \"kubernetes.io/projected/7235be43-b81b-4894-a75b-4c8444482eba-kube-api-access-fbwkb\") pod \"nova-api-0\" (UID: \"7235be43-b81b-4894-a75b-4c8444482eba\") " pod="openstack/nova-api-0" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.983718 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0018a95-dc74-4511-ade4-c77e4846f0a0-logs\") pod \"nova-metadata-0\" (UID: \"a0018a95-dc74-4511-ade4-c77e4846f0a0\") " pod="openstack/nova-metadata-0" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.984253 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7235be43-b81b-4894-a75b-4c8444482eba-logs\") pod \"nova-api-0\" (UID: \"7235be43-b81b-4894-a75b-4c8444482eba\") " pod="openstack/nova-api-0" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.990281 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7235be43-b81b-4894-a75b-4c8444482eba-public-tls-certs\") pod \"nova-api-0\" (UID: \"7235be43-b81b-4894-a75b-4c8444482eba\") " pod="openstack/nova-api-0" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.990336 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0018a95-dc74-4511-ade4-c77e4846f0a0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a0018a95-dc74-4511-ade4-c77e4846f0a0\") " pod="openstack/nova-metadata-0" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.990833 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0018a95-dc74-4511-ade4-c77e4846f0a0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a0018a95-dc74-4511-ade4-c77e4846f0a0\") " pod="openstack/nova-metadata-0" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.991057 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7235be43-b81b-4894-a75b-4c8444482eba-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7235be43-b81b-4894-a75b-4c8444482eba\") " pod="openstack/nova-api-0" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.992305 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7235be43-b81b-4894-a75b-4c8444482eba-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7235be43-b81b-4894-a75b-4c8444482eba\") " pod="openstack/nova-api-0" Dec 02 23:20:08 crc kubenswrapper[4903]: I1202 23:20:08.992841 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0018a95-dc74-4511-ade4-c77e4846f0a0-config-data\") pod \"nova-metadata-0\" (UID: \"a0018a95-dc74-4511-ade4-c77e4846f0a0\") " pod="openstack/nova-metadata-0" Dec 02 23:20:08 crc 
kubenswrapper[4903]: I1202 23:20:08.997110 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7235be43-b81b-4894-a75b-4c8444482eba-config-data\") pod \"nova-api-0\" (UID: \"7235be43-b81b-4894-a75b-4c8444482eba\") " pod="openstack/nova-api-0" Dec 02 23:20:09 crc kubenswrapper[4903]: I1202 23:20:09.000318 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftp6g\" (UniqueName: \"kubernetes.io/projected/a0018a95-dc74-4511-ade4-c77e4846f0a0-kube-api-access-ftp6g\") pod \"nova-metadata-0\" (UID: \"a0018a95-dc74-4511-ade4-c77e4846f0a0\") " pod="openstack/nova-metadata-0" Dec 02 23:20:09 crc kubenswrapper[4903]: I1202 23:20:09.011666 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbwkb\" (UniqueName: \"kubernetes.io/projected/7235be43-b81b-4894-a75b-4c8444482eba-kube-api-access-fbwkb\") pod \"nova-api-0\" (UID: \"7235be43-b81b-4894-a75b-4c8444482eba\") " pod="openstack/nova-api-0" Dec 02 23:20:09 crc kubenswrapper[4903]: I1202 23:20:09.043249 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 23:20:09 crc kubenswrapper[4903]: I1202 23:20:09.085282 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8jmx\" (UniqueName: \"kubernetes.io/projected/ac51dfe7-7ebc-4296-874e-5669f01d115a-kube-api-access-x8jmx\") pod \"ac51dfe7-7ebc-4296-874e-5669f01d115a\" (UID: \"ac51dfe7-7ebc-4296-874e-5669f01d115a\") " Dec 02 23:20:09 crc kubenswrapper[4903]: I1202 23:20:09.085544 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac51dfe7-7ebc-4296-874e-5669f01d115a-combined-ca-bundle\") pod \"ac51dfe7-7ebc-4296-874e-5669f01d115a\" (UID: \"ac51dfe7-7ebc-4296-874e-5669f01d115a\") " Dec 02 23:20:09 crc kubenswrapper[4903]: I1202 23:20:09.085730 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac51dfe7-7ebc-4296-874e-5669f01d115a-config-data\") pod \"ac51dfe7-7ebc-4296-874e-5669f01d115a\" (UID: \"ac51dfe7-7ebc-4296-874e-5669f01d115a\") " Dec 02 23:20:09 crc kubenswrapper[4903]: I1202 23:20:09.089027 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac51dfe7-7ebc-4296-874e-5669f01d115a-kube-api-access-x8jmx" (OuterVolumeSpecName: "kube-api-access-x8jmx") pod "ac51dfe7-7ebc-4296-874e-5669f01d115a" (UID: "ac51dfe7-7ebc-4296-874e-5669f01d115a"). InnerVolumeSpecName "kube-api-access-x8jmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:20:09 crc kubenswrapper[4903]: I1202 23:20:09.090135 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 23:20:09 crc kubenswrapper[4903]: I1202 23:20:09.117101 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac51dfe7-7ebc-4296-874e-5669f01d115a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac51dfe7-7ebc-4296-874e-5669f01d115a" (UID: "ac51dfe7-7ebc-4296-874e-5669f01d115a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:20:09 crc kubenswrapper[4903]: I1202 23:20:09.130869 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac51dfe7-7ebc-4296-874e-5669f01d115a-config-data" (OuterVolumeSpecName: "config-data") pod "ac51dfe7-7ebc-4296-874e-5669f01d115a" (UID: "ac51dfe7-7ebc-4296-874e-5669f01d115a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:20:09 crc kubenswrapper[4903]: I1202 23:20:09.188439 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8jmx\" (UniqueName: \"kubernetes.io/projected/ac51dfe7-7ebc-4296-874e-5669f01d115a-kube-api-access-x8jmx\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:09 crc kubenswrapper[4903]: I1202 23:20:09.188473 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac51dfe7-7ebc-4296-874e-5669f01d115a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:09 crc kubenswrapper[4903]: I1202 23:20:09.188487 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac51dfe7-7ebc-4296-874e-5669f01d115a-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:09 crc kubenswrapper[4903]: I1202 23:20:09.531324 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 23:20:09 crc kubenswrapper[4903]: W1202 23:20:09.532009 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7235be43_b81b_4894_a75b_4c8444482eba.slice/crio-be80402e59462b81f933aed89b86686fa0800a1dc0e0ebc451ecacc382eac276 WatchSource:0}: Error finding container be80402e59462b81f933aed89b86686fa0800a1dc0e0ebc451ecacc382eac276: Status 404 returned error can't find the container with id be80402e59462b81f933aed89b86686fa0800a1dc0e0ebc451ecacc382eac276 Dec 02 23:20:09 crc kubenswrapper[4903]: I1202 23:20:09.601100 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7235be43-b81b-4894-a75b-4c8444482eba","Type":"ContainerStarted","Data":"be80402e59462b81f933aed89b86686fa0800a1dc0e0ebc451ecacc382eac276"} Dec 02 23:20:09 crc kubenswrapper[4903]: I1202 23:20:09.602689 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ac51dfe7-7ebc-4296-874e-5669f01d115a","Type":"ContainerDied","Data":"f8614f1aab9662a1e08b6ac2131c4bc0c27234b946da5bc5d184a0b7b3228223"} Dec 02 23:20:09 crc kubenswrapper[4903]: I1202 23:20:09.602741 4903 scope.go:117] "RemoveContainer" containerID="c3997569e826d69f5e39884db1394ef2592a998b8a9cc79407dc081c1ef0b1d0" Dec 02 23:20:09 crc kubenswrapper[4903]: I1202 23:20:09.602835 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 23:20:09 crc kubenswrapper[4903]: I1202 23:20:09.629584 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c14f1d8c-e6c3-450a-a1ad-d580a33db0a4" path="/var/lib/kubelet/pods/c14f1d8c-e6c3-450a-a1ad-d580a33db0a4/volumes" Dec 02 23:20:09 crc kubenswrapper[4903]: I1202 23:20:09.634510 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5c694f3-bd40-44e8-b076-e0d20b840329" path="/var/lib/kubelet/pods/e5c694f3-bd40-44e8-b076-e0d20b840329/volumes" Dec 02 23:20:09 crc kubenswrapper[4903]: I1202 23:20:09.659511 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 23:20:09 crc kubenswrapper[4903]: I1202 23:20:09.692559 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 23:20:09 crc kubenswrapper[4903]: W1202 23:20:09.695865 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0018a95_dc74_4511_ade4_c77e4846f0a0.slice/crio-76302cd49c72719af4fb5afc8ad40834877524a0393c7b6783c67c4d7e722ca4 WatchSource:0}: Error finding container 76302cd49c72719af4fb5afc8ad40834877524a0393c7b6783c67c4d7e722ca4: Status 404 returned error can't find the container with id 76302cd49c72719af4fb5afc8ad40834877524a0393c7b6783c67c4d7e722ca4 Dec 02 23:20:09 crc kubenswrapper[4903]: I1202 23:20:09.711948 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 23:20:09 crc kubenswrapper[4903]: I1202 23:20:09.723464 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 23:20:09 crc kubenswrapper[4903]: E1202 23:20:09.724127 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac51dfe7-7ebc-4296-874e-5669f01d115a" containerName="nova-scheduler-scheduler" Dec 02 23:20:09 crc kubenswrapper[4903]: I1202 23:20:09.724153 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac51dfe7-7ebc-4296-874e-5669f01d115a" containerName="nova-scheduler-scheduler" Dec 02 23:20:09 crc kubenswrapper[4903]: I1202 23:20:09.724506 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac51dfe7-7ebc-4296-874e-5669f01d115a" containerName="nova-scheduler-scheduler" Dec 02 23:20:09 crc kubenswrapper[4903]: I1202 23:20:09.725677 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 23:20:09 crc kubenswrapper[4903]: I1202 23:20:09.731625 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 02 23:20:09 crc kubenswrapper[4903]: I1202 23:20:09.737090 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 23:20:09 crc kubenswrapper[4903]: I1202 23:20:09.802224 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfe7a458-659b-465b-8ab9-712e3a865820-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dfe7a458-659b-465b-8ab9-712e3a865820\") " pod="openstack/nova-scheduler-0" Dec 02 23:20:09 crc kubenswrapper[4903]: I1202 23:20:09.802576 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gblvc\" (UniqueName: \"kubernetes.io/projected/dfe7a458-659b-465b-8ab9-712e3a865820-kube-api-access-gblvc\") pod \"nova-scheduler-0\" (UID: \"dfe7a458-659b-465b-8ab9-712e3a865820\") " pod="openstack/nova-scheduler-0" Dec 02 23:20:09 crc kubenswrapper[4903]: I1202 23:20:09.802746 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfe7a458-659b-465b-8ab9-712e3a865820-config-data\") pod \"nova-scheduler-0\" (UID: \"dfe7a458-659b-465b-8ab9-712e3a865820\") " pod="openstack/nova-scheduler-0" Dec 02 23:20:09 crc kubenswrapper[4903]: I1202 23:20:09.904705 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gblvc\" (UniqueName: \"kubernetes.io/projected/dfe7a458-659b-465b-8ab9-712e3a865820-kube-api-access-gblvc\") pod \"nova-scheduler-0\" (UID: \"dfe7a458-659b-465b-8ab9-712e3a865820\") " pod="openstack/nova-scheduler-0" Dec 02 23:20:09 crc kubenswrapper[4903]: I1202 23:20:09.905214 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfe7a458-659b-465b-8ab9-712e3a865820-config-data\") pod \"nova-scheduler-0\" (UID: \"dfe7a458-659b-465b-8ab9-712e3a865820\") " pod="openstack/nova-scheduler-0" Dec 02 23:20:09 crc kubenswrapper[4903]: I1202 23:20:09.905353 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfe7a458-659b-465b-8ab9-712e3a865820-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dfe7a458-659b-465b-8ab9-712e3a865820\") " pod="openstack/nova-scheduler-0" Dec 02 23:20:09 crc kubenswrapper[4903]: I1202 23:20:09.915035 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfe7a458-659b-465b-8ab9-712e3a865820-config-data\") pod \"nova-scheduler-0\" (UID: \"dfe7a458-659b-465b-8ab9-712e3a865820\") " pod="openstack/nova-scheduler-0" Dec 02 23:20:09 crc kubenswrapper[4903]: I1202 23:20:09.915892 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfe7a458-659b-465b-8ab9-712e3a865820-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dfe7a458-659b-465b-8ab9-712e3a865820\") " pod="openstack/nova-scheduler-0" Dec 02 23:20:09 crc kubenswrapper[4903]: I1202 23:20:09.928547 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gblvc\" (UniqueName: 
\"kubernetes.io/projected/dfe7a458-659b-465b-8ab9-712e3a865820-kube-api-access-gblvc\") pod \"nova-scheduler-0\" (UID: \"dfe7a458-659b-465b-8ab9-712e3a865820\") " pod="openstack/nova-scheduler-0" Dec 02 23:20:10 crc kubenswrapper[4903]: I1202 23:20:10.057086 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 23:20:10 crc kubenswrapper[4903]: I1202 23:20:10.547062 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 23:20:10 crc kubenswrapper[4903]: I1202 23:20:10.645461 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a0018a95-dc74-4511-ade4-c77e4846f0a0","Type":"ContainerStarted","Data":"ba7d52284b55178c4fb652527766d243e226da02a04dd2724054e4b02f568e2e"} Dec 02 23:20:10 crc kubenswrapper[4903]: I1202 23:20:10.645544 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a0018a95-dc74-4511-ade4-c77e4846f0a0","Type":"ContainerStarted","Data":"c62c811c04a5f62f43adb6ea76828f41d4a73c97ff2e67cf74f03c2891dd0aca"} Dec 02 23:20:10 crc kubenswrapper[4903]: I1202 23:20:10.645575 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a0018a95-dc74-4511-ade4-c77e4846f0a0","Type":"ContainerStarted","Data":"76302cd49c72719af4fb5afc8ad40834877524a0393c7b6783c67c4d7e722ca4"} Dec 02 23:20:10 crc kubenswrapper[4903]: I1202 23:20:10.656115 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7235be43-b81b-4894-a75b-4c8444482eba","Type":"ContainerStarted","Data":"a02cd7b88bcc9df9480ddf46cbc81ad07e40e6892ef2b5dba81424d412cc848f"} Dec 02 23:20:10 crc kubenswrapper[4903]: I1202 23:20:10.656197 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7235be43-b81b-4894-a75b-4c8444482eba","Type":"ContainerStarted","Data":"4a704cccd13e60cf9030855eb9af1dde7a2d70905655f0c5a6e847cc660a4435"} Dec 02 23:20:10 crc kubenswrapper[4903]: I1202 23:20:10.658227 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dfe7a458-659b-465b-8ab9-712e3a865820","Type":"ContainerStarted","Data":"18e9af5c4f246c52c318527737321edd9ad3e82ff5fbc6c0f6e8f7216b5f96a1"} Dec 02 23:20:10 crc kubenswrapper[4903]: I1202 23:20:10.698648 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.698626487 podStartE2EDuration="2.698626487s" podCreationTimestamp="2025-12-02 23:20:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:20:10.675488007 +0000 UTC m=+1349.384042300" watchObservedRunningTime="2025-12-02 23:20:10.698626487 +0000 UTC m=+1349.407180780" Dec 02 23:20:10 crc kubenswrapper[4903]: I1202 23:20:10.698993 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.6989885449999997 podStartE2EDuration="2.698988545s" podCreationTimestamp="2025-12-02 23:20:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:20:10.689813694 +0000 UTC m=+1349.398367987" watchObservedRunningTime="2025-12-02 23:20:10.698988545 +0000 UTC m=+1349.407542838" Dec 02 23:20:11 crc kubenswrapper[4903]: I1202 23:20:11.627442 4903 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="ac51dfe7-7ebc-4296-874e-5669f01d115a" path="/var/lib/kubelet/pods/ac51dfe7-7ebc-4296-874e-5669f01d115a/volumes" Dec 02 23:20:11 crc kubenswrapper[4903]: I1202 23:20:11.670220 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dfe7a458-659b-465b-8ab9-712e3a865820","Type":"ContainerStarted","Data":"010748afaebe49b13c34b841f4ab741d7b7c5ab7a3a0650f58aef36b3f7b4e13"} Dec 02 23:20:11 crc kubenswrapper[4903]: I1202 23:20:11.700511 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.70048499 podStartE2EDuration="2.70048499s" podCreationTimestamp="2025-12-02 23:20:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:20:11.693510571 +0000 UTC m=+1350.402064854" watchObservedRunningTime="2025-12-02 23:20:11.70048499 +0000 UTC m=+1350.409039283" Dec 02 23:20:14 crc kubenswrapper[4903]: I1202 23:20:14.090502 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 23:20:14 crc kubenswrapper[4903]: I1202 23:20:14.091027 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 23:20:15 crc kubenswrapper[4903]: I1202 23:20:15.057435 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 02 23:20:19 crc kubenswrapper[4903]: I1202 23:20:19.044443 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 23:20:19 crc kubenswrapper[4903]: I1202 23:20:19.045131 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 23:20:19 crc kubenswrapper[4903]: I1202 23:20:19.090924 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 02 23:20:19 crc kubenswrapper[4903]: I1202 23:20:19.091161 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 02 23:20:20 crc kubenswrapper[4903]: I1202 23:20:20.057549 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 02 23:20:20 crc kubenswrapper[4903]: I1202 23:20:20.064878 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7235be43-b81b-4894-a75b-4c8444482eba" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.220:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 23:20:20 crc kubenswrapper[4903]: I1202 23:20:20.064893 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7235be43-b81b-4894-a75b-4c8444482eba" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.220:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 23:20:20 crc kubenswrapper[4903]: I1202 23:20:20.095260 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 02 23:20:20 crc kubenswrapper[4903]: I1202 23:20:20.107928 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a0018a95-dc74-4511-ade4-c77e4846f0a0" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.221:8775/\": net/http: request 
canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 23:20:20 crc kubenswrapper[4903]: I1202 23:20:20.108329 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a0018a95-dc74-4511-ade4-c77e4846f0a0" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.221:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 23:20:20 crc kubenswrapper[4903]: I1202 23:20:20.820453 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 02 23:20:23 crc kubenswrapper[4903]: I1202 23:20:23.070088 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:20:23 crc kubenswrapper[4903]: I1202 23:20:23.070401 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:20:23 crc kubenswrapper[4903]: I1202 23:20:23.070451 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" Dec 02 23:20:23 crc kubenswrapper[4903]: I1202 23:20:23.071299 4903 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"74d6b77c13dd80086b2c813f620edffd3efb1418a840277ffcbe0978aaf6798a"} pod="openshift-machine-config-operator/machine-config-daemon-snl4q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 23:20:23 crc kubenswrapper[4903]: I1202 23:20:23.071368 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" containerID="cri-o://74d6b77c13dd80086b2c813f620edffd3efb1418a840277ffcbe0978aaf6798a" gracePeriod=600 Dec 02 23:20:23 crc kubenswrapper[4903]: I1202 23:20:23.825072 4903 generic.go:334] "Generic (PLEG): container finished" podID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerID="74d6b77c13dd80086b2c813f620edffd3efb1418a840277ffcbe0978aaf6798a" exitCode=0 Dec 02 23:20:23 crc kubenswrapper[4903]: I1202 23:20:23.825134 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerDied","Data":"74d6b77c13dd80086b2c813f620edffd3efb1418a840277ffcbe0978aaf6798a"} Dec 02 23:20:23 crc kubenswrapper[4903]: I1202 23:20:23.825527 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerStarted","Data":"eea089a1153012030d0c185f960d3bfe8298bb2908765149f2079ec873b323eb"} Dec 02 23:20:23 crc kubenswrapper[4903]: I1202 23:20:23.825558 4903 scope.go:117] "RemoveContainer" containerID="3aacdc4cb2887eb67cb12a1770fb101e2569b94da3ca1d528e9eafde11a8e5a7" Dec 02 23:20:26 crc kubenswrapper[4903]: I1202 23:20:26.846227 
4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 02 23:20:29 crc kubenswrapper[4903]: I1202 23:20:29.053778 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 02 23:20:29 crc kubenswrapper[4903]: I1202 23:20:29.054884 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 02 23:20:29 crc kubenswrapper[4903]: I1202 23:20:29.066187 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 02 23:20:29 crc kubenswrapper[4903]: I1202 23:20:29.068065 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 02 23:20:29 crc kubenswrapper[4903]: I1202 23:20:29.103582 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 02 23:20:29 crc kubenswrapper[4903]: I1202 23:20:29.109248 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 02 23:20:29 crc kubenswrapper[4903]: I1202 23:20:29.123174 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 02 23:20:29 crc kubenswrapper[4903]: I1202 23:20:29.903411 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 02 23:20:29 crc kubenswrapper[4903]: I1202 23:20:29.993461 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 02 23:20:29 crc kubenswrapper[4903]: I1202 23:20:29.999914 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 02 23:20:38 crc kubenswrapper[4903]: I1202 23:20:38.754094 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 23:20:39 crc kubenswrapper[4903]: I1202 23:20:39.624335 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 23:20:42 crc kubenswrapper[4903]: I1202 23:20:42.018715 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="1743f362-cc56-4c25-a31d-7a78f269f570" containerName="rabbitmq" containerID="cri-o://9b91e61570f202fd2ee2c7c0c68382ef2a9304589bdb5be1b280ca88289d7b63" gracePeriod=604797 Dec 02 23:20:42 crc kubenswrapper[4903]: I1202 23:20:42.868761 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="3afcfb6b-f7ce-424a-be67-3ef69a367fdb" containerName="rabbitmq" containerID="cri-o://b52ebe9146c4378e8ec57e9c80e0d8b8f1dd2eab2895018f02feec819c9625be" gracePeriod=604797 Dec 02 23:20:43 crc kubenswrapper[4903]: I1202 23:20:43.589177 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 23:20:43 crc kubenswrapper[4903]: I1202 23:20:43.748027 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1743f362-cc56-4c25-a31d-7a78f269f570-rabbitmq-erlang-cookie\") pod \"1743f362-cc56-4c25-a31d-7a78f269f570\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " Dec 02 23:20:43 crc kubenswrapper[4903]: I1202 23:20:43.748553 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1743f362-cc56-4c25-a31d-7a78f269f570-rabbitmq-confd\") pod \"1743f362-cc56-4c25-a31d-7a78f269f570\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " Dec 02 23:20:43 crc kubenswrapper[4903]: I1202 23:20:43.748556 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1743f362-cc56-4c25-a31d-7a78f269f570-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "1743f362-cc56-4c25-a31d-7a78f269f570" (UID: "1743f362-cc56-4c25-a31d-7a78f269f570"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:20:43 crc kubenswrapper[4903]: I1202 23:20:43.748711 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1743f362-cc56-4c25-a31d-7a78f269f570-server-conf\") pod \"1743f362-cc56-4c25-a31d-7a78f269f570\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " Dec 02 23:20:43 crc kubenswrapper[4903]: I1202 23:20:43.748770 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1743f362-cc56-4c25-a31d-7a78f269f570-rabbitmq-tls\") pod \"1743f362-cc56-4c25-a31d-7a78f269f570\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " Dec 02 23:20:43 crc kubenswrapper[4903]: I1202 23:20:43.748883 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"1743f362-cc56-4c25-a31d-7a78f269f570\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " Dec 02 23:20:43 crc kubenswrapper[4903]: I1202 23:20:43.748972 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1743f362-cc56-4c25-a31d-7a78f269f570-pod-info\") pod \"1743f362-cc56-4c25-a31d-7a78f269f570\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " Dec 02 23:20:43 crc kubenswrapper[4903]: I1202 23:20:43.749033 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcvgr\" (UniqueName: \"kubernetes.io/projected/1743f362-cc56-4c25-a31d-7a78f269f570-kube-api-access-dcvgr\") pod \"1743f362-cc56-4c25-a31d-7a78f269f570\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " Dec 02 23:20:43 crc kubenswrapper[4903]: I1202 23:20:43.749085 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1743f362-cc56-4c25-a31d-7a78f269f570-config-data\") pod \"1743f362-cc56-4c25-a31d-7a78f269f570\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " Dec 02 23:20:43 crc kubenswrapper[4903]: I1202 23:20:43.749142 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/1743f362-cc56-4c25-a31d-7a78f269f570-rabbitmq-plugins\") pod \"1743f362-cc56-4c25-a31d-7a78f269f570\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " Dec 02 23:20:43 crc kubenswrapper[4903]: I1202 23:20:43.749229 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1743f362-cc56-4c25-a31d-7a78f269f570-erlang-cookie-secret\") pod \"1743f362-cc56-4c25-a31d-7a78f269f570\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " Dec 02 23:20:43 crc kubenswrapper[4903]: I1202 23:20:43.749297 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1743f362-cc56-4c25-a31d-7a78f269f570-plugins-conf\") pod \"1743f362-cc56-4c25-a31d-7a78f269f570\" (UID: \"1743f362-cc56-4c25-a31d-7a78f269f570\") " Dec 02 23:20:43 crc kubenswrapper[4903]: I1202 23:20:43.750081 4903 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1743f362-cc56-4c25-a31d-7a78f269f570-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:43 crc kubenswrapper[4903]: I1202 23:20:43.751240 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1743f362-cc56-4c25-a31d-7a78f269f570-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "1743f362-cc56-4c25-a31d-7a78f269f570" (UID: "1743f362-cc56-4c25-a31d-7a78f269f570"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:20:43 crc kubenswrapper[4903]: I1202 23:20:43.758789 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1743f362-cc56-4c25-a31d-7a78f269f570-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "1743f362-cc56-4c25-a31d-7a78f269f570" (UID: "1743f362-cc56-4c25-a31d-7a78f269f570"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:20:43 crc kubenswrapper[4903]: I1202 23:20:43.761547 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "1743f362-cc56-4c25-a31d-7a78f269f570" (UID: "1743f362-cc56-4c25-a31d-7a78f269f570"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 23:20:43 crc kubenswrapper[4903]: I1202 23:20:43.762571 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1743f362-cc56-4c25-a31d-7a78f269f570-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "1743f362-cc56-4c25-a31d-7a78f269f570" (UID: "1743f362-cc56-4c25-a31d-7a78f269f570"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:20:43 crc kubenswrapper[4903]: I1202 23:20:43.763255 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1743f362-cc56-4c25-a31d-7a78f269f570-kube-api-access-dcvgr" (OuterVolumeSpecName: "kube-api-access-dcvgr") pod "1743f362-cc56-4c25-a31d-7a78f269f570" (UID: "1743f362-cc56-4c25-a31d-7a78f269f570"). InnerVolumeSpecName "kube-api-access-dcvgr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:20:43 crc kubenswrapper[4903]: I1202 23:20:43.763505 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/1743f362-cc56-4c25-a31d-7a78f269f570-pod-info" (OuterVolumeSpecName: "pod-info") pod "1743f362-cc56-4c25-a31d-7a78f269f570" (UID: "1743f362-cc56-4c25-a31d-7a78f269f570"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 02 23:20:43 crc kubenswrapper[4903]: I1202 23:20:43.767494 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1743f362-cc56-4c25-a31d-7a78f269f570-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "1743f362-cc56-4c25-a31d-7a78f269f570" (UID: "1743f362-cc56-4c25-a31d-7a78f269f570"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:20:43 crc kubenswrapper[4903]: I1202 23:20:43.792858 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1743f362-cc56-4c25-a31d-7a78f269f570-config-data" (OuterVolumeSpecName: "config-data") pod "1743f362-cc56-4c25-a31d-7a78f269f570" (UID: "1743f362-cc56-4c25-a31d-7a78f269f570"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:20:43 crc kubenswrapper[4903]: I1202 23:20:43.851636 4903 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1743f362-cc56-4c25-a31d-7a78f269f570-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:43 crc kubenswrapper[4903]: I1202 23:20:43.851707 4903 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 02 23:20:43 crc kubenswrapper[4903]: I1202 23:20:43.851717 4903 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1743f362-cc56-4c25-a31d-7a78f269f570-pod-info\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:43 crc kubenswrapper[4903]: I1202 23:20:43.851726 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcvgr\" (UniqueName: \"kubernetes.io/projected/1743f362-cc56-4c25-a31d-7a78f269f570-kube-api-access-dcvgr\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:43 crc kubenswrapper[4903]: I1202 23:20:43.851739 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1743f362-cc56-4c25-a31d-7a78f269f570-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:43 crc kubenswrapper[4903]: I1202 23:20:43.851747 4903 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1743f362-cc56-4c25-a31d-7a78f269f570-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:43 crc kubenswrapper[4903]: I1202 23:20:43.851757 4903 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1743f362-cc56-4c25-a31d-7a78f269f570-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:43 crc kubenswrapper[4903]: I1202 23:20:43.851765 4903 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1743f362-cc56-4c25-a31d-7a78f269f570-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:43 crc kubenswrapper[4903]: I1202 
23:20:43.884432 4903 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 02 23:20:43 crc kubenswrapper[4903]: I1202 23:20:43.886077 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1743f362-cc56-4c25-a31d-7a78f269f570-server-conf" (OuterVolumeSpecName: "server-conf") pod "1743f362-cc56-4c25-a31d-7a78f269f570" (UID: "1743f362-cc56-4c25-a31d-7a78f269f570"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:20:43 crc kubenswrapper[4903]: I1202 23:20:43.933921 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1743f362-cc56-4c25-a31d-7a78f269f570-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "1743f362-cc56-4c25-a31d-7a78f269f570" (UID: "1743f362-cc56-4c25-a31d-7a78f269f570"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:20:43 crc kubenswrapper[4903]: I1202 23:20:43.952995 4903 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:43 crc kubenswrapper[4903]: I1202 23:20:43.953025 4903 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1743f362-cc56-4c25-a31d-7a78f269f570-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:43 crc kubenswrapper[4903]: I1202 23:20:43.953035 4903 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1743f362-cc56-4c25-a31d-7a78f269f570-server-conf\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.129393 4903 generic.go:334] "Generic (PLEG): container finished" podID="1743f362-cc56-4c25-a31d-7a78f269f570" containerID="9b91e61570f202fd2ee2c7c0c68382ef2a9304589bdb5be1b280ca88289d7b63" exitCode=0 Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.129440 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1743f362-cc56-4c25-a31d-7a78f269f570","Type":"ContainerDied","Data":"9b91e61570f202fd2ee2c7c0c68382ef2a9304589bdb5be1b280ca88289d7b63"} Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.129467 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1743f362-cc56-4c25-a31d-7a78f269f570","Type":"ContainerDied","Data":"81ca914cc8757707bf88b5061d4f0283fee1578c60eaf59983c41918a7d4a0dd"} Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.129482 4903 scope.go:117] "RemoveContainer" containerID="9b91e61570f202fd2ee2c7c0c68382ef2a9304589bdb5be1b280ca88289d7b63" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.129641 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.180178 4903 scope.go:117] "RemoveContainer" containerID="474d15e8fbcfcc157cc2499a2900124e9055f517115b690a5ff37221baaeeef1" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.197166 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.203821 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.219471 4903 scope.go:117] "RemoveContainer" containerID="9b91e61570f202fd2ee2c7c0c68382ef2a9304589bdb5be1b280ca88289d7b63" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.219920 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 23:20:44 crc kubenswrapper[4903]: E1202 23:20:44.220335 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1743f362-cc56-4c25-a31d-7a78f269f570" containerName="rabbitmq" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.220353 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="1743f362-cc56-4c25-a31d-7a78f269f570" containerName="rabbitmq" Dec 02 23:20:44 crc kubenswrapper[4903]: E1202 23:20:44.220380 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1743f362-cc56-4c25-a31d-7a78f269f570" containerName="setup-container" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.220387 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="1743f362-cc56-4c25-a31d-7a78f269f570" containerName="setup-container" Dec 02 23:20:44 crc kubenswrapper[4903]: E1202 23:20:44.220403 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b91e61570f202fd2ee2c7c0c68382ef2a9304589bdb5be1b280ca88289d7b63\": container with ID starting with 9b91e61570f202fd2ee2c7c0c68382ef2a9304589bdb5be1b280ca88289d7b63 not found: ID does not exist" containerID="9b91e61570f202fd2ee2c7c0c68382ef2a9304589bdb5be1b280ca88289d7b63" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.220432 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b91e61570f202fd2ee2c7c0c68382ef2a9304589bdb5be1b280ca88289d7b63"} err="failed to get container status \"9b91e61570f202fd2ee2c7c0c68382ef2a9304589bdb5be1b280ca88289d7b63\": rpc error: code = NotFound desc = could not find container \"9b91e61570f202fd2ee2c7c0c68382ef2a9304589bdb5be1b280ca88289d7b63\": container with ID starting with 9b91e61570f202fd2ee2c7c0c68382ef2a9304589bdb5be1b280ca88289d7b63 not found: ID does not exist" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.220451 4903 scope.go:117] "RemoveContainer" containerID="474d15e8fbcfcc157cc2499a2900124e9055f517115b690a5ff37221baaeeef1" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.220586 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="1743f362-cc56-4c25-a31d-7a78f269f570" containerName="rabbitmq" Dec 02 23:20:44 crc kubenswrapper[4903]: E1202 23:20:44.221454 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"474d15e8fbcfcc157cc2499a2900124e9055f517115b690a5ff37221baaeeef1\": container with ID starting with 474d15e8fbcfcc157cc2499a2900124e9055f517115b690a5ff37221baaeeef1 not found: ID does not exist" containerID="474d15e8fbcfcc157cc2499a2900124e9055f517115b690a5ff37221baaeeef1" Dec 
02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.221502 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"474d15e8fbcfcc157cc2499a2900124e9055f517115b690a5ff37221baaeeef1"} err="failed to get container status \"474d15e8fbcfcc157cc2499a2900124e9055f517115b690a5ff37221baaeeef1\": rpc error: code = NotFound desc = could not find container \"474d15e8fbcfcc157cc2499a2900124e9055f517115b690a5ff37221baaeeef1\": container with ID starting with 474d15e8fbcfcc157cc2499a2900124e9055f517115b690a5ff37221baaeeef1 not found: ID does not exist" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.222270 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.226935 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.229956 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.232518 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.232837 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.232943 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.233178 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-6h5p2" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.234883 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.236508 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.380737 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/96f2e452-05fe-45c6-940b-5a53959af002-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"96f2e452-05fe-45c6-940b-5a53959af002\") " pod="openstack/rabbitmq-server-0" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.381025 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96f2e452-05fe-45c6-940b-5a53959af002-config-data\") pod \"rabbitmq-server-0\" (UID: \"96f2e452-05fe-45c6-940b-5a53959af002\") " pod="openstack/rabbitmq-server-0" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.381055 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/96f2e452-05fe-45c6-940b-5a53959af002-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"96f2e452-05fe-45c6-940b-5a53959af002\") " pod="openstack/rabbitmq-server-0" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.381087 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/96f2e452-05fe-45c6-940b-5a53959af002-server-conf\") pod \"rabbitmq-server-0\" 
(UID: \"96f2e452-05fe-45c6-940b-5a53959af002\") " pod="openstack/rabbitmq-server-0" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.381127 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/96f2e452-05fe-45c6-940b-5a53959af002-pod-info\") pod \"rabbitmq-server-0\" (UID: \"96f2e452-05fe-45c6-940b-5a53959af002\") " pod="openstack/rabbitmq-server-0" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.381152 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"96f2e452-05fe-45c6-940b-5a53959af002\") " pod="openstack/rabbitmq-server-0" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.381182 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc66g\" (UniqueName: \"kubernetes.io/projected/96f2e452-05fe-45c6-940b-5a53959af002-kube-api-access-cc66g\") pod \"rabbitmq-server-0\" (UID: \"96f2e452-05fe-45c6-940b-5a53959af002\") " pod="openstack/rabbitmq-server-0" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.381201 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/96f2e452-05fe-45c6-940b-5a53959af002-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"96f2e452-05fe-45c6-940b-5a53959af002\") " pod="openstack/rabbitmq-server-0" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.381240 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/96f2e452-05fe-45c6-940b-5a53959af002-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"96f2e452-05fe-45c6-940b-5a53959af002\") " pod="openstack/rabbitmq-server-0" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.381276 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/96f2e452-05fe-45c6-940b-5a53959af002-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"96f2e452-05fe-45c6-940b-5a53959af002\") " pod="openstack/rabbitmq-server-0" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.381309 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/96f2e452-05fe-45c6-940b-5a53959af002-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"96f2e452-05fe-45c6-940b-5a53959af002\") " pod="openstack/rabbitmq-server-0" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.483549 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/96f2e452-05fe-45c6-940b-5a53959af002-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"96f2e452-05fe-45c6-940b-5a53959af002\") " pod="openstack/rabbitmq-server-0" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.483620 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/96f2e452-05fe-45c6-940b-5a53959af002-server-conf\") pod \"rabbitmq-server-0\" (UID: \"96f2e452-05fe-45c6-940b-5a53959af002\") " pod="openstack/rabbitmq-server-0" Dec 02 23:20:44 crc 
kubenswrapper[4903]: I1202 23:20:44.483712 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/96f2e452-05fe-45c6-940b-5a53959af002-pod-info\") pod \"rabbitmq-server-0\" (UID: \"96f2e452-05fe-45c6-940b-5a53959af002\") " pod="openstack/rabbitmq-server-0" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.483752 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"96f2e452-05fe-45c6-940b-5a53959af002\") " pod="openstack/rabbitmq-server-0" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.483797 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc66g\" (UniqueName: \"kubernetes.io/projected/96f2e452-05fe-45c6-940b-5a53959af002-kube-api-access-cc66g\") pod \"rabbitmq-server-0\" (UID: \"96f2e452-05fe-45c6-940b-5a53959af002\") " pod="openstack/rabbitmq-server-0" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.483830 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/96f2e452-05fe-45c6-940b-5a53959af002-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"96f2e452-05fe-45c6-940b-5a53959af002\") " pod="openstack/rabbitmq-server-0" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.483893 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/96f2e452-05fe-45c6-940b-5a53959af002-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"96f2e452-05fe-45c6-940b-5a53959af002\") " pod="openstack/rabbitmq-server-0" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.483944 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/96f2e452-05fe-45c6-940b-5a53959af002-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"96f2e452-05fe-45c6-940b-5a53959af002\") " pod="openstack/rabbitmq-server-0" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.483982 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/96f2e452-05fe-45c6-940b-5a53959af002-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"96f2e452-05fe-45c6-940b-5a53959af002\") " pod="openstack/rabbitmq-server-0" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.484026 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/96f2e452-05fe-45c6-940b-5a53959af002-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"96f2e452-05fe-45c6-940b-5a53959af002\") " pod="openstack/rabbitmq-server-0" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.484069 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96f2e452-05fe-45c6-940b-5a53959af002-config-data\") pod \"rabbitmq-server-0\" (UID: \"96f2e452-05fe-45c6-940b-5a53959af002\") " pod="openstack/rabbitmq-server-0" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.484080 4903 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: 
\"96f2e452-05fe-45c6-940b-5a53959af002\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-server-0" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.484429 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/96f2e452-05fe-45c6-940b-5a53959af002-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"96f2e452-05fe-45c6-940b-5a53959af002\") " pod="openstack/rabbitmq-server-0" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.484943 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96f2e452-05fe-45c6-940b-5a53959af002-config-data\") pod \"rabbitmq-server-0\" (UID: \"96f2e452-05fe-45c6-940b-5a53959af002\") " pod="openstack/rabbitmq-server-0" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.485129 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/96f2e452-05fe-45c6-940b-5a53959af002-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"96f2e452-05fe-45c6-940b-5a53959af002\") " pod="openstack/rabbitmq-server-0" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.486074 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/96f2e452-05fe-45c6-940b-5a53959af002-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"96f2e452-05fe-45c6-940b-5a53959af002\") " pod="openstack/rabbitmq-server-0" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.487163 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/96f2e452-05fe-45c6-940b-5a53959af002-server-conf\") pod \"rabbitmq-server-0\" (UID: \"96f2e452-05fe-45c6-940b-5a53959af002\") " pod="openstack/rabbitmq-server-0" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.488788 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/96f2e452-05fe-45c6-940b-5a53959af002-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"96f2e452-05fe-45c6-940b-5a53959af002\") " pod="openstack/rabbitmq-server-0" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.490647 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/96f2e452-05fe-45c6-940b-5a53959af002-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"96f2e452-05fe-45c6-940b-5a53959af002\") " pod="openstack/rabbitmq-server-0" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.491058 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/96f2e452-05fe-45c6-940b-5a53959af002-pod-info\") pod \"rabbitmq-server-0\" (UID: \"96f2e452-05fe-45c6-940b-5a53959af002\") " pod="openstack/rabbitmq-server-0" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.495205 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/96f2e452-05fe-45c6-940b-5a53959af002-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"96f2e452-05fe-45c6-940b-5a53959af002\") " pod="openstack/rabbitmq-server-0" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.504545 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc66g\" (UniqueName: 
\"kubernetes.io/projected/96f2e452-05fe-45c6-940b-5a53959af002-kube-api-access-cc66g\") pod \"rabbitmq-server-0\" (UID: \"96f2e452-05fe-45c6-940b-5a53959af002\") " pod="openstack/rabbitmq-server-0" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.567596 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"96f2e452-05fe-45c6-940b-5a53959af002\") " pod="openstack/rabbitmq-server-0" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.632560 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.797977 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.798106 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-rabbitmq-confd\") pod \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.798159 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-rabbitmq-erlang-cookie\") pod \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.798209 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-config-data\") pod \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.798265 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpzrq\" (UniqueName: \"kubernetes.io/projected/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-kube-api-access-wpzrq\") pod \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.798313 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-plugins-conf\") pod \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.798368 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-erlang-cookie-secret\") pod \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.798429 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-rabbitmq-tls\") pod \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " Dec 02 
23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.798470 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-rabbitmq-plugins\") pod \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.798506 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-pod-info\") pod \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.798533 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-server-conf\") pod \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\" (UID: \"3afcfb6b-f7ce-424a-be67-3ef69a367fdb\") " Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.798793 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "3afcfb6b-f7ce-424a-be67-3ef69a367fdb" (UID: "3afcfb6b-f7ce-424a-be67-3ef69a367fdb"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.799164 4903 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.800263 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "3afcfb6b-f7ce-424a-be67-3ef69a367fdb" (UID: "3afcfb6b-f7ce-424a-be67-3ef69a367fdb"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.802698 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "3afcfb6b-f7ce-424a-be67-3ef69a367fdb" (UID: "3afcfb6b-f7ce-424a-be67-3ef69a367fdb"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.803003 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "3afcfb6b-f7ce-424a-be67-3ef69a367fdb" (UID: "3afcfb6b-f7ce-424a-be67-3ef69a367fdb"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.803211 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "3afcfb6b-f7ce-424a-be67-3ef69a367fdb" (UID: "3afcfb6b-f7ce-424a-be67-3ef69a367fdb"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.808000 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-kube-api-access-wpzrq" (OuterVolumeSpecName: "kube-api-access-wpzrq") pod "3afcfb6b-f7ce-424a-be67-3ef69a367fdb" (UID: "3afcfb6b-f7ce-424a-be67-3ef69a367fdb"). InnerVolumeSpecName "kube-api-access-wpzrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.811882 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "3afcfb6b-f7ce-424a-be67-3ef69a367fdb" (UID: "3afcfb6b-f7ce-424a-be67-3ef69a367fdb"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.811979 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-pod-info" (OuterVolumeSpecName: "pod-info") pod "3afcfb6b-f7ce-424a-be67-3ef69a367fdb" (UID: "3afcfb6b-f7ce-424a-be67-3ef69a367fdb"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.839295 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.843637 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-config-data" (OuterVolumeSpecName: "config-data") pod "3afcfb6b-f7ce-424a-be67-3ef69a367fdb" (UID: "3afcfb6b-f7ce-424a-be67-3ef69a367fdb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.862378 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-server-conf" (OuterVolumeSpecName: "server-conf") pod "3afcfb6b-f7ce-424a-be67-3ef69a367fdb" (UID: "3afcfb6b-f7ce-424a-be67-3ef69a367fdb"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.900484 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpzrq\" (UniqueName: \"kubernetes.io/projected/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-kube-api-access-wpzrq\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.900514 4903 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.900524 4903 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.900534 4903 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.900543 4903 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.900550 4903 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-pod-info\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.900557 4903 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-server-conf\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.900605 4903 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.900614 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.920765 4903 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Dec 02 23:20:44 crc kubenswrapper[4903]: I1202 23:20:44.948037 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "3afcfb6b-f7ce-424a-be67-3ef69a367fdb" (UID: "3afcfb6b-f7ce-424a-be67-3ef69a367fdb"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.002682 4903 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.002724 4903 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3afcfb6b-f7ce-424a-be67-3ef69a367fdb-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.142581 4903 generic.go:334] "Generic (PLEG): container finished" podID="3afcfb6b-f7ce-424a-be67-3ef69a367fdb" containerID="b52ebe9146c4378e8ec57e9c80e0d8b8f1dd2eab2895018f02feec819c9625be" exitCode=0 Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.142640 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.142674 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3afcfb6b-f7ce-424a-be67-3ef69a367fdb","Type":"ContainerDied","Data":"b52ebe9146c4378e8ec57e9c80e0d8b8f1dd2eab2895018f02feec819c9625be"} Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.142835 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3afcfb6b-f7ce-424a-be67-3ef69a367fdb","Type":"ContainerDied","Data":"a02a93d29467d0aef12905fadb2d077c02f88559bbd115e1471a4ef683fcfbfa"} Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.142869 4903 scope.go:117] "RemoveContainer" containerID="b52ebe9146c4378e8ec57e9c80e0d8b8f1dd2eab2895018f02feec819c9625be" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.169085 4903 scope.go:117] "RemoveContainer" containerID="904179dd08ee62104cd87da2ebd6abc096ad6bbc2f5839cc47a59381afe08e45" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.198958 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.216056 4903 scope.go:117] "RemoveContainer" containerID="b52ebe9146c4378e8ec57e9c80e0d8b8f1dd2eab2895018f02feec819c9625be" Dec 02 23:20:45 crc kubenswrapper[4903]: E1202 23:20:45.217144 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b52ebe9146c4378e8ec57e9c80e0d8b8f1dd2eab2895018f02feec819c9625be\": container with ID starting with b52ebe9146c4378e8ec57e9c80e0d8b8f1dd2eab2895018f02feec819c9625be not found: ID does not exist" containerID="b52ebe9146c4378e8ec57e9c80e0d8b8f1dd2eab2895018f02feec819c9625be" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.217186 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b52ebe9146c4378e8ec57e9c80e0d8b8f1dd2eab2895018f02feec819c9625be"} err="failed to get container status \"b52ebe9146c4378e8ec57e9c80e0d8b8f1dd2eab2895018f02feec819c9625be\": rpc error: code = NotFound desc = could not find container \"b52ebe9146c4378e8ec57e9c80e0d8b8f1dd2eab2895018f02feec819c9625be\": container with ID starting with b52ebe9146c4378e8ec57e9c80e0d8b8f1dd2eab2895018f02feec819c9625be not found: ID does not exist" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.217210 4903 scope.go:117] "RemoveContainer" 
containerID="904179dd08ee62104cd87da2ebd6abc096ad6bbc2f5839cc47a59381afe08e45" Dec 02 23:20:45 crc kubenswrapper[4903]: E1202 23:20:45.217619 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"904179dd08ee62104cd87da2ebd6abc096ad6bbc2f5839cc47a59381afe08e45\": container with ID starting with 904179dd08ee62104cd87da2ebd6abc096ad6bbc2f5839cc47a59381afe08e45 not found: ID does not exist" containerID="904179dd08ee62104cd87da2ebd6abc096ad6bbc2f5839cc47a59381afe08e45" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.217690 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"904179dd08ee62104cd87da2ebd6abc096ad6bbc2f5839cc47a59381afe08e45"} err="failed to get container status \"904179dd08ee62104cd87da2ebd6abc096ad6bbc2f5839cc47a59381afe08e45\": rpc error: code = NotFound desc = could not find container \"904179dd08ee62104cd87da2ebd6abc096ad6bbc2f5839cc47a59381afe08e45\": container with ID starting with 904179dd08ee62104cd87da2ebd6abc096ad6bbc2f5839cc47a59381afe08e45 not found: ID does not exist" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.230142 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.243392 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 23:20:45 crc kubenswrapper[4903]: E1202 23:20:45.243751 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3afcfb6b-f7ce-424a-be67-3ef69a367fdb" containerName="setup-container" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.243762 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="3afcfb6b-f7ce-424a-be67-3ef69a367fdb" containerName="setup-container" Dec 02 23:20:45 crc kubenswrapper[4903]: E1202 23:20:45.243779 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3afcfb6b-f7ce-424a-be67-3ef69a367fdb" containerName="rabbitmq" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.243784 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="3afcfb6b-f7ce-424a-be67-3ef69a367fdb" containerName="rabbitmq" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.243975 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="3afcfb6b-f7ce-424a-be67-3ef69a367fdb" containerName="rabbitmq" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.245091 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.245167 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.247513 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.247791 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.247956 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.248145 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.248757 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.248862 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-x7h96" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.250671 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.314557 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 23:20:45 crc kubenswrapper[4903]: W1202 23:20:45.317123 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96f2e452_05fe_45c6_940b_5a53959af002.slice/crio-df0cc8af2becdb19771cdc50621d6f06e0e3a4085e12602b345ca43ad5769ac7 WatchSource:0}: Error finding container df0cc8af2becdb19771cdc50621d6f06e0e3a4085e12602b345ca43ad5769ac7: Status 404 returned error can't find the container with id df0cc8af2becdb19771cdc50621d6f06e0e3a4085e12602b345ca43ad5769ac7 Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.410015 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/46968896-fe5c-4bf2-a304-51f818ae9cc5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"46968896-fe5c-4bf2-a304-51f818ae9cc5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.410068 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/46968896-fe5c-4bf2-a304-51f818ae9cc5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"46968896-fe5c-4bf2-a304-51f818ae9cc5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.410127 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/46968896-fe5c-4bf2-a304-51f818ae9cc5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"46968896-fe5c-4bf2-a304-51f818ae9cc5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.410169 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/46968896-fe5c-4bf2-a304-51f818ae9cc5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"46968896-fe5c-4bf2-a304-51f818ae9cc5\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.410212 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n9rr\" (UniqueName: \"kubernetes.io/projected/46968896-fe5c-4bf2-a304-51f818ae9cc5-kube-api-access-9n9rr\") pod \"rabbitmq-cell1-server-0\" (UID: \"46968896-fe5c-4bf2-a304-51f818ae9cc5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.410267 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"46968896-fe5c-4bf2-a304-51f818ae9cc5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.410335 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/46968896-fe5c-4bf2-a304-51f818ae9cc5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"46968896-fe5c-4bf2-a304-51f818ae9cc5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.410421 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/46968896-fe5c-4bf2-a304-51f818ae9cc5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"46968896-fe5c-4bf2-a304-51f818ae9cc5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.410477 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/46968896-fe5c-4bf2-a304-51f818ae9cc5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"46968896-fe5c-4bf2-a304-51f818ae9cc5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.410535 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/46968896-fe5c-4bf2-a304-51f818ae9cc5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"46968896-fe5c-4bf2-a304-51f818ae9cc5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.410598 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/46968896-fe5c-4bf2-a304-51f818ae9cc5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"46968896-fe5c-4bf2-a304-51f818ae9cc5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.512486 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/46968896-fe5c-4bf2-a304-51f818ae9cc5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"46968896-fe5c-4bf2-a304-51f818ae9cc5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.512538 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/46968896-fe5c-4bf2-a304-51f818ae9cc5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"46968896-fe5c-4bf2-a304-51f818ae9cc5\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.512590 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/46968896-fe5c-4bf2-a304-51f818ae9cc5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"46968896-fe5c-4bf2-a304-51f818ae9cc5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.512641 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/46968896-fe5c-4bf2-a304-51f818ae9cc5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"46968896-fe5c-4bf2-a304-51f818ae9cc5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.512742 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/46968896-fe5c-4bf2-a304-51f818ae9cc5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"46968896-fe5c-4bf2-a304-51f818ae9cc5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.512759 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/46968896-fe5c-4bf2-a304-51f818ae9cc5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"46968896-fe5c-4bf2-a304-51f818ae9cc5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.512790 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/46968896-fe5c-4bf2-a304-51f818ae9cc5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"46968896-fe5c-4bf2-a304-51f818ae9cc5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.512806 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/46968896-fe5c-4bf2-a304-51f818ae9cc5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"46968896-fe5c-4bf2-a304-51f818ae9cc5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.512830 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n9rr\" (UniqueName: \"kubernetes.io/projected/46968896-fe5c-4bf2-a304-51f818ae9cc5-kube-api-access-9n9rr\") pod \"rabbitmq-cell1-server-0\" (UID: \"46968896-fe5c-4bf2-a304-51f818ae9cc5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.512861 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"46968896-fe5c-4bf2-a304-51f818ae9cc5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.512896 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/46968896-fe5c-4bf2-a304-51f818ae9cc5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"46968896-fe5c-4bf2-a304-51f818ae9cc5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.513599 4903 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/46968896-fe5c-4bf2-a304-51f818ae9cc5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"46968896-fe5c-4bf2-a304-51f818ae9cc5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.513772 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/46968896-fe5c-4bf2-a304-51f818ae9cc5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"46968896-fe5c-4bf2-a304-51f818ae9cc5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.513853 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/46968896-fe5c-4bf2-a304-51f818ae9cc5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"46968896-fe5c-4bf2-a304-51f818ae9cc5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.513970 4903 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"46968896-fe5c-4bf2-a304-51f818ae9cc5\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.514832 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/46968896-fe5c-4bf2-a304-51f818ae9cc5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"46968896-fe5c-4bf2-a304-51f818ae9cc5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.517169 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/46968896-fe5c-4bf2-a304-51f818ae9cc5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"46968896-fe5c-4bf2-a304-51f818ae9cc5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.518355 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/46968896-fe5c-4bf2-a304-51f818ae9cc5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"46968896-fe5c-4bf2-a304-51f818ae9cc5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.523520 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/46968896-fe5c-4bf2-a304-51f818ae9cc5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"46968896-fe5c-4bf2-a304-51f818ae9cc5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.525221 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/46968896-fe5c-4bf2-a304-51f818ae9cc5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"46968896-fe5c-4bf2-a304-51f818ae9cc5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.526819 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/46968896-fe5c-4bf2-a304-51f818ae9cc5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"46968896-fe5c-4bf2-a304-51f818ae9cc5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.534535 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n9rr\" (UniqueName: \"kubernetes.io/projected/46968896-fe5c-4bf2-a304-51f818ae9cc5-kube-api-access-9n9rr\") pod \"rabbitmq-cell1-server-0\" (UID: \"46968896-fe5c-4bf2-a304-51f818ae9cc5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.547639 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"46968896-fe5c-4bf2-a304-51f818ae9cc5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.638779 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1743f362-cc56-4c25-a31d-7a78f269f570" path="/var/lib/kubelet/pods/1743f362-cc56-4c25-a31d-7a78f269f570/volumes" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.641300 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3afcfb6b-f7ce-424a-be67-3ef69a367fdb" path="/var/lib/kubelet/pods/3afcfb6b-f7ce-424a-be67-3ef69a367fdb/volumes" Dec 02 23:20:45 crc kubenswrapper[4903]: I1202 23:20:45.743281 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:20:46 crc kubenswrapper[4903]: I1202 23:20:46.167735 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"96f2e452-05fe-45c6-940b-5a53959af002","Type":"ContainerStarted","Data":"df0cc8af2becdb19771cdc50621d6f06e0e3a4085e12602b345ca43ad5769ac7"} Dec 02 23:20:46 crc kubenswrapper[4903]: W1202 23:20:46.207835 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46968896_fe5c_4bf2_a304_51f818ae9cc5.slice/crio-69949c4d6be32831d0225ce06aa17a42b3c49922f12f3c86ee694849060445e0 WatchSource:0}: Error finding container 69949c4d6be32831d0225ce06aa17a42b3c49922f12f3c86ee694849060445e0: Status 404 returned error can't find the container with id 69949c4d6be32831d0225ce06aa17a42b3c49922f12f3c86ee694849060445e0 Dec 02 23:20:46 crc kubenswrapper[4903]: I1202 23:20:46.212538 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 23:20:47 crc kubenswrapper[4903]: I1202 23:20:47.177988 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"96f2e452-05fe-45c6-940b-5a53959af002","Type":"ContainerStarted","Data":"a937fc27e5a9a6cbe1416357733e915eb00576c6ef08c2c169fce9109702be56"} Dec 02 23:20:47 crc kubenswrapper[4903]: I1202 23:20:47.180376 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"46968896-fe5c-4bf2-a304-51f818ae9cc5","Type":"ContainerStarted","Data":"69949c4d6be32831d0225ce06aa17a42b3c49922f12f3c86ee694849060445e0"} Dec 02 23:20:49 crc kubenswrapper[4903]: I1202 23:20:49.208791 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"46968896-fe5c-4bf2-a304-51f818ae9cc5","Type":"ContainerStarted","Data":"91b4f2c091775c9f7a256cd4f9a3c19b4029d2dc66cdbe3f3595404182fb9a79"} Dec 02 23:20:52 crc kubenswrapper[4903]: I1202 23:20:52.022493 4903 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-b759b9cd7-7bhs7"] Dec 02 23:20:52 crc kubenswrapper[4903]: I1202 23:20:52.025938 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b759b9cd7-7bhs7" Dec 02 23:20:52 crc kubenswrapper[4903]: I1202 23:20:52.028750 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 02 23:20:52 crc kubenswrapper[4903]: I1202 23:20:52.045178 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b759b9cd7-7bhs7"] Dec 02 23:20:52 crc kubenswrapper[4903]: I1202 23:20:52.086714 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlwrw\" (UniqueName: \"kubernetes.io/projected/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-kube-api-access-jlwrw\") pod \"dnsmasq-dns-b759b9cd7-7bhs7\" (UID: \"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6\") " pod="openstack/dnsmasq-dns-b759b9cd7-7bhs7" Dec 02 23:20:52 crc kubenswrapper[4903]: I1202 23:20:52.086779 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-dns-swift-storage-0\") pod \"dnsmasq-dns-b759b9cd7-7bhs7\" (UID: \"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6\") " pod="openstack/dnsmasq-dns-b759b9cd7-7bhs7" Dec 02 23:20:52 crc kubenswrapper[4903]: I1202 23:20:52.086825 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-ovsdbserver-sb\") pod \"dnsmasq-dns-b759b9cd7-7bhs7\" (UID: \"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6\") " pod="openstack/dnsmasq-dns-b759b9cd7-7bhs7" Dec 02 23:20:52 crc kubenswrapper[4903]: I1202 23:20:52.086862 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-config\") pod \"dnsmasq-dns-b759b9cd7-7bhs7\" (UID: \"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6\") " pod="openstack/dnsmasq-dns-b759b9cd7-7bhs7" Dec 02 23:20:52 crc kubenswrapper[4903]: I1202 23:20:52.086903 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-ovsdbserver-nb\") pod \"dnsmasq-dns-b759b9cd7-7bhs7\" (UID: \"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6\") " pod="openstack/dnsmasq-dns-b759b9cd7-7bhs7" Dec 02 23:20:52 crc kubenswrapper[4903]: I1202 23:20:52.086952 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-openstack-edpm-ipam\") pod \"dnsmasq-dns-b759b9cd7-7bhs7\" (UID: \"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6\") " pod="openstack/dnsmasq-dns-b759b9cd7-7bhs7" Dec 02 23:20:52 crc kubenswrapper[4903]: I1202 23:20:52.086980 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-dns-svc\") pod \"dnsmasq-dns-b759b9cd7-7bhs7\" (UID: \"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6\") " pod="openstack/dnsmasq-dns-b759b9cd7-7bhs7" Dec 02 23:20:52 crc kubenswrapper[4903]: I1202 23:20:52.188568 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-jlwrw\" (UniqueName: \"kubernetes.io/projected/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-kube-api-access-jlwrw\") pod \"dnsmasq-dns-b759b9cd7-7bhs7\" (UID: \"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6\") " pod="openstack/dnsmasq-dns-b759b9cd7-7bhs7" Dec 02 23:20:52 crc kubenswrapper[4903]: I1202 23:20:52.188844 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-dns-swift-storage-0\") pod \"dnsmasq-dns-b759b9cd7-7bhs7\" (UID: \"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6\") " pod="openstack/dnsmasq-dns-b759b9cd7-7bhs7" Dec 02 23:20:52 crc kubenswrapper[4903]: I1202 23:20:52.188956 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-ovsdbserver-sb\") pod \"dnsmasq-dns-b759b9cd7-7bhs7\" (UID: \"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6\") " pod="openstack/dnsmasq-dns-b759b9cd7-7bhs7" Dec 02 23:20:52 crc kubenswrapper[4903]: I1202 23:20:52.189070 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-config\") pod \"dnsmasq-dns-b759b9cd7-7bhs7\" (UID: \"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6\") " pod="openstack/dnsmasq-dns-b759b9cd7-7bhs7" Dec 02 23:20:52 crc kubenswrapper[4903]: I1202 23:20:52.189283 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-ovsdbserver-nb\") pod \"dnsmasq-dns-b759b9cd7-7bhs7\" (UID: \"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6\") " pod="openstack/dnsmasq-dns-b759b9cd7-7bhs7" Dec 02 23:20:52 crc kubenswrapper[4903]: I1202 23:20:52.190223 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-openstack-edpm-ipam\") pod \"dnsmasq-dns-b759b9cd7-7bhs7\" (UID: \"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6\") " pod="openstack/dnsmasq-dns-b759b9cd7-7bhs7" Dec 02 23:20:52 crc kubenswrapper[4903]: I1202 23:20:52.191031 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-dns-svc\") pod \"dnsmasq-dns-b759b9cd7-7bhs7\" (UID: \"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6\") " pod="openstack/dnsmasq-dns-b759b9cd7-7bhs7" Dec 02 23:20:52 crc kubenswrapper[4903]: I1202 23:20:52.190114 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-ovsdbserver-sb\") pod \"dnsmasq-dns-b759b9cd7-7bhs7\" (UID: \"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6\") " pod="openstack/dnsmasq-dns-b759b9cd7-7bhs7" Dec 02 23:20:52 crc kubenswrapper[4903]: I1202 23:20:52.190131 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-config\") pod \"dnsmasq-dns-b759b9cd7-7bhs7\" (UID: \"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6\") " pod="openstack/dnsmasq-dns-b759b9cd7-7bhs7" Dec 02 23:20:52 crc kubenswrapper[4903]: I1202 23:20:52.189737 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-dns-swift-storage-0\") pod \"dnsmasq-dns-b759b9cd7-7bhs7\" (UID: \"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6\") " pod="openstack/dnsmasq-dns-b759b9cd7-7bhs7" Dec 02 23:20:52 crc kubenswrapper[4903]: I1202 23:20:52.190976 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-openstack-edpm-ipam\") pod \"dnsmasq-dns-b759b9cd7-7bhs7\" (UID: \"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6\") " pod="openstack/dnsmasq-dns-b759b9cd7-7bhs7" Dec 02 23:20:52 crc kubenswrapper[4903]: I1202 23:20:52.190052 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-ovsdbserver-nb\") pod \"dnsmasq-dns-b759b9cd7-7bhs7\" (UID: \"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6\") " pod="openstack/dnsmasq-dns-b759b9cd7-7bhs7" Dec 02 23:20:52 crc kubenswrapper[4903]: I1202 23:20:52.191804 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-dns-svc\") pod \"dnsmasq-dns-b759b9cd7-7bhs7\" (UID: \"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6\") " pod="openstack/dnsmasq-dns-b759b9cd7-7bhs7" Dec 02 23:20:52 crc kubenswrapper[4903]: I1202 23:20:52.214541 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlwrw\" (UniqueName: \"kubernetes.io/projected/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-kube-api-access-jlwrw\") pod \"dnsmasq-dns-b759b9cd7-7bhs7\" (UID: \"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6\") " pod="openstack/dnsmasq-dns-b759b9cd7-7bhs7" Dec 02 23:20:52 crc kubenswrapper[4903]: I1202 23:20:52.388335 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b759b9cd7-7bhs7" Dec 02 23:20:52 crc kubenswrapper[4903]: I1202 23:20:52.857782 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b759b9cd7-7bhs7"] Dec 02 23:20:52 crc kubenswrapper[4903]: W1202 23:20:52.861974 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a8a1d84_b79a_4cbd_8bc6_f077f6f9a7e6.slice/crio-f14f013611fdc4d0c3c4078df7927712b6667286ae16bceff7eaaa77beb0c05a WatchSource:0}: Error finding container f14f013611fdc4d0c3c4078df7927712b6667286ae16bceff7eaaa77beb0c05a: Status 404 returned error can't find the container with id f14f013611fdc4d0c3c4078df7927712b6667286ae16bceff7eaaa77beb0c05a Dec 02 23:20:53 crc kubenswrapper[4903]: I1202 23:20:53.262055 4903 generic.go:334] "Generic (PLEG): container finished" podID="6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6" containerID="d01c168e227c05710487f7f276542a0866b6bf62f0b03840df6f4017561e84e2" exitCode=0 Dec 02 23:20:53 crc kubenswrapper[4903]: I1202 23:20:53.262209 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b759b9cd7-7bhs7" event={"ID":"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6","Type":"ContainerDied","Data":"d01c168e227c05710487f7f276542a0866b6bf62f0b03840df6f4017561e84e2"} Dec 02 23:20:53 crc kubenswrapper[4903]: I1202 23:20:53.262350 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b759b9cd7-7bhs7" event={"ID":"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6","Type":"ContainerStarted","Data":"f14f013611fdc4d0c3c4078df7927712b6667286ae16bceff7eaaa77beb0c05a"} Dec 02 23:20:54 crc kubenswrapper[4903]: I1202 23:20:54.275489 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b759b9cd7-7bhs7" event={"ID":"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6","Type":"ContainerStarted","Data":"e9ea438e24ec640f3d149d889d46ed9462572372648a7ece103ebb81299d3930"} Dec 02 23:20:54 crc kubenswrapper[4903]: I1202 23:20:54.275981 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b759b9cd7-7bhs7" Dec 02 23:20:54 crc kubenswrapper[4903]: I1202 23:20:54.313474 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b759b9cd7-7bhs7" podStartSLOduration=2.313454811 podStartE2EDuration="2.313454811s" podCreationTimestamp="2025-12-02 23:20:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:20:54.304497825 +0000 UTC m=+1393.013052108" watchObservedRunningTime="2025-12-02 23:20:54.313454811 +0000 UTC m=+1393.022009094" Dec 02 23:21:01 crc kubenswrapper[4903]: I1202 23:21:01.071219 4903 scope.go:117] "RemoveContainer" containerID="d04b4c2f88b4b14dbd2841fdde15eed0a268d87c661116dbd9409f753cb7f7b7" Dec 02 23:21:02 crc kubenswrapper[4903]: I1202 23:21:02.389993 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b759b9cd7-7bhs7" Dec 02 23:21:02 crc kubenswrapper[4903]: I1202 23:21:02.462023 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54f6dc465-fs5zz"] Dec 02 23:21:02 crc kubenswrapper[4903]: I1202 23:21:02.462368 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54f6dc465-fs5zz" podUID="066404b9-1803-4886-95b3-b5d9f850f388" containerName="dnsmasq-dns" 
containerID="cri-o://5755bfa8a05b4b545eff64c9ccb2cb6e7aac7554045573e02022585dd9b31957" gracePeriod=10 Dec 02 23:21:02 crc kubenswrapper[4903]: I1202 23:21:02.611775 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bf48746b9-mb6br"] Dec 02 23:21:02 crc kubenswrapper[4903]: I1202 23:21:02.613642 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bf48746b9-mb6br" Dec 02 23:21:02 crc kubenswrapper[4903]: I1202 23:21:02.664698 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bf48746b9-mb6br"] Dec 02 23:21:02 crc kubenswrapper[4903]: I1202 23:21:02.729049 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/609de84d-e5af-4d50-8852-655e6bbb30b9-dns-swift-storage-0\") pod \"dnsmasq-dns-7bf48746b9-mb6br\" (UID: \"609de84d-e5af-4d50-8852-655e6bbb30b9\") " pod="openstack/dnsmasq-dns-7bf48746b9-mb6br" Dec 02 23:21:02 crc kubenswrapper[4903]: I1202 23:21:02.729391 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/609de84d-e5af-4d50-8852-655e6bbb30b9-dns-svc\") pod \"dnsmasq-dns-7bf48746b9-mb6br\" (UID: \"609de84d-e5af-4d50-8852-655e6bbb30b9\") " pod="openstack/dnsmasq-dns-7bf48746b9-mb6br" Dec 02 23:21:02 crc kubenswrapper[4903]: I1202 23:21:02.729415 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/609de84d-e5af-4d50-8852-655e6bbb30b9-ovsdbserver-nb\") pod \"dnsmasq-dns-7bf48746b9-mb6br\" (UID: \"609de84d-e5af-4d50-8852-655e6bbb30b9\") " pod="openstack/dnsmasq-dns-7bf48746b9-mb6br" Dec 02 23:21:02 crc kubenswrapper[4903]: I1202 23:21:02.729458 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/609de84d-e5af-4d50-8852-655e6bbb30b9-config\") pod \"dnsmasq-dns-7bf48746b9-mb6br\" (UID: \"609de84d-e5af-4d50-8852-655e6bbb30b9\") " pod="openstack/dnsmasq-dns-7bf48746b9-mb6br" Dec 02 23:21:02 crc kubenswrapper[4903]: I1202 23:21:02.729477 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z7xr\" (UniqueName: \"kubernetes.io/projected/609de84d-e5af-4d50-8852-655e6bbb30b9-kube-api-access-5z7xr\") pod \"dnsmasq-dns-7bf48746b9-mb6br\" (UID: \"609de84d-e5af-4d50-8852-655e6bbb30b9\") " pod="openstack/dnsmasq-dns-7bf48746b9-mb6br" Dec 02 23:21:02 crc kubenswrapper[4903]: I1202 23:21:02.729495 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/609de84d-e5af-4d50-8852-655e6bbb30b9-ovsdbserver-sb\") pod \"dnsmasq-dns-7bf48746b9-mb6br\" (UID: \"609de84d-e5af-4d50-8852-655e6bbb30b9\") " pod="openstack/dnsmasq-dns-7bf48746b9-mb6br" Dec 02 23:21:02 crc kubenswrapper[4903]: I1202 23:21:02.729547 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/609de84d-e5af-4d50-8852-655e6bbb30b9-openstack-edpm-ipam\") pod \"dnsmasq-dns-7bf48746b9-mb6br\" (UID: \"609de84d-e5af-4d50-8852-655e6bbb30b9\") " pod="openstack/dnsmasq-dns-7bf48746b9-mb6br" Dec 02 23:21:02 crc kubenswrapper[4903]: I1202 23:21:02.831476 4903 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/609de84d-e5af-4d50-8852-655e6bbb30b9-dns-svc\") pod \"dnsmasq-dns-7bf48746b9-mb6br\" (UID: \"609de84d-e5af-4d50-8852-655e6bbb30b9\") " pod="openstack/dnsmasq-dns-7bf48746b9-mb6br" Dec 02 23:21:02 crc kubenswrapper[4903]: I1202 23:21:02.831521 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/609de84d-e5af-4d50-8852-655e6bbb30b9-ovsdbserver-nb\") pod \"dnsmasq-dns-7bf48746b9-mb6br\" (UID: \"609de84d-e5af-4d50-8852-655e6bbb30b9\") " pod="openstack/dnsmasq-dns-7bf48746b9-mb6br" Dec 02 23:21:02 crc kubenswrapper[4903]: I1202 23:21:02.831595 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/609de84d-e5af-4d50-8852-655e6bbb30b9-config\") pod \"dnsmasq-dns-7bf48746b9-mb6br\" (UID: \"609de84d-e5af-4d50-8852-655e6bbb30b9\") " pod="openstack/dnsmasq-dns-7bf48746b9-mb6br" Dec 02 23:21:02 crc kubenswrapper[4903]: I1202 23:21:02.831613 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z7xr\" (UniqueName: \"kubernetes.io/projected/609de84d-e5af-4d50-8852-655e6bbb30b9-kube-api-access-5z7xr\") pod \"dnsmasq-dns-7bf48746b9-mb6br\" (UID: \"609de84d-e5af-4d50-8852-655e6bbb30b9\") " pod="openstack/dnsmasq-dns-7bf48746b9-mb6br" Dec 02 23:21:02 crc kubenswrapper[4903]: I1202 23:21:02.831661 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/609de84d-e5af-4d50-8852-655e6bbb30b9-ovsdbserver-sb\") pod \"dnsmasq-dns-7bf48746b9-mb6br\" (UID: \"609de84d-e5af-4d50-8852-655e6bbb30b9\") " pod="openstack/dnsmasq-dns-7bf48746b9-mb6br" Dec 02 23:21:02 crc kubenswrapper[4903]: I1202 23:21:02.831784 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/609de84d-e5af-4d50-8852-655e6bbb30b9-openstack-edpm-ipam\") pod \"dnsmasq-dns-7bf48746b9-mb6br\" (UID: \"609de84d-e5af-4d50-8852-655e6bbb30b9\") " pod="openstack/dnsmasq-dns-7bf48746b9-mb6br" Dec 02 23:21:02 crc kubenswrapper[4903]: I1202 23:21:02.831877 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/609de84d-e5af-4d50-8852-655e6bbb30b9-dns-swift-storage-0\") pod \"dnsmasq-dns-7bf48746b9-mb6br\" (UID: \"609de84d-e5af-4d50-8852-655e6bbb30b9\") " pod="openstack/dnsmasq-dns-7bf48746b9-mb6br" Dec 02 23:21:02 crc kubenswrapper[4903]: I1202 23:21:02.832988 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/609de84d-e5af-4d50-8852-655e6bbb30b9-dns-swift-storage-0\") pod \"dnsmasq-dns-7bf48746b9-mb6br\" (UID: \"609de84d-e5af-4d50-8852-655e6bbb30b9\") " pod="openstack/dnsmasq-dns-7bf48746b9-mb6br" Dec 02 23:21:02 crc kubenswrapper[4903]: I1202 23:21:02.835423 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/609de84d-e5af-4d50-8852-655e6bbb30b9-openstack-edpm-ipam\") pod \"dnsmasq-dns-7bf48746b9-mb6br\" (UID: \"609de84d-e5af-4d50-8852-655e6bbb30b9\") " pod="openstack/dnsmasq-dns-7bf48746b9-mb6br" Dec 02 23:21:02 crc kubenswrapper[4903]: I1202 23:21:02.835548 4903 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/609de84d-e5af-4d50-8852-655e6bbb30b9-ovsdbserver-sb\") pod \"dnsmasq-dns-7bf48746b9-mb6br\" (UID: \"609de84d-e5af-4d50-8852-655e6bbb30b9\") " pod="openstack/dnsmasq-dns-7bf48746b9-mb6br" Dec 02 23:21:02 crc kubenswrapper[4903]: I1202 23:21:02.835565 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/609de84d-e5af-4d50-8852-655e6bbb30b9-ovsdbserver-nb\") pod \"dnsmasq-dns-7bf48746b9-mb6br\" (UID: \"609de84d-e5af-4d50-8852-655e6bbb30b9\") " pod="openstack/dnsmasq-dns-7bf48746b9-mb6br" Dec 02 23:21:02 crc kubenswrapper[4903]: I1202 23:21:02.836037 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/609de84d-e5af-4d50-8852-655e6bbb30b9-config\") pod \"dnsmasq-dns-7bf48746b9-mb6br\" (UID: \"609de84d-e5af-4d50-8852-655e6bbb30b9\") " pod="openstack/dnsmasq-dns-7bf48746b9-mb6br" Dec 02 23:21:02 crc kubenswrapper[4903]: I1202 23:21:02.836286 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/609de84d-e5af-4d50-8852-655e6bbb30b9-dns-svc\") pod \"dnsmasq-dns-7bf48746b9-mb6br\" (UID: \"609de84d-e5af-4d50-8852-655e6bbb30b9\") " pod="openstack/dnsmasq-dns-7bf48746b9-mb6br" Dec 02 23:21:02 crc kubenswrapper[4903]: I1202 23:21:02.853223 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z7xr\" (UniqueName: \"kubernetes.io/projected/609de84d-e5af-4d50-8852-655e6bbb30b9-kube-api-access-5z7xr\") pod \"dnsmasq-dns-7bf48746b9-mb6br\" (UID: \"609de84d-e5af-4d50-8852-655e6bbb30b9\") " pod="openstack/dnsmasq-dns-7bf48746b9-mb6br" Dec 02 23:21:02 crc kubenswrapper[4903]: I1202 23:21:02.953900 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bf48746b9-mb6br" Dec 02 23:21:03 crc kubenswrapper[4903]: I1202 23:21:03.099007 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54f6dc465-fs5zz" Dec 02 23:21:03 crc kubenswrapper[4903]: I1202 23:21:03.239210 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrv2r\" (UniqueName: \"kubernetes.io/projected/066404b9-1803-4886-95b3-b5d9f850f388-kube-api-access-nrv2r\") pod \"066404b9-1803-4886-95b3-b5d9f850f388\" (UID: \"066404b9-1803-4886-95b3-b5d9f850f388\") " Dec 02 23:21:03 crc kubenswrapper[4903]: I1202 23:21:03.239368 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/066404b9-1803-4886-95b3-b5d9f850f388-dns-svc\") pod \"066404b9-1803-4886-95b3-b5d9f850f388\" (UID: \"066404b9-1803-4886-95b3-b5d9f850f388\") " Dec 02 23:21:03 crc kubenswrapper[4903]: I1202 23:21:03.239392 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/066404b9-1803-4886-95b3-b5d9f850f388-dns-swift-storage-0\") pod \"066404b9-1803-4886-95b3-b5d9f850f388\" (UID: \"066404b9-1803-4886-95b3-b5d9f850f388\") " Dec 02 23:21:03 crc kubenswrapper[4903]: I1202 23:21:03.239422 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/066404b9-1803-4886-95b3-b5d9f850f388-ovsdbserver-sb\") pod \"066404b9-1803-4886-95b3-b5d9f850f388\" (UID: \"066404b9-1803-4886-95b3-b5d9f850f388\") " Dec 02 23:21:03 crc kubenswrapper[4903]: I1202 23:21:03.239498 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/066404b9-1803-4886-95b3-b5d9f850f388-config\") pod \"066404b9-1803-4886-95b3-b5d9f850f388\" (UID: \"066404b9-1803-4886-95b3-b5d9f850f388\") " Dec 02 23:21:03 crc kubenswrapper[4903]: I1202 23:21:03.239520 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/066404b9-1803-4886-95b3-b5d9f850f388-ovsdbserver-nb\") pod \"066404b9-1803-4886-95b3-b5d9f850f388\" (UID: \"066404b9-1803-4886-95b3-b5d9f850f388\") " Dec 02 23:21:03 crc kubenswrapper[4903]: I1202 23:21:03.244822 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/066404b9-1803-4886-95b3-b5d9f850f388-kube-api-access-nrv2r" (OuterVolumeSpecName: "kube-api-access-nrv2r") pod "066404b9-1803-4886-95b3-b5d9f850f388" (UID: "066404b9-1803-4886-95b3-b5d9f850f388"). InnerVolumeSpecName "kube-api-access-nrv2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:21:03 crc kubenswrapper[4903]: I1202 23:21:03.295463 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/066404b9-1803-4886-95b3-b5d9f850f388-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "066404b9-1803-4886-95b3-b5d9f850f388" (UID: "066404b9-1803-4886-95b3-b5d9f850f388"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:21:03 crc kubenswrapper[4903]: I1202 23:21:03.295773 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/066404b9-1803-4886-95b3-b5d9f850f388-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "066404b9-1803-4886-95b3-b5d9f850f388" (UID: "066404b9-1803-4886-95b3-b5d9f850f388"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:21:03 crc kubenswrapper[4903]: I1202 23:21:03.301174 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/066404b9-1803-4886-95b3-b5d9f850f388-config" (OuterVolumeSpecName: "config") pod "066404b9-1803-4886-95b3-b5d9f850f388" (UID: "066404b9-1803-4886-95b3-b5d9f850f388"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:21:03 crc kubenswrapper[4903]: I1202 23:21:03.302371 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/066404b9-1803-4886-95b3-b5d9f850f388-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "066404b9-1803-4886-95b3-b5d9f850f388" (UID: "066404b9-1803-4886-95b3-b5d9f850f388"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:21:03 crc kubenswrapper[4903]: I1202 23:21:03.308886 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/066404b9-1803-4886-95b3-b5d9f850f388-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "066404b9-1803-4886-95b3-b5d9f850f388" (UID: "066404b9-1803-4886-95b3-b5d9f850f388"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:21:03 crc kubenswrapper[4903]: I1202 23:21:03.342878 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrv2r\" (UniqueName: \"kubernetes.io/projected/066404b9-1803-4886-95b3-b5d9f850f388-kube-api-access-nrv2r\") on node \"crc\" DevicePath \"\"" Dec 02 23:21:03 crc kubenswrapper[4903]: I1202 23:21:03.342929 4903 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/066404b9-1803-4886-95b3-b5d9f850f388-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 23:21:03 crc kubenswrapper[4903]: I1202 23:21:03.342940 4903 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/066404b9-1803-4886-95b3-b5d9f850f388-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:21:03 crc kubenswrapper[4903]: I1202 23:21:03.342948 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/066404b9-1803-4886-95b3-b5d9f850f388-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 23:21:03 crc kubenswrapper[4903]: I1202 23:21:03.342956 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/066404b9-1803-4886-95b3-b5d9f850f388-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:21:03 crc kubenswrapper[4903]: I1202 23:21:03.342970 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/066404b9-1803-4886-95b3-b5d9f850f388-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 23:21:03 crc kubenswrapper[4903]: I1202 23:21:03.374749 4903 generic.go:334] "Generic (PLEG): container finished" podID="066404b9-1803-4886-95b3-b5d9f850f388" containerID="5755bfa8a05b4b545eff64c9ccb2cb6e7aac7554045573e02022585dd9b31957" exitCode=0 Dec 02 23:21:03 crc kubenswrapper[4903]: I1202 23:21:03.374803 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f6dc465-fs5zz" event={"ID":"066404b9-1803-4886-95b3-b5d9f850f388","Type":"ContainerDied","Data":"5755bfa8a05b4b545eff64c9ccb2cb6e7aac7554045573e02022585dd9b31957"} Dec 02 23:21:03 crc 
kubenswrapper[4903]: I1202 23:21:03.374872 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f6dc465-fs5zz" event={"ID":"066404b9-1803-4886-95b3-b5d9f850f388","Type":"ContainerDied","Data":"8aef4671ff4233acaf75334c1a83a05c577874afa973c3e0c78e4eb1f177dafe"} Dec 02 23:21:03 crc kubenswrapper[4903]: I1202 23:21:03.374902 4903 scope.go:117] "RemoveContainer" containerID="5755bfa8a05b4b545eff64c9ccb2cb6e7aac7554045573e02022585dd9b31957" Dec 02 23:21:03 crc kubenswrapper[4903]: I1202 23:21:03.375116 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f6dc465-fs5zz" Dec 02 23:21:03 crc kubenswrapper[4903]: I1202 23:21:03.417099 4903 scope.go:117] "RemoveContainer" containerID="1a7034f44732d5fd714f74f5f9296c58735a0e9b57bdbda254649a10545aea56" Dec 02 23:21:03 crc kubenswrapper[4903]: I1202 23:21:03.422474 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54f6dc465-fs5zz"] Dec 02 23:21:03 crc kubenswrapper[4903]: W1202 23:21:03.425986 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod609de84d_e5af_4d50_8852_655e6bbb30b9.slice/crio-2592b3955f445653bba2824ea58361491c9eecac232482efc482de0d22c49a42 WatchSource:0}: Error finding container 2592b3955f445653bba2824ea58361491c9eecac232482efc482de0d22c49a42: Status 404 returned error can't find the container with id 2592b3955f445653bba2824ea58361491c9eecac232482efc482de0d22c49a42 Dec 02 23:21:03 crc kubenswrapper[4903]: I1202 23:21:03.434824 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bf48746b9-mb6br"] Dec 02 23:21:03 crc kubenswrapper[4903]: I1202 23:21:03.463375 4903 scope.go:117] "RemoveContainer" containerID="5755bfa8a05b4b545eff64c9ccb2cb6e7aac7554045573e02022585dd9b31957" Dec 02 23:21:03 crc kubenswrapper[4903]: E1202 23:21:03.470005 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5755bfa8a05b4b545eff64c9ccb2cb6e7aac7554045573e02022585dd9b31957\": container with ID starting with 5755bfa8a05b4b545eff64c9ccb2cb6e7aac7554045573e02022585dd9b31957 not found: ID does not exist" containerID="5755bfa8a05b4b545eff64c9ccb2cb6e7aac7554045573e02022585dd9b31957" Dec 02 23:21:03 crc kubenswrapper[4903]: I1202 23:21:03.470053 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5755bfa8a05b4b545eff64c9ccb2cb6e7aac7554045573e02022585dd9b31957"} err="failed to get container status \"5755bfa8a05b4b545eff64c9ccb2cb6e7aac7554045573e02022585dd9b31957\": rpc error: code = NotFound desc = could not find container \"5755bfa8a05b4b545eff64c9ccb2cb6e7aac7554045573e02022585dd9b31957\": container with ID starting with 5755bfa8a05b4b545eff64c9ccb2cb6e7aac7554045573e02022585dd9b31957 not found: ID does not exist" Dec 02 23:21:03 crc kubenswrapper[4903]: I1202 23:21:03.470085 4903 scope.go:117] "RemoveContainer" containerID="1a7034f44732d5fd714f74f5f9296c58735a0e9b57bdbda254649a10545aea56" Dec 02 23:21:03 crc kubenswrapper[4903]: E1202 23:21:03.470792 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a7034f44732d5fd714f74f5f9296c58735a0e9b57bdbda254649a10545aea56\": container with ID starting with 1a7034f44732d5fd714f74f5f9296c58735a0e9b57bdbda254649a10545aea56 not found: ID does not exist" 
containerID="1a7034f44732d5fd714f74f5f9296c58735a0e9b57bdbda254649a10545aea56" Dec 02 23:21:03 crc kubenswrapper[4903]: I1202 23:21:03.470840 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a7034f44732d5fd714f74f5f9296c58735a0e9b57bdbda254649a10545aea56"} err="failed to get container status \"1a7034f44732d5fd714f74f5f9296c58735a0e9b57bdbda254649a10545aea56\": rpc error: code = NotFound desc = could not find container \"1a7034f44732d5fd714f74f5f9296c58735a0e9b57bdbda254649a10545aea56\": container with ID starting with 1a7034f44732d5fd714f74f5f9296c58735a0e9b57bdbda254649a10545aea56 not found: ID does not exist" Dec 02 23:21:03 crc kubenswrapper[4903]: I1202 23:21:03.492627 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54f6dc465-fs5zz"] Dec 02 23:21:03 crc kubenswrapper[4903]: I1202 23:21:03.635297 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="066404b9-1803-4886-95b3-b5d9f850f388" path="/var/lib/kubelet/pods/066404b9-1803-4886-95b3-b5d9f850f388/volumes" Dec 02 23:21:04 crc kubenswrapper[4903]: I1202 23:21:04.391576 4903 generic.go:334] "Generic (PLEG): container finished" podID="609de84d-e5af-4d50-8852-655e6bbb30b9" containerID="bcd2fa00b2a7d981429a961aa7f4a79cba4bf5364c7a4495802549934fdc506d" exitCode=0 Dec 02 23:21:04 crc kubenswrapper[4903]: I1202 23:21:04.391626 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bf48746b9-mb6br" event={"ID":"609de84d-e5af-4d50-8852-655e6bbb30b9","Type":"ContainerDied","Data":"bcd2fa00b2a7d981429a961aa7f4a79cba4bf5364c7a4495802549934fdc506d"} Dec 02 23:21:04 crc kubenswrapper[4903]: I1202 23:21:04.391675 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bf48746b9-mb6br" event={"ID":"609de84d-e5af-4d50-8852-655e6bbb30b9","Type":"ContainerStarted","Data":"2592b3955f445653bba2824ea58361491c9eecac232482efc482de0d22c49a42"} Dec 02 23:21:05 crc kubenswrapper[4903]: I1202 23:21:05.406170 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bf48746b9-mb6br" event={"ID":"609de84d-e5af-4d50-8852-655e6bbb30b9","Type":"ContainerStarted","Data":"6026042f73b161dc3a42fe5c70fee0d2bb573280cddb0e00ca1bf8d46a540c91"} Dec 02 23:21:05 crc kubenswrapper[4903]: I1202 23:21:05.406905 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7bf48746b9-mb6br" Dec 02 23:21:05 crc kubenswrapper[4903]: I1202 23:21:05.444777 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7bf48746b9-mb6br" podStartSLOduration=3.444758406 podStartE2EDuration="3.444758406s" podCreationTimestamp="2025-12-02 23:21:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:21:05.438553785 +0000 UTC m=+1404.147108108" watchObservedRunningTime="2025-12-02 23:21:05.444758406 +0000 UTC m=+1404.153312689" Dec 02 23:21:12 crc kubenswrapper[4903]: I1202 23:21:12.954942 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7bf48746b9-mb6br" Dec 02 23:21:13 crc kubenswrapper[4903]: I1202 23:21:13.077213 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b759b9cd7-7bhs7"] Dec 02 23:21:13 crc kubenswrapper[4903]: I1202 23:21:13.077513 4903 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-b759b9cd7-7bhs7" podUID="6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6" containerName="dnsmasq-dns" containerID="cri-o://e9ea438e24ec640f3d149d889d46ed9462572372648a7ece103ebb81299d3930" gracePeriod=10 Dec 02 23:21:13 crc kubenswrapper[4903]: I1202 23:21:13.515740 4903 generic.go:334] "Generic (PLEG): container finished" podID="6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6" containerID="e9ea438e24ec640f3d149d889d46ed9462572372648a7ece103ebb81299d3930" exitCode=0 Dec 02 23:21:13 crc kubenswrapper[4903]: I1202 23:21:13.516084 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b759b9cd7-7bhs7" event={"ID":"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6","Type":"ContainerDied","Data":"e9ea438e24ec640f3d149d889d46ed9462572372648a7ece103ebb81299d3930"} Dec 02 23:21:13 crc kubenswrapper[4903]: I1202 23:21:13.516277 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b759b9cd7-7bhs7" event={"ID":"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6","Type":"ContainerDied","Data":"f14f013611fdc4d0c3c4078df7927712b6667286ae16bceff7eaaa77beb0c05a"} Dec 02 23:21:13 crc kubenswrapper[4903]: I1202 23:21:13.516294 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f14f013611fdc4d0c3c4078df7927712b6667286ae16bceff7eaaa77beb0c05a" Dec 02 23:21:13 crc kubenswrapper[4903]: I1202 23:21:13.584956 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b759b9cd7-7bhs7" Dec 02 23:21:13 crc kubenswrapper[4903]: I1202 23:21:13.709942 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-dns-swift-storage-0\") pod \"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6\" (UID: \"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6\") " Dec 02 23:21:13 crc kubenswrapper[4903]: I1202 23:21:13.710046 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-dns-svc\") pod \"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6\" (UID: \"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6\") " Dec 02 23:21:13 crc kubenswrapper[4903]: I1202 23:21:13.710104 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-ovsdbserver-sb\") pod \"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6\" (UID: \"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6\") " Dec 02 23:21:13 crc kubenswrapper[4903]: I1202 23:21:13.710145 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-openstack-edpm-ipam\") pod \"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6\" (UID: \"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6\") " Dec 02 23:21:13 crc kubenswrapper[4903]: I1202 23:21:13.710287 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-config\") pod \"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6\" (UID: \"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6\") " Dec 02 23:21:13 crc kubenswrapper[4903]: I1202 23:21:13.710374 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlwrw\" (UniqueName: \"kubernetes.io/projected/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-kube-api-access-jlwrw\") pod 
\"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6\" (UID: \"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6\") " Dec 02 23:21:13 crc kubenswrapper[4903]: I1202 23:21:13.710401 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-ovsdbserver-nb\") pod \"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6\" (UID: \"6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6\") " Dec 02 23:21:13 crc kubenswrapper[4903]: I1202 23:21:13.755021 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-kube-api-access-jlwrw" (OuterVolumeSpecName: "kube-api-access-jlwrw") pod "6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6" (UID: "6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6"). InnerVolumeSpecName "kube-api-access-jlwrw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:21:13 crc kubenswrapper[4903]: I1202 23:21:13.803472 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6" (UID: "6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:21:13 crc kubenswrapper[4903]: I1202 23:21:13.812369 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6" (UID: "6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:21:13 crc kubenswrapper[4903]: I1202 23:21:13.812382 4903 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 02 23:21:13 crc kubenswrapper[4903]: I1202 23:21:13.812457 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlwrw\" (UniqueName: \"kubernetes.io/projected/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-kube-api-access-jlwrw\") on node \"crc\" DevicePath \"\"" Dec 02 23:21:13 crc kubenswrapper[4903]: I1202 23:21:13.817166 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6" (UID: "6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:21:13 crc kubenswrapper[4903]: I1202 23:21:13.817989 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6" (UID: "6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:21:13 crc kubenswrapper[4903]: I1202 23:21:13.822477 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-config" (OuterVolumeSpecName: "config") pod "6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6" (UID: "6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:21:13 crc kubenswrapper[4903]: I1202 23:21:13.841262 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6" (UID: "6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:21:13 crc kubenswrapper[4903]: I1202 23:21:13.914853 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 23:21:13 crc kubenswrapper[4903]: I1202 23:21:13.914895 4903 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:21:13 crc kubenswrapper[4903]: I1202 23:21:13.914909 4903 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 23:21:13 crc kubenswrapper[4903]: I1202 23:21:13.914918 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 23:21:13 crc kubenswrapper[4903]: I1202 23:21:13.914927 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:21:14 crc kubenswrapper[4903]: I1202 23:21:14.527012 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b759b9cd7-7bhs7" Dec 02 23:21:14 crc kubenswrapper[4903]: I1202 23:21:14.581587 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b759b9cd7-7bhs7"] Dec 02 23:21:14 crc kubenswrapper[4903]: I1202 23:21:14.607946 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b759b9cd7-7bhs7"] Dec 02 23:21:15 crc kubenswrapper[4903]: I1202 23:21:15.635566 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6" path="/var/lib/kubelet/pods/6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6/volumes" Dec 02 23:21:20 crc kubenswrapper[4903]: I1202 23:21:20.621810 4903 generic.go:334] "Generic (PLEG): container finished" podID="96f2e452-05fe-45c6-940b-5a53959af002" containerID="a937fc27e5a9a6cbe1416357733e915eb00576c6ef08c2c169fce9109702be56" exitCode=0 Dec 02 23:21:20 crc kubenswrapper[4903]: I1202 23:21:20.621911 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"96f2e452-05fe-45c6-940b-5a53959af002","Type":"ContainerDied","Data":"a937fc27e5a9a6cbe1416357733e915eb00576c6ef08c2c169fce9109702be56"} Dec 02 23:21:21 crc kubenswrapper[4903]: I1202 23:21:21.638555 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"96f2e452-05fe-45c6-940b-5a53959af002","Type":"ContainerStarted","Data":"388bf3c844c82389cc6589823bccf209c9b82f1e3c4f868b0f88c09b77c2ced5"} Dec 02 23:21:21 crc kubenswrapper[4903]: I1202 23:21:21.639322 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 02 23:21:21 crc kubenswrapper[4903]: I1202 23:21:21.641443 4903 generic.go:334] "Generic (PLEG): container finished" podID="46968896-fe5c-4bf2-a304-51f818ae9cc5" containerID="91b4f2c091775c9f7a256cd4f9a3c19b4029d2dc66cdbe3f3595404182fb9a79" exitCode=0 Dec 02 23:21:21 crc kubenswrapper[4903]: I1202 23:21:21.641523 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"46968896-fe5c-4bf2-a304-51f818ae9cc5","Type":"ContainerDied","Data":"91b4f2c091775c9f7a256cd4f9a3c19b4029d2dc66cdbe3f3595404182fb9a79"} Dec 02 23:21:21 crc kubenswrapper[4903]: I1202 23:21:21.676368 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.676345564 podStartE2EDuration="37.676345564s" podCreationTimestamp="2025-12-02 23:20:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:21:21.673537466 +0000 UTC m=+1420.382091839" watchObservedRunningTime="2025-12-02 23:21:21.676345564 +0000 UTC m=+1420.384899857" Dec 02 23:21:22 crc kubenswrapper[4903]: I1202 23:21:22.656213 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"46968896-fe5c-4bf2-a304-51f818ae9cc5","Type":"ContainerStarted","Data":"348414a256d0fab5826c2ef226ae2e09b08e8fa1166ed8da467e77b286780fd6"} Dec 02 23:21:22 crc kubenswrapper[4903]: I1202 23:21:22.657746 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:21:22 crc kubenswrapper[4903]: I1202 23:21:22.698890 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.698871737 podStartE2EDuration="37.698871737s" 
podCreationTimestamp="2025-12-02 23:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:21:22.694984503 +0000 UTC m=+1421.403538796" watchObservedRunningTime="2025-12-02 23:21:22.698871737 +0000 UTC m=+1421.407426020" Dec 02 23:21:31 crc kubenswrapper[4903]: I1202 23:21:31.253278 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v5ddf"] Dec 02 23:21:31 crc kubenswrapper[4903]: E1202 23:21:31.254170 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6" containerName="init" Dec 02 23:21:31 crc kubenswrapper[4903]: I1202 23:21:31.254183 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6" containerName="init" Dec 02 23:21:31 crc kubenswrapper[4903]: E1202 23:21:31.254201 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="066404b9-1803-4886-95b3-b5d9f850f388" containerName="init" Dec 02 23:21:31 crc kubenswrapper[4903]: I1202 23:21:31.254206 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="066404b9-1803-4886-95b3-b5d9f850f388" containerName="init" Dec 02 23:21:31 crc kubenswrapper[4903]: E1202 23:21:31.254217 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6" containerName="dnsmasq-dns" Dec 02 23:21:31 crc kubenswrapper[4903]: I1202 23:21:31.254223 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6" containerName="dnsmasq-dns" Dec 02 23:21:31 crc kubenswrapper[4903]: E1202 23:21:31.254234 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="066404b9-1803-4886-95b3-b5d9f850f388" containerName="dnsmasq-dns" Dec 02 23:21:31 crc kubenswrapper[4903]: I1202 23:21:31.254239 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="066404b9-1803-4886-95b3-b5d9f850f388" containerName="dnsmasq-dns" Dec 02 23:21:31 crc kubenswrapper[4903]: I1202 23:21:31.254422 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="066404b9-1803-4886-95b3-b5d9f850f388" containerName="dnsmasq-dns" Dec 02 23:21:31 crc kubenswrapper[4903]: I1202 23:21:31.254433 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a8a1d84-b79a-4cbd-8bc6-f077f6f9a7e6" containerName="dnsmasq-dns" Dec 02 23:21:31 crc kubenswrapper[4903]: I1202 23:21:31.255036 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v5ddf" Dec 02 23:21:31 crc kubenswrapper[4903]: I1202 23:21:31.258176 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 23:21:31 crc kubenswrapper[4903]: I1202 23:21:31.258353 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 23:21:31 crc kubenswrapper[4903]: I1202 23:21:31.260376 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 23:21:31 crc kubenswrapper[4903]: I1202 23:21:31.261864 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9z6rx" Dec 02 23:21:31 crc kubenswrapper[4903]: I1202 23:21:31.267371 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v5ddf"] Dec 02 23:21:31 crc kubenswrapper[4903]: I1202 23:21:31.382020 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v5ddf\" (UID: \"d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v5ddf" Dec 02 23:21:31 crc kubenswrapper[4903]: I1202 23:21:31.382153 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdn6l\" (UniqueName: \"kubernetes.io/projected/d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3-kube-api-access-qdn6l\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v5ddf\" (UID: \"d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v5ddf" Dec 02 23:21:31 crc kubenswrapper[4903]: I1202 23:21:31.382216 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v5ddf\" (UID: \"d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v5ddf" Dec 02 23:21:31 crc kubenswrapper[4903]: I1202 23:21:31.382267 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v5ddf\" (UID: \"d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v5ddf" Dec 02 23:21:31 crc kubenswrapper[4903]: I1202 23:21:31.483717 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v5ddf\" (UID: \"d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v5ddf" Dec 02 23:21:31 crc kubenswrapper[4903]: I1202 23:21:31.483808 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3-ssh-key\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-v5ddf\" (UID: \"d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v5ddf" Dec 02 23:21:31 crc kubenswrapper[4903]: I1202 23:21:31.484603 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v5ddf\" (UID: \"d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v5ddf" Dec 02 23:21:31 crc kubenswrapper[4903]: I1202 23:21:31.484734 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdn6l\" (UniqueName: \"kubernetes.io/projected/d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3-kube-api-access-qdn6l\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v5ddf\" (UID: \"d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v5ddf" Dec 02 23:21:31 crc kubenswrapper[4903]: I1202 23:21:31.489733 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v5ddf\" (UID: \"d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v5ddf" Dec 02 23:21:31 crc kubenswrapper[4903]: I1202 23:21:31.494189 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v5ddf\" (UID: \"d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v5ddf" Dec 02 23:21:31 crc kubenswrapper[4903]: I1202 23:21:31.494563 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v5ddf\" (UID: \"d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v5ddf" Dec 02 23:21:31 crc kubenswrapper[4903]: I1202 23:21:31.505481 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdn6l\" (UniqueName: \"kubernetes.io/projected/d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3-kube-api-access-qdn6l\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v5ddf\" (UID: \"d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v5ddf" Dec 02 23:21:31 crc kubenswrapper[4903]: I1202 23:21:31.636025 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v5ddf" Dec 02 23:21:32 crc kubenswrapper[4903]: I1202 23:21:32.344174 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v5ddf"] Dec 02 23:21:32 crc kubenswrapper[4903]: I1202 23:21:32.753988 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v5ddf" event={"ID":"d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3","Type":"ContainerStarted","Data":"14463cf1818a2d073b673e6b700ec749fc56500152dae2ecee54a45e5cfeb84a"} Dec 02 23:21:34 crc kubenswrapper[4903]: I1202 23:21:34.842119 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="96f2e452-05fe-45c6-940b-5a53959af002" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.223:5671: connect: connection refused" Dec 02 23:21:35 crc kubenswrapper[4903]: I1202 23:21:35.748278 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 02 23:21:42 crc kubenswrapper[4903]: I1202 23:21:42.156767 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 23:21:42 crc kubenswrapper[4903]: I1202 23:21:42.897125 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v5ddf" event={"ID":"d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3","Type":"ContainerStarted","Data":"6679fde156576f953c839c970f4d86ff60a3976c174b5146e51ca909250ae6c4"} Dec 02 23:21:42 crc kubenswrapper[4903]: I1202 23:21:42.925572 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v5ddf" podStartSLOduration=2.099442671 podStartE2EDuration="11.925552018s" podCreationTimestamp="2025-12-02 23:21:31 +0000 UTC" firstStartedPulling="2025-12-02 23:21:32.327611157 +0000 UTC m=+1431.036165430" lastFinishedPulling="2025-12-02 23:21:42.153720474 +0000 UTC m=+1440.862274777" observedRunningTime="2025-12-02 23:21:42.918448716 +0000 UTC m=+1441.627003009" watchObservedRunningTime="2025-12-02 23:21:42.925552018 +0000 UTC m=+1441.634106311" Dec 02 23:21:44 crc kubenswrapper[4903]: I1202 23:21:44.841986 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 02 23:21:54 crc kubenswrapper[4903]: I1202 23:21:54.057113 4903 generic.go:334] "Generic (PLEG): container finished" podID="d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3" containerID="6679fde156576f953c839c970f4d86ff60a3976c174b5146e51ca909250ae6c4" exitCode=0 Dec 02 23:21:54 crc kubenswrapper[4903]: I1202 23:21:54.057197 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v5ddf" event={"ID":"d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3","Type":"ContainerDied","Data":"6679fde156576f953c839c970f4d86ff60a3976c174b5146e51ca909250ae6c4"} Dec 02 23:21:55 crc kubenswrapper[4903]: I1202 23:21:55.669165 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v5ddf" Dec 02 23:21:55 crc kubenswrapper[4903]: I1202 23:21:55.828699 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdn6l\" (UniqueName: \"kubernetes.io/projected/d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3-kube-api-access-qdn6l\") pod \"d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3\" (UID: \"d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3\") " Dec 02 23:21:55 crc kubenswrapper[4903]: I1202 23:21:55.828838 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3-inventory\") pod \"d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3\" (UID: \"d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3\") " Dec 02 23:21:55 crc kubenswrapper[4903]: I1202 23:21:55.828981 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3-repo-setup-combined-ca-bundle\") pod \"d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3\" (UID: \"d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3\") " Dec 02 23:21:55 crc kubenswrapper[4903]: I1202 23:21:55.829059 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3-ssh-key\") pod \"d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3\" (UID: \"d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3\") " Dec 02 23:21:55 crc kubenswrapper[4903]: I1202 23:21:55.836570 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3-kube-api-access-qdn6l" (OuterVolumeSpecName: "kube-api-access-qdn6l") pod "d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3" (UID: "d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3"). InnerVolumeSpecName "kube-api-access-qdn6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:21:55 crc kubenswrapper[4903]: I1202 23:21:55.837010 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3" (UID: "d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:21:55 crc kubenswrapper[4903]: I1202 23:21:55.866885 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3" (UID: "d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:21:55 crc kubenswrapper[4903]: I1202 23:21:55.894127 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3-inventory" (OuterVolumeSpecName: "inventory") pod "d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3" (UID: "d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:21:55 crc kubenswrapper[4903]: I1202 23:21:55.931285 4903 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:21:55 crc kubenswrapper[4903]: I1202 23:21:55.931318 4903 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 23:21:55 crc kubenswrapper[4903]: I1202 23:21:55.931328 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdn6l\" (UniqueName: \"kubernetes.io/projected/d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3-kube-api-access-qdn6l\") on node \"crc\" DevicePath \"\"" Dec 02 23:21:55 crc kubenswrapper[4903]: I1202 23:21:55.931337 4903 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 23:21:56 crc kubenswrapper[4903]: I1202 23:21:56.084136 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v5ddf" event={"ID":"d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3","Type":"ContainerDied","Data":"14463cf1818a2d073b673e6b700ec749fc56500152dae2ecee54a45e5cfeb84a"} Dec 02 23:21:56 crc kubenswrapper[4903]: I1202 23:21:56.084180 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14463cf1818a2d073b673e6b700ec749fc56500152dae2ecee54a45e5cfeb84a" Dec 02 23:21:56 crc kubenswrapper[4903]: I1202 23:21:56.084229 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v5ddf" Dec 02 23:21:56 crc kubenswrapper[4903]: I1202 23:21:56.191912 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-crjgz"] Dec 02 23:21:56 crc kubenswrapper[4903]: E1202 23:21:56.192406 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 02 23:21:56 crc kubenswrapper[4903]: I1202 23:21:56.192429 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 02 23:21:56 crc kubenswrapper[4903]: I1202 23:21:56.192612 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 02 23:21:56 crc kubenswrapper[4903]: I1202 23:21:56.193234 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-crjgz" Dec 02 23:21:56 crc kubenswrapper[4903]: I1202 23:21:56.195566 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9z6rx" Dec 02 23:21:56 crc kubenswrapper[4903]: I1202 23:21:56.195811 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 23:21:56 crc kubenswrapper[4903]: I1202 23:21:56.196069 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 23:21:56 crc kubenswrapper[4903]: I1202 23:21:56.196343 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 23:21:56 crc kubenswrapper[4903]: I1202 23:21:56.215619 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-crjgz"] Dec 02 23:21:56 crc kubenswrapper[4903]: I1202 23:21:56.342764 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqdf7\" (UniqueName: \"kubernetes.io/projected/b5c4ae7e-90d7-4090-9357-77e09a38d4f6-kube-api-access-vqdf7\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-crjgz\" (UID: \"b5c4ae7e-90d7-4090-9357-77e09a38d4f6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-crjgz" Dec 02 23:21:56 crc kubenswrapper[4903]: I1202 23:21:56.342846 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b5c4ae7e-90d7-4090-9357-77e09a38d4f6-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-crjgz\" (UID: \"b5c4ae7e-90d7-4090-9357-77e09a38d4f6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-crjgz" Dec 02 23:21:56 crc kubenswrapper[4903]: I1202 23:21:56.343161 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b5c4ae7e-90d7-4090-9357-77e09a38d4f6-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-crjgz\" (UID: \"b5c4ae7e-90d7-4090-9357-77e09a38d4f6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-crjgz" Dec 02 23:21:56 crc kubenswrapper[4903]: I1202 23:21:56.445918 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b5c4ae7e-90d7-4090-9357-77e09a38d4f6-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-crjgz\" (UID: \"b5c4ae7e-90d7-4090-9357-77e09a38d4f6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-crjgz" Dec 02 23:21:56 crc kubenswrapper[4903]: I1202 23:21:56.446146 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqdf7\" (UniqueName: \"kubernetes.io/projected/b5c4ae7e-90d7-4090-9357-77e09a38d4f6-kube-api-access-vqdf7\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-crjgz\" (UID: \"b5c4ae7e-90d7-4090-9357-77e09a38d4f6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-crjgz" Dec 02 23:21:56 crc kubenswrapper[4903]: I1202 23:21:56.446287 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b5c4ae7e-90d7-4090-9357-77e09a38d4f6-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-crjgz\" (UID: \"b5c4ae7e-90d7-4090-9357-77e09a38d4f6\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-crjgz" Dec 02 23:21:56 crc kubenswrapper[4903]: I1202 23:21:56.451910 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b5c4ae7e-90d7-4090-9357-77e09a38d4f6-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-crjgz\" (UID: \"b5c4ae7e-90d7-4090-9357-77e09a38d4f6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-crjgz" Dec 02 23:21:56 crc kubenswrapper[4903]: I1202 23:21:56.453370 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b5c4ae7e-90d7-4090-9357-77e09a38d4f6-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-crjgz\" (UID: \"b5c4ae7e-90d7-4090-9357-77e09a38d4f6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-crjgz" Dec 02 23:21:56 crc kubenswrapper[4903]: I1202 23:21:56.475271 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqdf7\" (UniqueName: \"kubernetes.io/projected/b5c4ae7e-90d7-4090-9357-77e09a38d4f6-kube-api-access-vqdf7\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-crjgz\" (UID: \"b5c4ae7e-90d7-4090-9357-77e09a38d4f6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-crjgz" Dec 02 23:21:56 crc kubenswrapper[4903]: I1202 23:21:56.511502 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-crjgz" Dec 02 23:21:57 crc kubenswrapper[4903]: I1202 23:21:57.096411 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-crjgz"] Dec 02 23:21:58 crc kubenswrapper[4903]: I1202 23:21:58.110592 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-crjgz" event={"ID":"b5c4ae7e-90d7-4090-9357-77e09a38d4f6","Type":"ContainerStarted","Data":"72d017013823e69a3df4e769d55eb1d6588d961c8049cacc2b72e78564c9ccb5"} Dec 02 23:21:58 crc kubenswrapper[4903]: I1202 23:21:58.110958 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-crjgz" event={"ID":"b5c4ae7e-90d7-4090-9357-77e09a38d4f6","Type":"ContainerStarted","Data":"b2b79d007f52c79b38b3c6f049c020a7faf0f91393e3277108bd1ec81f115681"} Dec 02 23:21:58 crc kubenswrapper[4903]: I1202 23:21:58.133995 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-crjgz" podStartSLOduration=1.719780363 podStartE2EDuration="2.133963459s" podCreationTimestamp="2025-12-02 23:21:56 +0000 UTC" firstStartedPulling="2025-12-02 23:21:57.113048275 +0000 UTC m=+1455.821602598" lastFinishedPulling="2025-12-02 23:21:57.527231401 +0000 UTC m=+1456.235785694" observedRunningTime="2025-12-02 23:21:58.126229591 +0000 UTC m=+1456.834783874" watchObservedRunningTime="2025-12-02 23:21:58.133963459 +0000 UTC m=+1456.842517742" Dec 02 23:22:01 crc kubenswrapper[4903]: I1202 23:22:01.157866 4903 generic.go:334] "Generic (PLEG): container finished" podID="b5c4ae7e-90d7-4090-9357-77e09a38d4f6" containerID="72d017013823e69a3df4e769d55eb1d6588d961c8049cacc2b72e78564c9ccb5" exitCode=0 Dec 02 23:22:01 crc kubenswrapper[4903]: I1202 23:22:01.157959 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-crjgz" 
event={"ID":"b5c4ae7e-90d7-4090-9357-77e09a38d4f6","Type":"ContainerDied","Data":"72d017013823e69a3df4e769d55eb1d6588d961c8049cacc2b72e78564c9ccb5"} Dec 02 23:22:01 crc kubenswrapper[4903]: I1202 23:22:01.230341 4903 scope.go:117] "RemoveContainer" containerID="c05fac882c1e44a20dc8568975b81f635728484946b4852a3f4c993c9f60acca" Dec 02 23:22:01 crc kubenswrapper[4903]: I1202 23:22:01.280337 4903 scope.go:117] "RemoveContainer" containerID="b60c0804d0b56b47cfb1bf0f283521aa6346766a6e8d5087480304c445ab63cb" Dec 02 23:22:01 crc kubenswrapper[4903]: I1202 23:22:01.318168 4903 scope.go:117] "RemoveContainer" containerID="549f03176e1c9cd6246023d31eea53beaf5978f428985bb4496afa494a359b03" Dec 02 23:22:01 crc kubenswrapper[4903]: I1202 23:22:01.340721 4903 scope.go:117] "RemoveContainer" containerID="37db9d225f60dc79e872e93e4627d2e3ef7e6cd467c048b8d56b9402afbdeac1" Dec 02 23:22:01 crc kubenswrapper[4903]: I1202 23:22:01.366340 4903 scope.go:117] "RemoveContainer" containerID="505f1f8a030d5d6e67eda39f677de2f9d0b7674e20f4223fe6103b4dd6833d1b" Dec 02 23:22:02 crc kubenswrapper[4903]: I1202 23:22:02.687310 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-crjgz" Dec 02 23:22:02 crc kubenswrapper[4903]: I1202 23:22:02.781979 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b5c4ae7e-90d7-4090-9357-77e09a38d4f6-ssh-key\") pod \"b5c4ae7e-90d7-4090-9357-77e09a38d4f6\" (UID: \"b5c4ae7e-90d7-4090-9357-77e09a38d4f6\") " Dec 02 23:22:02 crc kubenswrapper[4903]: I1202 23:22:02.782060 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqdf7\" (UniqueName: \"kubernetes.io/projected/b5c4ae7e-90d7-4090-9357-77e09a38d4f6-kube-api-access-vqdf7\") pod \"b5c4ae7e-90d7-4090-9357-77e09a38d4f6\" (UID: \"b5c4ae7e-90d7-4090-9357-77e09a38d4f6\") " Dec 02 23:22:02 crc kubenswrapper[4903]: I1202 23:22:02.782111 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b5c4ae7e-90d7-4090-9357-77e09a38d4f6-inventory\") pod \"b5c4ae7e-90d7-4090-9357-77e09a38d4f6\" (UID: \"b5c4ae7e-90d7-4090-9357-77e09a38d4f6\") " Dec 02 23:22:02 crc kubenswrapper[4903]: I1202 23:22:02.803674 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5c4ae7e-90d7-4090-9357-77e09a38d4f6-kube-api-access-vqdf7" (OuterVolumeSpecName: "kube-api-access-vqdf7") pod "b5c4ae7e-90d7-4090-9357-77e09a38d4f6" (UID: "b5c4ae7e-90d7-4090-9357-77e09a38d4f6"). InnerVolumeSpecName "kube-api-access-vqdf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:22:02 crc kubenswrapper[4903]: I1202 23:22:02.821064 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5c4ae7e-90d7-4090-9357-77e09a38d4f6-inventory" (OuterVolumeSpecName: "inventory") pod "b5c4ae7e-90d7-4090-9357-77e09a38d4f6" (UID: "b5c4ae7e-90d7-4090-9357-77e09a38d4f6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:22:02 crc kubenswrapper[4903]: I1202 23:22:02.826608 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5c4ae7e-90d7-4090-9357-77e09a38d4f6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b5c4ae7e-90d7-4090-9357-77e09a38d4f6" (UID: "b5c4ae7e-90d7-4090-9357-77e09a38d4f6"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:22:02 crc kubenswrapper[4903]: I1202 23:22:02.884074 4903 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b5c4ae7e-90d7-4090-9357-77e09a38d4f6-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 23:22:02 crc kubenswrapper[4903]: I1202 23:22:02.884111 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqdf7\" (UniqueName: \"kubernetes.io/projected/b5c4ae7e-90d7-4090-9357-77e09a38d4f6-kube-api-access-vqdf7\") on node \"crc\" DevicePath \"\"" Dec 02 23:22:02 crc kubenswrapper[4903]: I1202 23:22:02.884127 4903 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b5c4ae7e-90d7-4090-9357-77e09a38d4f6-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 23:22:03 crc kubenswrapper[4903]: I1202 23:22:03.181636 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-crjgz" event={"ID":"b5c4ae7e-90d7-4090-9357-77e09a38d4f6","Type":"ContainerDied","Data":"b2b79d007f52c79b38b3c6f049c020a7faf0f91393e3277108bd1ec81f115681"} Dec 02 23:22:03 crc kubenswrapper[4903]: I1202 23:22:03.181721 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2b79d007f52c79b38b3c6f049c020a7faf0f91393e3277108bd1ec81f115681" Dec 02 23:22:03 crc kubenswrapper[4903]: I1202 23:22:03.181738 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-crjgz" Dec 02 23:22:03 crc kubenswrapper[4903]: I1202 23:22:03.258447 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lhvgc"] Dec 02 23:22:03 crc kubenswrapper[4903]: E1202 23:22:03.258891 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5c4ae7e-90d7-4090-9357-77e09a38d4f6" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 02 23:22:03 crc kubenswrapper[4903]: I1202 23:22:03.258909 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5c4ae7e-90d7-4090-9357-77e09a38d4f6" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 02 23:22:03 crc kubenswrapper[4903]: I1202 23:22:03.259100 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5c4ae7e-90d7-4090-9357-77e09a38d4f6" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 02 23:22:03 crc kubenswrapper[4903]: I1202 23:22:03.259781 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lhvgc" Dec 02 23:22:03 crc kubenswrapper[4903]: I1202 23:22:03.262172 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9z6rx" Dec 02 23:22:03 crc kubenswrapper[4903]: I1202 23:22:03.262625 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 23:22:03 crc kubenswrapper[4903]: I1202 23:22:03.262836 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 23:22:03 crc kubenswrapper[4903]: I1202 23:22:03.262947 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 23:22:03 crc kubenswrapper[4903]: I1202 23:22:03.282594 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lhvgc"] Dec 02 23:22:03 crc kubenswrapper[4903]: I1202 23:22:03.394581 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/270d4936-772f-40a2-8da3-f2651a216d6b-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-lhvgc\" (UID: \"270d4936-772f-40a2-8da3-f2651a216d6b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lhvgc" Dec 02 23:22:03 crc kubenswrapper[4903]: I1202 23:22:03.394625 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/270d4936-772f-40a2-8da3-f2651a216d6b-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-lhvgc\" (UID: \"270d4936-772f-40a2-8da3-f2651a216d6b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lhvgc" Dec 02 23:22:03 crc kubenswrapper[4903]: I1202 23:22:03.394787 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kzl5\" (UniqueName: \"kubernetes.io/projected/270d4936-772f-40a2-8da3-f2651a216d6b-kube-api-access-6kzl5\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-lhvgc\" (UID: \"270d4936-772f-40a2-8da3-f2651a216d6b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lhvgc" Dec 02 23:22:03 crc kubenswrapper[4903]: I1202 23:22:03.394842 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/270d4936-772f-40a2-8da3-f2651a216d6b-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-lhvgc\" (UID: \"270d4936-772f-40a2-8da3-f2651a216d6b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lhvgc" Dec 02 23:22:03 crc kubenswrapper[4903]: I1202 23:22:03.496277 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/270d4936-772f-40a2-8da3-f2651a216d6b-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-lhvgc\" (UID: \"270d4936-772f-40a2-8da3-f2651a216d6b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lhvgc" Dec 02 23:22:03 crc kubenswrapper[4903]: I1202 23:22:03.496336 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/270d4936-772f-40a2-8da3-f2651a216d6b-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-lhvgc\" (UID: 
\"270d4936-772f-40a2-8da3-f2651a216d6b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lhvgc" Dec 02 23:22:03 crc kubenswrapper[4903]: I1202 23:22:03.496423 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kzl5\" (UniqueName: \"kubernetes.io/projected/270d4936-772f-40a2-8da3-f2651a216d6b-kube-api-access-6kzl5\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-lhvgc\" (UID: \"270d4936-772f-40a2-8da3-f2651a216d6b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lhvgc" Dec 02 23:22:03 crc kubenswrapper[4903]: I1202 23:22:03.496475 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/270d4936-772f-40a2-8da3-f2651a216d6b-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-lhvgc\" (UID: \"270d4936-772f-40a2-8da3-f2651a216d6b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lhvgc" Dec 02 23:22:03 crc kubenswrapper[4903]: I1202 23:22:03.500323 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/270d4936-772f-40a2-8da3-f2651a216d6b-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-lhvgc\" (UID: \"270d4936-772f-40a2-8da3-f2651a216d6b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lhvgc" Dec 02 23:22:03 crc kubenswrapper[4903]: I1202 23:22:03.500608 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/270d4936-772f-40a2-8da3-f2651a216d6b-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-lhvgc\" (UID: \"270d4936-772f-40a2-8da3-f2651a216d6b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lhvgc" Dec 02 23:22:03 crc kubenswrapper[4903]: I1202 23:22:03.509165 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/270d4936-772f-40a2-8da3-f2651a216d6b-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-lhvgc\" (UID: \"270d4936-772f-40a2-8da3-f2651a216d6b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lhvgc" Dec 02 23:22:03 crc kubenswrapper[4903]: I1202 23:22:03.513110 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kzl5\" (UniqueName: \"kubernetes.io/projected/270d4936-772f-40a2-8da3-f2651a216d6b-kube-api-access-6kzl5\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-lhvgc\" (UID: \"270d4936-772f-40a2-8da3-f2651a216d6b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lhvgc" Dec 02 23:22:03 crc kubenswrapper[4903]: I1202 23:22:03.577885 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lhvgc" Dec 02 23:22:04 crc kubenswrapper[4903]: I1202 23:22:04.124434 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lhvgc"] Dec 02 23:22:04 crc kubenswrapper[4903]: W1202 23:22:04.132747 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod270d4936_772f_40a2_8da3_f2651a216d6b.slice/crio-fae1fdf90f254e1240150b87d6a53b194cf2c9c1356da85d1f3699a85976ed6d WatchSource:0}: Error finding container fae1fdf90f254e1240150b87d6a53b194cf2c9c1356da85d1f3699a85976ed6d: Status 404 returned error can't find the container with id fae1fdf90f254e1240150b87d6a53b194cf2c9c1356da85d1f3699a85976ed6d Dec 02 23:22:04 crc kubenswrapper[4903]: I1202 23:22:04.197463 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lhvgc" event={"ID":"270d4936-772f-40a2-8da3-f2651a216d6b","Type":"ContainerStarted","Data":"fae1fdf90f254e1240150b87d6a53b194cf2c9c1356da85d1f3699a85976ed6d"} Dec 02 23:22:05 crc kubenswrapper[4903]: I1202 23:22:05.208820 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lhvgc" event={"ID":"270d4936-772f-40a2-8da3-f2651a216d6b","Type":"ContainerStarted","Data":"fd8e3fb1fa6bdbe59d5e9f9e25f94dd03d2225972b359063bda6b716051b51a6"} Dec 02 23:22:05 crc kubenswrapper[4903]: I1202 23:22:05.243199 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lhvgc" podStartSLOduration=1.8005202900000001 podStartE2EDuration="2.243180895s" podCreationTimestamp="2025-12-02 23:22:03 +0000 UTC" firstStartedPulling="2025-12-02 23:22:04.136798633 +0000 UTC m=+1462.845352956" lastFinishedPulling="2025-12-02 23:22:04.579459248 +0000 UTC m=+1463.288013561" observedRunningTime="2025-12-02 23:22:05.227031245 +0000 UTC m=+1463.935585538" watchObservedRunningTime="2025-12-02 23:22:05.243180895 +0000 UTC m=+1463.951735178" Dec 02 23:22:23 crc kubenswrapper[4903]: I1202 23:22:23.069553 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:22:23 crc kubenswrapper[4903]: I1202 23:22:23.070418 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:22:25 crc kubenswrapper[4903]: I1202 23:22:25.828397 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-r6hp8"] Dec 02 23:22:25 crc kubenswrapper[4903]: I1202 23:22:25.834315 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r6hp8" Dec 02 23:22:25 crc kubenswrapper[4903]: I1202 23:22:25.859903 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r6hp8"] Dec 02 23:22:25 crc kubenswrapper[4903]: I1202 23:22:25.902498 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm6wl\" (UniqueName: \"kubernetes.io/projected/32f326ef-b524-442e-bf69-46e7dc184926-kube-api-access-qm6wl\") pod \"redhat-operators-r6hp8\" (UID: \"32f326ef-b524-442e-bf69-46e7dc184926\") " pod="openshift-marketplace/redhat-operators-r6hp8" Dec 02 23:22:25 crc kubenswrapper[4903]: I1202 23:22:25.902702 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32f326ef-b524-442e-bf69-46e7dc184926-utilities\") pod \"redhat-operators-r6hp8\" (UID: \"32f326ef-b524-442e-bf69-46e7dc184926\") " pod="openshift-marketplace/redhat-operators-r6hp8" Dec 02 23:22:25 crc kubenswrapper[4903]: I1202 23:22:25.902733 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32f326ef-b524-442e-bf69-46e7dc184926-catalog-content\") pod \"redhat-operators-r6hp8\" (UID: \"32f326ef-b524-442e-bf69-46e7dc184926\") " pod="openshift-marketplace/redhat-operators-r6hp8" Dec 02 23:22:26 crc kubenswrapper[4903]: I1202 23:22:26.003943 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm6wl\" (UniqueName: \"kubernetes.io/projected/32f326ef-b524-442e-bf69-46e7dc184926-kube-api-access-qm6wl\") pod \"redhat-operators-r6hp8\" (UID: \"32f326ef-b524-442e-bf69-46e7dc184926\") " pod="openshift-marketplace/redhat-operators-r6hp8" Dec 02 23:22:26 crc kubenswrapper[4903]: I1202 23:22:26.004075 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32f326ef-b524-442e-bf69-46e7dc184926-utilities\") pod \"redhat-operators-r6hp8\" (UID: \"32f326ef-b524-442e-bf69-46e7dc184926\") " pod="openshift-marketplace/redhat-operators-r6hp8" Dec 02 23:22:26 crc kubenswrapper[4903]: I1202 23:22:26.004131 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32f326ef-b524-442e-bf69-46e7dc184926-catalog-content\") pod \"redhat-operators-r6hp8\" (UID: \"32f326ef-b524-442e-bf69-46e7dc184926\") " pod="openshift-marketplace/redhat-operators-r6hp8" Dec 02 23:22:26 crc kubenswrapper[4903]: I1202 23:22:26.004555 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32f326ef-b524-442e-bf69-46e7dc184926-catalog-content\") pod \"redhat-operators-r6hp8\" (UID: \"32f326ef-b524-442e-bf69-46e7dc184926\") " pod="openshift-marketplace/redhat-operators-r6hp8" Dec 02 23:22:26 crc kubenswrapper[4903]: I1202 23:22:26.004865 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32f326ef-b524-442e-bf69-46e7dc184926-utilities\") pod \"redhat-operators-r6hp8\" (UID: \"32f326ef-b524-442e-bf69-46e7dc184926\") " pod="openshift-marketplace/redhat-operators-r6hp8" Dec 02 23:22:26 crc kubenswrapper[4903]: I1202 23:22:26.025612 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qm6wl\" (UniqueName: \"kubernetes.io/projected/32f326ef-b524-442e-bf69-46e7dc184926-kube-api-access-qm6wl\") pod \"redhat-operators-r6hp8\" (UID: \"32f326ef-b524-442e-bf69-46e7dc184926\") " pod="openshift-marketplace/redhat-operators-r6hp8" Dec 02 23:22:26 crc kubenswrapper[4903]: I1202 23:22:26.168434 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r6hp8" Dec 02 23:22:26 crc kubenswrapper[4903]: I1202 23:22:26.628143 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r6hp8"] Dec 02 23:22:26 crc kubenswrapper[4903]: W1202 23:22:26.635051 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32f326ef_b524_442e_bf69_46e7dc184926.slice/crio-e93ae1f7cbe1f4c98a68f4a2009dfaecf81dcc05c21a73f63324f9dd332cb2c2 WatchSource:0}: Error finding container e93ae1f7cbe1f4c98a68f4a2009dfaecf81dcc05c21a73f63324f9dd332cb2c2: Status 404 returned error can't find the container with id e93ae1f7cbe1f4c98a68f4a2009dfaecf81dcc05c21a73f63324f9dd332cb2c2 Dec 02 23:22:27 crc kubenswrapper[4903]: I1202 23:22:27.495752 4903 generic.go:334] "Generic (PLEG): container finished" podID="32f326ef-b524-442e-bf69-46e7dc184926" containerID="53c258742e8d6922a44164fe49f6238c758a0002618b631b6cd1721c5c925632" exitCode=0 Dec 02 23:22:27 crc kubenswrapper[4903]: I1202 23:22:27.495857 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6hp8" event={"ID":"32f326ef-b524-442e-bf69-46e7dc184926","Type":"ContainerDied","Data":"53c258742e8d6922a44164fe49f6238c758a0002618b631b6cd1721c5c925632"} Dec 02 23:22:27 crc kubenswrapper[4903]: I1202 23:22:27.496184 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6hp8" event={"ID":"32f326ef-b524-442e-bf69-46e7dc184926","Type":"ContainerStarted","Data":"e93ae1f7cbe1f4c98a68f4a2009dfaecf81dcc05c21a73f63324f9dd332cb2c2"} Dec 02 23:22:28 crc kubenswrapper[4903]: I1202 23:22:28.512827 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6hp8" event={"ID":"32f326ef-b524-442e-bf69-46e7dc184926","Type":"ContainerStarted","Data":"4dc833f87b1a92d677bbcc59f0ea34ce390c9217a88bae6bb597848c519b584e"} Dec 02 23:22:31 crc kubenswrapper[4903]: I1202 23:22:31.549045 4903 generic.go:334] "Generic (PLEG): container finished" podID="32f326ef-b524-442e-bf69-46e7dc184926" containerID="4dc833f87b1a92d677bbcc59f0ea34ce390c9217a88bae6bb597848c519b584e" exitCode=0 Dec 02 23:22:31 crc kubenswrapper[4903]: I1202 23:22:31.549118 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6hp8" event={"ID":"32f326ef-b524-442e-bf69-46e7dc184926","Type":"ContainerDied","Data":"4dc833f87b1a92d677bbcc59f0ea34ce390c9217a88bae6bb597848c519b584e"} Dec 02 23:22:32 crc kubenswrapper[4903]: I1202 23:22:32.560295 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6hp8" event={"ID":"32f326ef-b524-442e-bf69-46e7dc184926","Type":"ContainerStarted","Data":"f9381713b9211d2d200031310cdf06a8be2e685a06fdb024dc08cbc83690411b"} Dec 02 23:22:32 crc kubenswrapper[4903]: I1202 23:22:32.583587 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-r6hp8" podStartSLOduration=2.904018325 podStartE2EDuration="7.583568021s" 
podCreationTimestamp="2025-12-02 23:22:25 +0000 UTC" firstStartedPulling="2025-12-02 23:22:27.498690991 +0000 UTC m=+1486.207245274" lastFinishedPulling="2025-12-02 23:22:32.178240687 +0000 UTC m=+1490.886794970" observedRunningTime="2025-12-02 23:22:32.582191738 +0000 UTC m=+1491.290746021" watchObservedRunningTime="2025-12-02 23:22:32.583568021 +0000 UTC m=+1491.292122304" Dec 02 23:22:36 crc kubenswrapper[4903]: I1202 23:22:36.168692 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-r6hp8" Dec 02 23:22:36 crc kubenswrapper[4903]: I1202 23:22:36.168929 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-r6hp8" Dec 02 23:22:37 crc kubenswrapper[4903]: I1202 23:22:37.235808 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-r6hp8" podUID="32f326ef-b524-442e-bf69-46e7dc184926" containerName="registry-server" probeResult="failure" output=< Dec 02 23:22:37 crc kubenswrapper[4903]: timeout: failed to connect service ":50051" within 1s Dec 02 23:22:37 crc kubenswrapper[4903]: > Dec 02 23:22:46 crc kubenswrapper[4903]: I1202 23:22:46.252167 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-r6hp8" Dec 02 23:22:46 crc kubenswrapper[4903]: I1202 23:22:46.336525 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-r6hp8" Dec 02 23:22:46 crc kubenswrapper[4903]: I1202 23:22:46.495011 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r6hp8"] Dec 02 23:22:47 crc kubenswrapper[4903]: I1202 23:22:47.713528 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-r6hp8" podUID="32f326ef-b524-442e-bf69-46e7dc184926" containerName="registry-server" containerID="cri-o://f9381713b9211d2d200031310cdf06a8be2e685a06fdb024dc08cbc83690411b" gracePeriod=2 Dec 02 23:22:48 crc kubenswrapper[4903]: I1202 23:22:48.235891 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r6hp8" Dec 02 23:22:48 crc kubenswrapper[4903]: I1202 23:22:48.351074 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32f326ef-b524-442e-bf69-46e7dc184926-catalog-content\") pod \"32f326ef-b524-442e-bf69-46e7dc184926\" (UID: \"32f326ef-b524-442e-bf69-46e7dc184926\") " Dec 02 23:22:48 crc kubenswrapper[4903]: I1202 23:22:48.351141 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32f326ef-b524-442e-bf69-46e7dc184926-utilities\") pod \"32f326ef-b524-442e-bf69-46e7dc184926\" (UID: \"32f326ef-b524-442e-bf69-46e7dc184926\") " Dec 02 23:22:48 crc kubenswrapper[4903]: I1202 23:22:48.351288 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qm6wl\" (UniqueName: \"kubernetes.io/projected/32f326ef-b524-442e-bf69-46e7dc184926-kube-api-access-qm6wl\") pod \"32f326ef-b524-442e-bf69-46e7dc184926\" (UID: \"32f326ef-b524-442e-bf69-46e7dc184926\") " Dec 02 23:22:48 crc kubenswrapper[4903]: I1202 23:22:48.352378 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32f326ef-b524-442e-bf69-46e7dc184926-utilities" (OuterVolumeSpecName: "utilities") pod "32f326ef-b524-442e-bf69-46e7dc184926" (UID: "32f326ef-b524-442e-bf69-46e7dc184926"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:22:48 crc kubenswrapper[4903]: I1202 23:22:48.356840 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32f326ef-b524-442e-bf69-46e7dc184926-kube-api-access-qm6wl" (OuterVolumeSpecName: "kube-api-access-qm6wl") pod "32f326ef-b524-442e-bf69-46e7dc184926" (UID: "32f326ef-b524-442e-bf69-46e7dc184926"). InnerVolumeSpecName "kube-api-access-qm6wl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:22:48 crc kubenswrapper[4903]: I1202 23:22:48.453661 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qm6wl\" (UniqueName: \"kubernetes.io/projected/32f326ef-b524-442e-bf69-46e7dc184926-kube-api-access-qm6wl\") on node \"crc\" DevicePath \"\"" Dec 02 23:22:48 crc kubenswrapper[4903]: I1202 23:22:48.453714 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32f326ef-b524-442e-bf69-46e7dc184926-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:22:48 crc kubenswrapper[4903]: I1202 23:22:48.500350 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32f326ef-b524-442e-bf69-46e7dc184926-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "32f326ef-b524-442e-bf69-46e7dc184926" (UID: "32f326ef-b524-442e-bf69-46e7dc184926"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:22:48 crc kubenswrapper[4903]: I1202 23:22:48.555719 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32f326ef-b524-442e-bf69-46e7dc184926-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:22:48 crc kubenswrapper[4903]: I1202 23:22:48.730021 4903 generic.go:334] "Generic (PLEG): container finished" podID="32f326ef-b524-442e-bf69-46e7dc184926" containerID="f9381713b9211d2d200031310cdf06a8be2e685a06fdb024dc08cbc83690411b" exitCode=0 Dec 02 23:22:48 crc kubenswrapper[4903]: I1202 23:22:48.730084 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6hp8" event={"ID":"32f326ef-b524-442e-bf69-46e7dc184926","Type":"ContainerDied","Data":"f9381713b9211d2d200031310cdf06a8be2e685a06fdb024dc08cbc83690411b"} Dec 02 23:22:48 crc kubenswrapper[4903]: I1202 23:22:48.730131 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6hp8" event={"ID":"32f326ef-b524-442e-bf69-46e7dc184926","Type":"ContainerDied","Data":"e93ae1f7cbe1f4c98a68f4a2009dfaecf81dcc05c21a73f63324f9dd332cb2c2"} Dec 02 23:22:48 crc kubenswrapper[4903]: I1202 23:22:48.730160 4903 scope.go:117] "RemoveContainer" containerID="f9381713b9211d2d200031310cdf06a8be2e685a06fdb024dc08cbc83690411b" Dec 02 23:22:48 crc kubenswrapper[4903]: I1202 23:22:48.730159 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r6hp8" Dec 02 23:22:48 crc kubenswrapper[4903]: I1202 23:22:48.769786 4903 scope.go:117] "RemoveContainer" containerID="4dc833f87b1a92d677bbcc59f0ea34ce390c9217a88bae6bb597848c519b584e" Dec 02 23:22:48 crc kubenswrapper[4903]: I1202 23:22:48.801172 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r6hp8"] Dec 02 23:22:48 crc kubenswrapper[4903]: I1202 23:22:48.816034 4903 scope.go:117] "RemoveContainer" containerID="53c258742e8d6922a44164fe49f6238c758a0002618b631b6cd1721c5c925632" Dec 02 23:22:48 crc kubenswrapper[4903]: I1202 23:22:48.822222 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-r6hp8"] Dec 02 23:22:48 crc kubenswrapper[4903]: I1202 23:22:48.865753 4903 scope.go:117] "RemoveContainer" containerID="f9381713b9211d2d200031310cdf06a8be2e685a06fdb024dc08cbc83690411b" Dec 02 23:22:48 crc kubenswrapper[4903]: E1202 23:22:48.866618 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9381713b9211d2d200031310cdf06a8be2e685a06fdb024dc08cbc83690411b\": container with ID starting with f9381713b9211d2d200031310cdf06a8be2e685a06fdb024dc08cbc83690411b not found: ID does not exist" containerID="f9381713b9211d2d200031310cdf06a8be2e685a06fdb024dc08cbc83690411b" Dec 02 23:22:48 crc kubenswrapper[4903]: I1202 23:22:48.866734 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9381713b9211d2d200031310cdf06a8be2e685a06fdb024dc08cbc83690411b"} err="failed to get container status \"f9381713b9211d2d200031310cdf06a8be2e685a06fdb024dc08cbc83690411b\": rpc error: code = NotFound desc = could not find container \"f9381713b9211d2d200031310cdf06a8be2e685a06fdb024dc08cbc83690411b\": container with ID starting with f9381713b9211d2d200031310cdf06a8be2e685a06fdb024dc08cbc83690411b not found: ID does not exist" Dec 02 23:22:48 crc 
kubenswrapper[4903]: I1202 23:22:48.866818 4903 scope.go:117] "RemoveContainer" containerID="4dc833f87b1a92d677bbcc59f0ea34ce390c9217a88bae6bb597848c519b584e" Dec 02 23:22:48 crc kubenswrapper[4903]: E1202 23:22:48.867325 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dc833f87b1a92d677bbcc59f0ea34ce390c9217a88bae6bb597848c519b584e\": container with ID starting with 4dc833f87b1a92d677bbcc59f0ea34ce390c9217a88bae6bb597848c519b584e not found: ID does not exist" containerID="4dc833f87b1a92d677bbcc59f0ea34ce390c9217a88bae6bb597848c519b584e" Dec 02 23:22:48 crc kubenswrapper[4903]: I1202 23:22:48.867374 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dc833f87b1a92d677bbcc59f0ea34ce390c9217a88bae6bb597848c519b584e"} err="failed to get container status \"4dc833f87b1a92d677bbcc59f0ea34ce390c9217a88bae6bb597848c519b584e\": rpc error: code = NotFound desc = could not find container \"4dc833f87b1a92d677bbcc59f0ea34ce390c9217a88bae6bb597848c519b584e\": container with ID starting with 4dc833f87b1a92d677bbcc59f0ea34ce390c9217a88bae6bb597848c519b584e not found: ID does not exist" Dec 02 23:22:48 crc kubenswrapper[4903]: I1202 23:22:48.867435 4903 scope.go:117] "RemoveContainer" containerID="53c258742e8d6922a44164fe49f6238c758a0002618b631b6cd1721c5c925632" Dec 02 23:22:48 crc kubenswrapper[4903]: E1202 23:22:48.867988 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53c258742e8d6922a44164fe49f6238c758a0002618b631b6cd1721c5c925632\": container with ID starting with 53c258742e8d6922a44164fe49f6238c758a0002618b631b6cd1721c5c925632 not found: ID does not exist" containerID="53c258742e8d6922a44164fe49f6238c758a0002618b631b6cd1721c5c925632" Dec 02 23:22:48 crc kubenswrapper[4903]: I1202 23:22:48.868071 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53c258742e8d6922a44164fe49f6238c758a0002618b631b6cd1721c5c925632"} err="failed to get container status \"53c258742e8d6922a44164fe49f6238c758a0002618b631b6cd1721c5c925632\": rpc error: code = NotFound desc = could not find container \"53c258742e8d6922a44164fe49f6238c758a0002618b631b6cd1721c5c925632\": container with ID starting with 53c258742e8d6922a44164fe49f6238c758a0002618b631b6cd1721c5c925632 not found: ID does not exist" Dec 02 23:22:49 crc kubenswrapper[4903]: I1202 23:22:49.635141 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32f326ef-b524-442e-bf69-46e7dc184926" path="/var/lib/kubelet/pods/32f326ef-b524-442e-bf69-46e7dc184926/volumes" Dec 02 23:22:53 crc kubenswrapper[4903]: I1202 23:22:53.070253 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:22:53 crc kubenswrapper[4903]: I1202 23:22:53.070896 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:23:01 crc kubenswrapper[4903]: I1202 23:23:01.468481 4903 scope.go:117] "RemoveContainer" 
containerID="1e1fe13234c90a5746f503e83c23d95610a319b063ef9da865233ab6b18b3b04" Dec 02 23:23:01 crc kubenswrapper[4903]: I1202 23:23:01.528920 4903 scope.go:117] "RemoveContainer" containerID="ec86184f533730ef740f956a3a55cdef5876deb56101395eba447b01c1c43ce9" Dec 02 23:23:01 crc kubenswrapper[4903]: I1202 23:23:01.564049 4903 scope.go:117] "RemoveContainer" containerID="9bc1786dafda95c7a91b9d71ba47a1486b6b698cb2827edeca4b558247096a06" Dec 02 23:23:23 crc kubenswrapper[4903]: I1202 23:23:23.070448 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:23:23 crc kubenswrapper[4903]: I1202 23:23:23.071297 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:23:23 crc kubenswrapper[4903]: I1202 23:23:23.071379 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" Dec 02 23:23:23 crc kubenswrapper[4903]: I1202 23:23:23.072794 4903 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"eea089a1153012030d0c185f960d3bfe8298bb2908765149f2079ec873b323eb"} pod="openshift-machine-config-operator/machine-config-daemon-snl4q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 23:23:23 crc kubenswrapper[4903]: I1202 23:23:23.072969 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" containerID="cri-o://eea089a1153012030d0c185f960d3bfe8298bb2908765149f2079ec873b323eb" gracePeriod=600 Dec 02 23:23:23 crc kubenswrapper[4903]: E1202 23:23:23.199711 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:23:24 crc kubenswrapper[4903]: I1202 23:23:24.176756 4903 generic.go:334] "Generic (PLEG): container finished" podID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerID="eea089a1153012030d0c185f960d3bfe8298bb2908765149f2079ec873b323eb" exitCode=0 Dec 02 23:23:24 crc kubenswrapper[4903]: I1202 23:23:24.176882 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerDied","Data":"eea089a1153012030d0c185f960d3bfe8298bb2908765149f2079ec873b323eb"} Dec 02 23:23:24 crc kubenswrapper[4903]: I1202 23:23:24.177176 4903 scope.go:117] "RemoveContainer" containerID="74d6b77c13dd80086b2c813f620edffd3efb1418a840277ffcbe0978aaf6798a" Dec 02 23:23:24 crc kubenswrapper[4903]: I1202 23:23:24.177751 4903 
scope.go:117] "RemoveContainer" containerID="eea089a1153012030d0c185f960d3bfe8298bb2908765149f2079ec873b323eb" Dec 02 23:23:24 crc kubenswrapper[4903]: E1202 23:23:24.178200 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:23:26 crc kubenswrapper[4903]: I1202 23:23:26.178585 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pw9wj"] Dec 02 23:23:26 crc kubenswrapper[4903]: E1202 23:23:26.179401 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32f326ef-b524-442e-bf69-46e7dc184926" containerName="registry-server" Dec 02 23:23:26 crc kubenswrapper[4903]: I1202 23:23:26.179419 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="32f326ef-b524-442e-bf69-46e7dc184926" containerName="registry-server" Dec 02 23:23:26 crc kubenswrapper[4903]: E1202 23:23:26.179441 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32f326ef-b524-442e-bf69-46e7dc184926" containerName="extract-content" Dec 02 23:23:26 crc kubenswrapper[4903]: I1202 23:23:26.179450 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="32f326ef-b524-442e-bf69-46e7dc184926" containerName="extract-content" Dec 02 23:23:26 crc kubenswrapper[4903]: E1202 23:23:26.179468 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32f326ef-b524-442e-bf69-46e7dc184926" containerName="extract-utilities" Dec 02 23:23:26 crc kubenswrapper[4903]: I1202 23:23:26.179479 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="32f326ef-b524-442e-bf69-46e7dc184926" containerName="extract-utilities" Dec 02 23:23:26 crc kubenswrapper[4903]: I1202 23:23:26.180433 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="32f326ef-b524-442e-bf69-46e7dc184926" containerName="registry-server" Dec 02 23:23:26 crc kubenswrapper[4903]: I1202 23:23:26.183490 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pw9wj" Dec 02 23:23:26 crc kubenswrapper[4903]: I1202 23:23:26.213445 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pw9wj"] Dec 02 23:23:26 crc kubenswrapper[4903]: I1202 23:23:26.321804 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aecf00f9-823d-4cb0-a4b8-4e62df713014-utilities\") pod \"redhat-marketplace-pw9wj\" (UID: \"aecf00f9-823d-4cb0-a4b8-4e62df713014\") " pod="openshift-marketplace/redhat-marketplace-pw9wj" Dec 02 23:23:26 crc kubenswrapper[4903]: I1202 23:23:26.322130 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aecf00f9-823d-4cb0-a4b8-4e62df713014-catalog-content\") pod \"redhat-marketplace-pw9wj\" (UID: \"aecf00f9-823d-4cb0-a4b8-4e62df713014\") " pod="openshift-marketplace/redhat-marketplace-pw9wj" Dec 02 23:23:26 crc kubenswrapper[4903]: I1202 23:23:26.322368 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w887n\" (UniqueName: \"kubernetes.io/projected/aecf00f9-823d-4cb0-a4b8-4e62df713014-kube-api-access-w887n\") pod \"redhat-marketplace-pw9wj\" (UID: \"aecf00f9-823d-4cb0-a4b8-4e62df713014\") " pod="openshift-marketplace/redhat-marketplace-pw9wj" Dec 02 23:23:26 crc kubenswrapper[4903]: I1202 23:23:26.424939 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aecf00f9-823d-4cb0-a4b8-4e62df713014-catalog-content\") pod \"redhat-marketplace-pw9wj\" (UID: \"aecf00f9-823d-4cb0-a4b8-4e62df713014\") " pod="openshift-marketplace/redhat-marketplace-pw9wj" Dec 02 23:23:26 crc kubenswrapper[4903]: I1202 23:23:26.425100 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w887n\" (UniqueName: \"kubernetes.io/projected/aecf00f9-823d-4cb0-a4b8-4e62df713014-kube-api-access-w887n\") pod \"redhat-marketplace-pw9wj\" (UID: \"aecf00f9-823d-4cb0-a4b8-4e62df713014\") " pod="openshift-marketplace/redhat-marketplace-pw9wj" Dec 02 23:23:26 crc kubenswrapper[4903]: I1202 23:23:26.425245 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aecf00f9-823d-4cb0-a4b8-4e62df713014-utilities\") pod \"redhat-marketplace-pw9wj\" (UID: \"aecf00f9-823d-4cb0-a4b8-4e62df713014\") " pod="openshift-marketplace/redhat-marketplace-pw9wj" Dec 02 23:23:26 crc kubenswrapper[4903]: I1202 23:23:26.425573 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aecf00f9-823d-4cb0-a4b8-4e62df713014-catalog-content\") pod \"redhat-marketplace-pw9wj\" (UID: \"aecf00f9-823d-4cb0-a4b8-4e62df713014\") " pod="openshift-marketplace/redhat-marketplace-pw9wj" Dec 02 23:23:26 crc kubenswrapper[4903]: I1202 23:23:26.425702 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aecf00f9-823d-4cb0-a4b8-4e62df713014-utilities\") pod \"redhat-marketplace-pw9wj\" (UID: \"aecf00f9-823d-4cb0-a4b8-4e62df713014\") " pod="openshift-marketplace/redhat-marketplace-pw9wj" Dec 02 23:23:26 crc kubenswrapper[4903]: I1202 23:23:26.447407 4903 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-w887n\" (UniqueName: \"kubernetes.io/projected/aecf00f9-823d-4cb0-a4b8-4e62df713014-kube-api-access-w887n\") pod \"redhat-marketplace-pw9wj\" (UID: \"aecf00f9-823d-4cb0-a4b8-4e62df713014\") " pod="openshift-marketplace/redhat-marketplace-pw9wj" Dec 02 23:23:26 crc kubenswrapper[4903]: I1202 23:23:26.521837 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pw9wj" Dec 02 23:23:27 crc kubenswrapper[4903]: I1202 23:23:27.062976 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pw9wj"] Dec 02 23:23:27 crc kubenswrapper[4903]: I1202 23:23:27.230759 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pw9wj" event={"ID":"aecf00f9-823d-4cb0-a4b8-4e62df713014","Type":"ContainerStarted","Data":"7292379ad8c094b32433f74695ac43721125b8506ee79a7f828b4d4fe177fa56"} Dec 02 23:23:28 crc kubenswrapper[4903]: I1202 23:23:28.276432 4903 generic.go:334] "Generic (PLEG): container finished" podID="aecf00f9-823d-4cb0-a4b8-4e62df713014" containerID="7cbe5f766816b1d2132c3d4ad60dfa0cdb197baf2d8e6edcf9c8128392cf2b9b" exitCode=0 Dec 02 23:23:28 crc kubenswrapper[4903]: I1202 23:23:28.276615 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pw9wj" event={"ID":"aecf00f9-823d-4cb0-a4b8-4e62df713014","Type":"ContainerDied","Data":"7cbe5f766816b1d2132c3d4ad60dfa0cdb197baf2d8e6edcf9c8128392cf2b9b"} Dec 02 23:23:29 crc kubenswrapper[4903]: I1202 23:23:29.294503 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pw9wj" event={"ID":"aecf00f9-823d-4cb0-a4b8-4e62df713014","Type":"ContainerStarted","Data":"d39381f19adeea1e02567d822532d1f4100c7d623f4a76d9e45ef80aeecab390"} Dec 02 23:23:30 crc kubenswrapper[4903]: I1202 23:23:30.336607 4903 generic.go:334] "Generic (PLEG): container finished" podID="aecf00f9-823d-4cb0-a4b8-4e62df713014" containerID="d39381f19adeea1e02567d822532d1f4100c7d623f4a76d9e45ef80aeecab390" exitCode=0 Dec 02 23:23:30 crc kubenswrapper[4903]: I1202 23:23:30.336738 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pw9wj" event={"ID":"aecf00f9-823d-4cb0-a4b8-4e62df713014","Type":"ContainerDied","Data":"d39381f19adeea1e02567d822532d1f4100c7d623f4a76d9e45ef80aeecab390"} Dec 02 23:23:31 crc kubenswrapper[4903]: I1202 23:23:31.354668 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pw9wj" event={"ID":"aecf00f9-823d-4cb0-a4b8-4e62df713014","Type":"ContainerStarted","Data":"72e788838c05ab111a3e53779605e53ec4c2e3eb559df7340b47bd478cd1595d"} Dec 02 23:23:35 crc kubenswrapper[4903]: I1202 23:23:35.613707 4903 scope.go:117] "RemoveContainer" containerID="eea089a1153012030d0c185f960d3bfe8298bb2908765149f2079ec873b323eb" Dec 02 23:23:35 crc kubenswrapper[4903]: E1202 23:23:35.614494 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:23:36 crc kubenswrapper[4903]: I1202 23:23:36.522753 4903 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pw9wj" Dec 02 23:23:36 crc kubenswrapper[4903]: I1202 23:23:36.523215 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pw9wj" Dec 02 23:23:36 crc kubenswrapper[4903]: I1202 23:23:36.583621 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pw9wj" Dec 02 23:23:36 crc kubenswrapper[4903]: I1202 23:23:36.605123 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pw9wj" podStartSLOduration=8.022036147 podStartE2EDuration="10.605102811s" podCreationTimestamp="2025-12-02 23:23:26 +0000 UTC" firstStartedPulling="2025-12-02 23:23:28.28497958 +0000 UTC m=+1546.993533903" lastFinishedPulling="2025-12-02 23:23:30.868046254 +0000 UTC m=+1549.576600567" observedRunningTime="2025-12-02 23:23:31.378809469 +0000 UTC m=+1550.087363802" watchObservedRunningTime="2025-12-02 23:23:36.605102811 +0000 UTC m=+1555.313657104" Dec 02 23:23:37 crc kubenswrapper[4903]: I1202 23:23:37.509965 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pw9wj" Dec 02 23:23:37 crc kubenswrapper[4903]: I1202 23:23:37.578788 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pw9wj"] Dec 02 23:23:39 crc kubenswrapper[4903]: I1202 23:23:39.467326 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pw9wj" podUID="aecf00f9-823d-4cb0-a4b8-4e62df713014" containerName="registry-server" containerID="cri-o://72e788838c05ab111a3e53779605e53ec4c2e3eb559df7340b47bd478cd1595d" gracePeriod=2 Dec 02 23:23:39 crc kubenswrapper[4903]: I1202 23:23:39.830546 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fclvs"] Dec 02 23:23:39 crc kubenswrapper[4903]: I1202 23:23:39.833286 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fclvs" Dec 02 23:23:39 crc kubenswrapper[4903]: I1202 23:23:39.850986 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fclvs"] Dec 02 23:23:39 crc kubenswrapper[4903]: I1202 23:23:39.943869 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db16a343-d9d6-4ea3-8426-fedba8348792-utilities\") pod \"community-operators-fclvs\" (UID: \"db16a343-d9d6-4ea3-8426-fedba8348792\") " pod="openshift-marketplace/community-operators-fclvs" Dec 02 23:23:39 crc kubenswrapper[4903]: I1202 23:23:39.944363 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54qfr\" (UniqueName: \"kubernetes.io/projected/db16a343-d9d6-4ea3-8426-fedba8348792-kube-api-access-54qfr\") pod \"community-operators-fclvs\" (UID: \"db16a343-d9d6-4ea3-8426-fedba8348792\") " pod="openshift-marketplace/community-operators-fclvs" Dec 02 23:23:39 crc kubenswrapper[4903]: I1202 23:23:39.944437 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db16a343-d9d6-4ea3-8426-fedba8348792-catalog-content\") pod \"community-operators-fclvs\" (UID: \"db16a343-d9d6-4ea3-8426-fedba8348792\") " pod="openshift-marketplace/community-operators-fclvs" Dec 02 23:23:40 crc kubenswrapper[4903]: I1202 23:23:40.046283 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db16a343-d9d6-4ea3-8426-fedba8348792-catalog-content\") pod \"community-operators-fclvs\" (UID: \"db16a343-d9d6-4ea3-8426-fedba8348792\") " pod="openshift-marketplace/community-operators-fclvs" Dec 02 23:23:40 crc kubenswrapper[4903]: I1202 23:23:40.046347 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db16a343-d9d6-4ea3-8426-fedba8348792-utilities\") pod \"community-operators-fclvs\" (UID: \"db16a343-d9d6-4ea3-8426-fedba8348792\") " pod="openshift-marketplace/community-operators-fclvs" Dec 02 23:23:40 crc kubenswrapper[4903]: I1202 23:23:40.046462 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54qfr\" (UniqueName: \"kubernetes.io/projected/db16a343-d9d6-4ea3-8426-fedba8348792-kube-api-access-54qfr\") pod \"community-operators-fclvs\" (UID: \"db16a343-d9d6-4ea3-8426-fedba8348792\") " pod="openshift-marketplace/community-operators-fclvs" Dec 02 23:23:40 crc kubenswrapper[4903]: I1202 23:23:40.046997 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db16a343-d9d6-4ea3-8426-fedba8348792-catalog-content\") pod \"community-operators-fclvs\" (UID: \"db16a343-d9d6-4ea3-8426-fedba8348792\") " pod="openshift-marketplace/community-operators-fclvs" Dec 02 23:23:40 crc kubenswrapper[4903]: I1202 23:23:40.047061 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db16a343-d9d6-4ea3-8426-fedba8348792-utilities\") pod \"community-operators-fclvs\" (UID: \"db16a343-d9d6-4ea3-8426-fedba8348792\") " pod="openshift-marketplace/community-operators-fclvs" Dec 02 23:23:40 crc kubenswrapper[4903]: I1202 23:23:40.065475 4903 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-54qfr\" (UniqueName: \"kubernetes.io/projected/db16a343-d9d6-4ea3-8426-fedba8348792-kube-api-access-54qfr\") pod \"community-operators-fclvs\" (UID: \"db16a343-d9d6-4ea3-8426-fedba8348792\") " pod="openshift-marketplace/community-operators-fclvs" Dec 02 23:23:40 crc kubenswrapper[4903]: I1202 23:23:40.155989 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pw9wj" Dec 02 23:23:40 crc kubenswrapper[4903]: I1202 23:23:40.161589 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fclvs" Dec 02 23:23:40 crc kubenswrapper[4903]: I1202 23:23:40.250687 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aecf00f9-823d-4cb0-a4b8-4e62df713014-utilities\") pod \"aecf00f9-823d-4cb0-a4b8-4e62df713014\" (UID: \"aecf00f9-823d-4cb0-a4b8-4e62df713014\") " Dec 02 23:23:40 crc kubenswrapper[4903]: I1202 23:23:40.250756 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w887n\" (UniqueName: \"kubernetes.io/projected/aecf00f9-823d-4cb0-a4b8-4e62df713014-kube-api-access-w887n\") pod \"aecf00f9-823d-4cb0-a4b8-4e62df713014\" (UID: \"aecf00f9-823d-4cb0-a4b8-4e62df713014\") " Dec 02 23:23:40 crc kubenswrapper[4903]: I1202 23:23:40.250781 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aecf00f9-823d-4cb0-a4b8-4e62df713014-catalog-content\") pod \"aecf00f9-823d-4cb0-a4b8-4e62df713014\" (UID: \"aecf00f9-823d-4cb0-a4b8-4e62df713014\") " Dec 02 23:23:40 crc kubenswrapper[4903]: I1202 23:23:40.256853 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aecf00f9-823d-4cb0-a4b8-4e62df713014-kube-api-access-w887n" (OuterVolumeSpecName: "kube-api-access-w887n") pod "aecf00f9-823d-4cb0-a4b8-4e62df713014" (UID: "aecf00f9-823d-4cb0-a4b8-4e62df713014"). InnerVolumeSpecName "kube-api-access-w887n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:23:40 crc kubenswrapper[4903]: I1202 23:23:40.257241 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aecf00f9-823d-4cb0-a4b8-4e62df713014-utilities" (OuterVolumeSpecName: "utilities") pod "aecf00f9-823d-4cb0-a4b8-4e62df713014" (UID: "aecf00f9-823d-4cb0-a4b8-4e62df713014"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:23:40 crc kubenswrapper[4903]: I1202 23:23:40.274198 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aecf00f9-823d-4cb0-a4b8-4e62df713014-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aecf00f9-823d-4cb0-a4b8-4e62df713014" (UID: "aecf00f9-823d-4cb0-a4b8-4e62df713014"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:23:40 crc kubenswrapper[4903]: I1202 23:23:40.353819 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aecf00f9-823d-4cb0-a4b8-4e62df713014-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:23:40 crc kubenswrapper[4903]: I1202 23:23:40.354116 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w887n\" (UniqueName: \"kubernetes.io/projected/aecf00f9-823d-4cb0-a4b8-4e62df713014-kube-api-access-w887n\") on node \"crc\" DevicePath \"\"" Dec 02 23:23:40 crc kubenswrapper[4903]: I1202 23:23:40.354127 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aecf00f9-823d-4cb0-a4b8-4e62df713014-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:23:40 crc kubenswrapper[4903]: I1202 23:23:40.479155 4903 generic.go:334] "Generic (PLEG): container finished" podID="aecf00f9-823d-4cb0-a4b8-4e62df713014" containerID="72e788838c05ab111a3e53779605e53ec4c2e3eb559df7340b47bd478cd1595d" exitCode=0 Dec 02 23:23:40 crc kubenswrapper[4903]: I1202 23:23:40.479202 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pw9wj" Dec 02 23:23:40 crc kubenswrapper[4903]: I1202 23:23:40.479201 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pw9wj" event={"ID":"aecf00f9-823d-4cb0-a4b8-4e62df713014","Type":"ContainerDied","Data":"72e788838c05ab111a3e53779605e53ec4c2e3eb559df7340b47bd478cd1595d"} Dec 02 23:23:40 crc kubenswrapper[4903]: I1202 23:23:40.479374 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pw9wj" event={"ID":"aecf00f9-823d-4cb0-a4b8-4e62df713014","Type":"ContainerDied","Data":"7292379ad8c094b32433f74695ac43721125b8506ee79a7f828b4d4fe177fa56"} Dec 02 23:23:40 crc kubenswrapper[4903]: I1202 23:23:40.479424 4903 scope.go:117] "RemoveContainer" containerID="72e788838c05ab111a3e53779605e53ec4c2e3eb559df7340b47bd478cd1595d" Dec 02 23:23:40 crc kubenswrapper[4903]: I1202 23:23:40.507915 4903 scope.go:117] "RemoveContainer" containerID="d39381f19adeea1e02567d822532d1f4100c7d623f4a76d9e45ef80aeecab390" Dec 02 23:23:40 crc kubenswrapper[4903]: I1202 23:23:40.514965 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pw9wj"] Dec 02 23:23:40 crc kubenswrapper[4903]: I1202 23:23:40.524443 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pw9wj"] Dec 02 23:23:40 crc kubenswrapper[4903]: I1202 23:23:40.531559 4903 scope.go:117] "RemoveContainer" containerID="7cbe5f766816b1d2132c3d4ad60dfa0cdb197baf2d8e6edcf9c8128392cf2b9b" Dec 02 23:23:40 crc kubenswrapper[4903]: I1202 23:23:40.564201 4903 scope.go:117] "RemoveContainer" containerID="72e788838c05ab111a3e53779605e53ec4c2e3eb559df7340b47bd478cd1595d" Dec 02 23:23:40 crc kubenswrapper[4903]: E1202 23:23:40.565283 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72e788838c05ab111a3e53779605e53ec4c2e3eb559df7340b47bd478cd1595d\": container with ID starting with 72e788838c05ab111a3e53779605e53ec4c2e3eb559df7340b47bd478cd1595d not found: ID does not exist" containerID="72e788838c05ab111a3e53779605e53ec4c2e3eb559df7340b47bd478cd1595d" Dec 02 23:23:40 crc kubenswrapper[4903]: I1202 23:23:40.565336 4903 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72e788838c05ab111a3e53779605e53ec4c2e3eb559df7340b47bd478cd1595d"} err="failed to get container status \"72e788838c05ab111a3e53779605e53ec4c2e3eb559df7340b47bd478cd1595d\": rpc error: code = NotFound desc = could not find container \"72e788838c05ab111a3e53779605e53ec4c2e3eb559df7340b47bd478cd1595d\": container with ID starting with 72e788838c05ab111a3e53779605e53ec4c2e3eb559df7340b47bd478cd1595d not found: ID does not exist" Dec 02 23:23:40 crc kubenswrapper[4903]: I1202 23:23:40.565365 4903 scope.go:117] "RemoveContainer" containerID="d39381f19adeea1e02567d822532d1f4100c7d623f4a76d9e45ef80aeecab390" Dec 02 23:23:40 crc kubenswrapper[4903]: E1202 23:23:40.565709 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d39381f19adeea1e02567d822532d1f4100c7d623f4a76d9e45ef80aeecab390\": container with ID starting with d39381f19adeea1e02567d822532d1f4100c7d623f4a76d9e45ef80aeecab390 not found: ID does not exist" containerID="d39381f19adeea1e02567d822532d1f4100c7d623f4a76d9e45ef80aeecab390" Dec 02 23:23:40 crc kubenswrapper[4903]: I1202 23:23:40.565735 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d39381f19adeea1e02567d822532d1f4100c7d623f4a76d9e45ef80aeecab390"} err="failed to get container status \"d39381f19adeea1e02567d822532d1f4100c7d623f4a76d9e45ef80aeecab390\": rpc error: code = NotFound desc = could not find container \"d39381f19adeea1e02567d822532d1f4100c7d623f4a76d9e45ef80aeecab390\": container with ID starting with d39381f19adeea1e02567d822532d1f4100c7d623f4a76d9e45ef80aeecab390 not found: ID does not exist" Dec 02 23:23:40 crc kubenswrapper[4903]: I1202 23:23:40.565750 4903 scope.go:117] "RemoveContainer" containerID="7cbe5f766816b1d2132c3d4ad60dfa0cdb197baf2d8e6edcf9c8128392cf2b9b" Dec 02 23:23:40 crc kubenswrapper[4903]: E1202 23:23:40.566115 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cbe5f766816b1d2132c3d4ad60dfa0cdb197baf2d8e6edcf9c8128392cf2b9b\": container with ID starting with 7cbe5f766816b1d2132c3d4ad60dfa0cdb197baf2d8e6edcf9c8128392cf2b9b not found: ID does not exist" containerID="7cbe5f766816b1d2132c3d4ad60dfa0cdb197baf2d8e6edcf9c8128392cf2b9b" Dec 02 23:23:40 crc kubenswrapper[4903]: I1202 23:23:40.566143 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cbe5f766816b1d2132c3d4ad60dfa0cdb197baf2d8e6edcf9c8128392cf2b9b"} err="failed to get container status \"7cbe5f766816b1d2132c3d4ad60dfa0cdb197baf2d8e6edcf9c8128392cf2b9b\": rpc error: code = NotFound desc = could not find container \"7cbe5f766816b1d2132c3d4ad60dfa0cdb197baf2d8e6edcf9c8128392cf2b9b\": container with ID starting with 7cbe5f766816b1d2132c3d4ad60dfa0cdb197baf2d8e6edcf9c8128392cf2b9b not found: ID does not exist" Dec 02 23:23:40 crc kubenswrapper[4903]: I1202 23:23:40.704529 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fclvs"] Dec 02 23:23:41 crc kubenswrapper[4903]: I1202 23:23:41.495825 4903 generic.go:334] "Generic (PLEG): container finished" podID="db16a343-d9d6-4ea3-8426-fedba8348792" containerID="15560292546df604e2c5733ae0d7a29ac6abe42a0124d42f97052ac2aa62aa92" exitCode=0 Dec 02 23:23:41 crc kubenswrapper[4903]: I1202 23:23:41.495912 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-fclvs" event={"ID":"db16a343-d9d6-4ea3-8426-fedba8348792","Type":"ContainerDied","Data":"15560292546df604e2c5733ae0d7a29ac6abe42a0124d42f97052ac2aa62aa92"} Dec 02 23:23:41 crc kubenswrapper[4903]: I1202 23:23:41.495944 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fclvs" event={"ID":"db16a343-d9d6-4ea3-8426-fedba8348792","Type":"ContainerStarted","Data":"a98696290e2f12296a38b951e0670bbc92fb5834f4abd6d0c9fe6be71df2bfb0"} Dec 02 23:23:41 crc kubenswrapper[4903]: I1202 23:23:41.499750 4903 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 23:23:41 crc kubenswrapper[4903]: I1202 23:23:41.631386 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aecf00f9-823d-4cb0-a4b8-4e62df713014" path="/var/lib/kubelet/pods/aecf00f9-823d-4cb0-a4b8-4e62df713014/volumes" Dec 02 23:23:42 crc kubenswrapper[4903]: I1202 23:23:42.512096 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fclvs" event={"ID":"db16a343-d9d6-4ea3-8426-fedba8348792","Type":"ContainerStarted","Data":"0645e57004247de2605c50e3123e03e33f3c915e9e600b070be91dd04bf1e876"} Dec 02 23:23:43 crc kubenswrapper[4903]: I1202 23:23:43.528805 4903 generic.go:334] "Generic (PLEG): container finished" podID="db16a343-d9d6-4ea3-8426-fedba8348792" containerID="0645e57004247de2605c50e3123e03e33f3c915e9e600b070be91dd04bf1e876" exitCode=0 Dec 02 23:23:43 crc kubenswrapper[4903]: I1202 23:23:43.528851 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fclvs" event={"ID":"db16a343-d9d6-4ea3-8426-fedba8348792","Type":"ContainerDied","Data":"0645e57004247de2605c50e3123e03e33f3c915e9e600b070be91dd04bf1e876"} Dec 02 23:23:44 crc kubenswrapper[4903]: I1202 23:23:44.541957 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fclvs" event={"ID":"db16a343-d9d6-4ea3-8426-fedba8348792","Type":"ContainerStarted","Data":"c3ac257bb636002800e4446ac107ae80e78ef2c0fc06d3adb590785aeca0ef73"} Dec 02 23:23:44 crc kubenswrapper[4903]: I1202 23:23:44.583699 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fclvs" podStartSLOduration=3.146795854 podStartE2EDuration="5.58364661s" podCreationTimestamp="2025-12-02 23:23:39 +0000 UTC" firstStartedPulling="2025-12-02 23:23:41.499456744 +0000 UTC m=+1560.208011047" lastFinishedPulling="2025-12-02 23:23:43.93630751 +0000 UTC m=+1562.644861803" observedRunningTime="2025-12-02 23:23:44.568261507 +0000 UTC m=+1563.276815810" watchObservedRunningTime="2025-12-02 23:23:44.58364661 +0000 UTC m=+1563.292200923" Dec 02 23:23:50 crc kubenswrapper[4903]: I1202 23:23:50.162609 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fclvs" Dec 02 23:23:50 crc kubenswrapper[4903]: I1202 23:23:50.163252 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fclvs" Dec 02 23:23:50 crc kubenswrapper[4903]: I1202 23:23:50.215827 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fclvs" Dec 02 23:23:50 crc kubenswrapper[4903]: I1202 23:23:50.612676 4903 scope.go:117] "RemoveContainer" containerID="eea089a1153012030d0c185f960d3bfe8298bb2908765149f2079ec873b323eb" Dec 02 
23:23:50 crc kubenswrapper[4903]: E1202 23:23:50.612970 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:23:50 crc kubenswrapper[4903]: I1202 23:23:50.678895 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fclvs" Dec 02 23:23:50 crc kubenswrapper[4903]: I1202 23:23:50.747732 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fclvs"] Dec 02 23:23:52 crc kubenswrapper[4903]: I1202 23:23:52.623974 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fclvs" podUID="db16a343-d9d6-4ea3-8426-fedba8348792" containerName="registry-server" containerID="cri-o://c3ac257bb636002800e4446ac107ae80e78ef2c0fc06d3adb590785aeca0ef73" gracePeriod=2 Dec 02 23:23:53 crc kubenswrapper[4903]: I1202 23:23:53.166982 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fclvs" Dec 02 23:23:53 crc kubenswrapper[4903]: I1202 23:23:53.229723 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db16a343-d9d6-4ea3-8426-fedba8348792-catalog-content\") pod \"db16a343-d9d6-4ea3-8426-fedba8348792\" (UID: \"db16a343-d9d6-4ea3-8426-fedba8348792\") " Dec 02 23:23:53 crc kubenswrapper[4903]: I1202 23:23:53.229815 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db16a343-d9d6-4ea3-8426-fedba8348792-utilities\") pod \"db16a343-d9d6-4ea3-8426-fedba8348792\" (UID: \"db16a343-d9d6-4ea3-8426-fedba8348792\") " Dec 02 23:23:53 crc kubenswrapper[4903]: I1202 23:23:53.229974 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54qfr\" (UniqueName: \"kubernetes.io/projected/db16a343-d9d6-4ea3-8426-fedba8348792-kube-api-access-54qfr\") pod \"db16a343-d9d6-4ea3-8426-fedba8348792\" (UID: \"db16a343-d9d6-4ea3-8426-fedba8348792\") " Dec 02 23:23:53 crc kubenswrapper[4903]: I1202 23:23:53.230803 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db16a343-d9d6-4ea3-8426-fedba8348792-utilities" (OuterVolumeSpecName: "utilities") pod "db16a343-d9d6-4ea3-8426-fedba8348792" (UID: "db16a343-d9d6-4ea3-8426-fedba8348792"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:23:53 crc kubenswrapper[4903]: I1202 23:23:53.231045 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db16a343-d9d6-4ea3-8426-fedba8348792-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:23:53 crc kubenswrapper[4903]: I1202 23:23:53.235471 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db16a343-d9d6-4ea3-8426-fedba8348792-kube-api-access-54qfr" (OuterVolumeSpecName: "kube-api-access-54qfr") pod "db16a343-d9d6-4ea3-8426-fedba8348792" (UID: "db16a343-d9d6-4ea3-8426-fedba8348792"). 
InnerVolumeSpecName "kube-api-access-54qfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:23:53 crc kubenswrapper[4903]: I1202 23:23:53.281175 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db16a343-d9d6-4ea3-8426-fedba8348792-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "db16a343-d9d6-4ea3-8426-fedba8348792" (UID: "db16a343-d9d6-4ea3-8426-fedba8348792"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:23:53 crc kubenswrapper[4903]: I1202 23:23:53.333987 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54qfr\" (UniqueName: \"kubernetes.io/projected/db16a343-d9d6-4ea3-8426-fedba8348792-kube-api-access-54qfr\") on node \"crc\" DevicePath \"\"" Dec 02 23:23:53 crc kubenswrapper[4903]: I1202 23:23:53.334226 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db16a343-d9d6-4ea3-8426-fedba8348792-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:23:53 crc kubenswrapper[4903]: I1202 23:23:53.635562 4903 generic.go:334] "Generic (PLEG): container finished" podID="db16a343-d9d6-4ea3-8426-fedba8348792" containerID="c3ac257bb636002800e4446ac107ae80e78ef2c0fc06d3adb590785aeca0ef73" exitCode=0 Dec 02 23:23:53 crc kubenswrapper[4903]: I1202 23:23:53.635621 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fclvs" event={"ID":"db16a343-d9d6-4ea3-8426-fedba8348792","Type":"ContainerDied","Data":"c3ac257bb636002800e4446ac107ae80e78ef2c0fc06d3adb590785aeca0ef73"} Dec 02 23:23:53 crc kubenswrapper[4903]: I1202 23:23:53.635635 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fclvs" Dec 02 23:23:53 crc kubenswrapper[4903]: I1202 23:23:53.636583 4903 scope.go:117] "RemoveContainer" containerID="c3ac257bb636002800e4446ac107ae80e78ef2c0fc06d3adb590785aeca0ef73" Dec 02 23:23:53 crc kubenswrapper[4903]: I1202 23:23:53.636507 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fclvs" event={"ID":"db16a343-d9d6-4ea3-8426-fedba8348792","Type":"ContainerDied","Data":"a98696290e2f12296a38b951e0670bbc92fb5834f4abd6d0c9fe6be71df2bfb0"} Dec 02 23:23:53 crc kubenswrapper[4903]: I1202 23:23:53.668987 4903 scope.go:117] "RemoveContainer" containerID="0645e57004247de2605c50e3123e03e33f3c915e9e600b070be91dd04bf1e876" Dec 02 23:23:53 crc kubenswrapper[4903]: I1202 23:23:53.675846 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fclvs"] Dec 02 23:23:53 crc kubenswrapper[4903]: I1202 23:23:53.687283 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fclvs"] Dec 02 23:23:53 crc kubenswrapper[4903]: I1202 23:23:53.716623 4903 scope.go:117] "RemoveContainer" containerID="15560292546df604e2c5733ae0d7a29ac6abe42a0124d42f97052ac2aa62aa92" Dec 02 23:23:53 crc kubenswrapper[4903]: I1202 23:23:53.749843 4903 scope.go:117] "RemoveContainer" containerID="c3ac257bb636002800e4446ac107ae80e78ef2c0fc06d3adb590785aeca0ef73" Dec 02 23:23:53 crc kubenswrapper[4903]: E1202 23:23:53.750309 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3ac257bb636002800e4446ac107ae80e78ef2c0fc06d3adb590785aeca0ef73\": container with ID starting with c3ac257bb636002800e4446ac107ae80e78ef2c0fc06d3adb590785aeca0ef73 not found: ID does not exist" containerID="c3ac257bb636002800e4446ac107ae80e78ef2c0fc06d3adb590785aeca0ef73" Dec 02 23:23:53 crc kubenswrapper[4903]: I1202 23:23:53.750359 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3ac257bb636002800e4446ac107ae80e78ef2c0fc06d3adb590785aeca0ef73"} err="failed to get container status \"c3ac257bb636002800e4446ac107ae80e78ef2c0fc06d3adb590785aeca0ef73\": rpc error: code = NotFound desc = could not find container \"c3ac257bb636002800e4446ac107ae80e78ef2c0fc06d3adb590785aeca0ef73\": container with ID starting with c3ac257bb636002800e4446ac107ae80e78ef2c0fc06d3adb590785aeca0ef73 not found: ID does not exist" Dec 02 23:23:53 crc kubenswrapper[4903]: I1202 23:23:53.750393 4903 scope.go:117] "RemoveContainer" containerID="0645e57004247de2605c50e3123e03e33f3c915e9e600b070be91dd04bf1e876" Dec 02 23:23:53 crc kubenswrapper[4903]: E1202 23:23:53.750955 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0645e57004247de2605c50e3123e03e33f3c915e9e600b070be91dd04bf1e876\": container with ID starting with 0645e57004247de2605c50e3123e03e33f3c915e9e600b070be91dd04bf1e876 not found: ID does not exist" containerID="0645e57004247de2605c50e3123e03e33f3c915e9e600b070be91dd04bf1e876" Dec 02 23:23:53 crc kubenswrapper[4903]: I1202 23:23:53.750990 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0645e57004247de2605c50e3123e03e33f3c915e9e600b070be91dd04bf1e876"} err="failed to get container status \"0645e57004247de2605c50e3123e03e33f3c915e9e600b070be91dd04bf1e876\": rpc error: code = NotFound desc = could not find 
container \"0645e57004247de2605c50e3123e03e33f3c915e9e600b070be91dd04bf1e876\": container with ID starting with 0645e57004247de2605c50e3123e03e33f3c915e9e600b070be91dd04bf1e876 not found: ID does not exist" Dec 02 23:23:53 crc kubenswrapper[4903]: I1202 23:23:53.751013 4903 scope.go:117] "RemoveContainer" containerID="15560292546df604e2c5733ae0d7a29ac6abe42a0124d42f97052ac2aa62aa92" Dec 02 23:23:53 crc kubenswrapper[4903]: E1202 23:23:53.751398 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15560292546df604e2c5733ae0d7a29ac6abe42a0124d42f97052ac2aa62aa92\": container with ID starting with 15560292546df604e2c5733ae0d7a29ac6abe42a0124d42f97052ac2aa62aa92 not found: ID does not exist" containerID="15560292546df604e2c5733ae0d7a29ac6abe42a0124d42f97052ac2aa62aa92" Dec 02 23:23:53 crc kubenswrapper[4903]: I1202 23:23:53.751435 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15560292546df604e2c5733ae0d7a29ac6abe42a0124d42f97052ac2aa62aa92"} err="failed to get container status \"15560292546df604e2c5733ae0d7a29ac6abe42a0124d42f97052ac2aa62aa92\": rpc error: code = NotFound desc = could not find container \"15560292546df604e2c5733ae0d7a29ac6abe42a0124d42f97052ac2aa62aa92\": container with ID starting with 15560292546df604e2c5733ae0d7a29ac6abe42a0124d42f97052ac2aa62aa92 not found: ID does not exist" Dec 02 23:23:55 crc kubenswrapper[4903]: I1202 23:23:55.633253 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db16a343-d9d6-4ea3-8426-fedba8348792" path="/var/lib/kubelet/pods/db16a343-d9d6-4ea3-8426-fedba8348792/volumes" Dec 02 23:24:04 crc kubenswrapper[4903]: I1202 23:24:04.612779 4903 scope.go:117] "RemoveContainer" containerID="eea089a1153012030d0c185f960d3bfe8298bb2908765149f2079ec873b323eb" Dec 02 23:24:04 crc kubenswrapper[4903]: E1202 23:24:04.613710 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:24:17 crc kubenswrapper[4903]: I1202 23:24:17.612417 4903 scope.go:117] "RemoveContainer" containerID="eea089a1153012030d0c185f960d3bfe8298bb2908765149f2079ec873b323eb" Dec 02 23:24:17 crc kubenswrapper[4903]: E1202 23:24:17.613191 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:24:31 crc kubenswrapper[4903]: I1202 23:24:31.837000 4903 scope.go:117] "RemoveContainer" containerID="eea089a1153012030d0c185f960d3bfe8298bb2908765149f2079ec873b323eb" Dec 02 23:24:31 crc kubenswrapper[4903]: E1202 23:24:31.839371 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:24:43 crc kubenswrapper[4903]: I1202 23:24:43.613250 4903 scope.go:117] "RemoveContainer" containerID="eea089a1153012030d0c185f960d3bfe8298bb2908765149f2079ec873b323eb" Dec 02 23:24:43 crc kubenswrapper[4903]: E1202 23:24:43.615486 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:24:55 crc kubenswrapper[4903]: I1202 23:24:55.612848 4903 scope.go:117] "RemoveContainer" containerID="eea089a1153012030d0c185f960d3bfe8298bb2908765149f2079ec873b323eb" Dec 02 23:24:55 crc kubenswrapper[4903]: E1202 23:24:55.613729 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:25:08 crc kubenswrapper[4903]: I1202 23:25:08.612343 4903 scope.go:117] "RemoveContainer" containerID="eea089a1153012030d0c185f960d3bfe8298bb2908765149f2079ec873b323eb" Dec 02 23:25:08 crc kubenswrapper[4903]: E1202 23:25:08.613149 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:25:18 crc kubenswrapper[4903]: I1202 23:25:18.628106 4903 generic.go:334] "Generic (PLEG): container finished" podID="270d4936-772f-40a2-8da3-f2651a216d6b" containerID="fd8e3fb1fa6bdbe59d5e9f9e25f94dd03d2225972b359063bda6b716051b51a6" exitCode=0 Dec 02 23:25:18 crc kubenswrapper[4903]: I1202 23:25:18.628345 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lhvgc" event={"ID":"270d4936-772f-40a2-8da3-f2651a216d6b","Type":"ContainerDied","Data":"fd8e3fb1fa6bdbe59d5e9f9e25f94dd03d2225972b359063bda6b716051b51a6"} Dec 02 23:25:20 crc kubenswrapper[4903]: I1202 23:25:20.138113 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lhvgc" Dec 02 23:25:20 crc kubenswrapper[4903]: I1202 23:25:20.324161 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kzl5\" (UniqueName: \"kubernetes.io/projected/270d4936-772f-40a2-8da3-f2651a216d6b-kube-api-access-6kzl5\") pod \"270d4936-772f-40a2-8da3-f2651a216d6b\" (UID: \"270d4936-772f-40a2-8da3-f2651a216d6b\") " Dec 02 23:25:20 crc kubenswrapper[4903]: I1202 23:25:20.325027 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/270d4936-772f-40a2-8da3-f2651a216d6b-bootstrap-combined-ca-bundle\") pod \"270d4936-772f-40a2-8da3-f2651a216d6b\" (UID: \"270d4936-772f-40a2-8da3-f2651a216d6b\") " Dec 02 23:25:20 crc kubenswrapper[4903]: I1202 23:25:20.325206 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/270d4936-772f-40a2-8da3-f2651a216d6b-inventory\") pod \"270d4936-772f-40a2-8da3-f2651a216d6b\" (UID: \"270d4936-772f-40a2-8da3-f2651a216d6b\") " Dec 02 23:25:20 crc kubenswrapper[4903]: I1202 23:25:20.325264 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/270d4936-772f-40a2-8da3-f2651a216d6b-ssh-key\") pod \"270d4936-772f-40a2-8da3-f2651a216d6b\" (UID: \"270d4936-772f-40a2-8da3-f2651a216d6b\") " Dec 02 23:25:20 crc kubenswrapper[4903]: I1202 23:25:20.332686 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/270d4936-772f-40a2-8da3-f2651a216d6b-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "270d4936-772f-40a2-8da3-f2651a216d6b" (UID: "270d4936-772f-40a2-8da3-f2651a216d6b"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:25:20 crc kubenswrapper[4903]: I1202 23:25:20.336100 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/270d4936-772f-40a2-8da3-f2651a216d6b-kube-api-access-6kzl5" (OuterVolumeSpecName: "kube-api-access-6kzl5") pod "270d4936-772f-40a2-8da3-f2651a216d6b" (UID: "270d4936-772f-40a2-8da3-f2651a216d6b"). InnerVolumeSpecName "kube-api-access-6kzl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:25:20 crc kubenswrapper[4903]: I1202 23:25:20.364132 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/270d4936-772f-40a2-8da3-f2651a216d6b-inventory" (OuterVolumeSpecName: "inventory") pod "270d4936-772f-40a2-8da3-f2651a216d6b" (UID: "270d4936-772f-40a2-8da3-f2651a216d6b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:25:20 crc kubenswrapper[4903]: I1202 23:25:20.365769 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/270d4936-772f-40a2-8da3-f2651a216d6b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "270d4936-772f-40a2-8da3-f2651a216d6b" (UID: "270d4936-772f-40a2-8da3-f2651a216d6b"). InnerVolumeSpecName "ssh-key". 
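Each UnmountVolume started, UnmountVolume.TearDown succeeded, "Volume detached" run in this stretch is one pass of the kubelet's volume reconciler: the deleted pod has dropped out of the desired state, the actual state still records the mounts, and the reconciler works off the difference, retrying on the next pass if a teardown fails. A compressed sketch of that desired-versus-actual loop, assuming toy in-memory state rather than the real operation executor:

    package main

    import "fmt"

    type volumeKey struct{ podUID, volume string }

    // reconcileUnmounts tears down every mounted volume the desired state no
    // longer wants, then reports it detached, mirroring the UnmountVolume ->
    // TearDown -> "Volume detached" sequence in the log.
    func reconcileUnmounts(desired, actual map[volumeKey]bool, tearDown func(volumeKey) error) {
        for key := range actual {
            if desired[key] {
                continue // still wanted; leave mounted
            }
            fmt.Printf("operationExecutor.UnmountVolume started for %v\n", key)
            if err := tearDown(key); err != nil {
                fmt.Printf("UnmountVolume.TearDown failed for %v: %v (will retry)\n", key, err)
                continue // stays in actual state; next pass retries
            }
            delete(actual, key)
            fmt.Printf("Volume detached for %v\n", key)
        }
    }

    func main() {
        pod := "270d4936-772f-40a2-8da3-f2651a216d6b"
        actual := map[volumeKey]bool{
            {pod, "ssh-key"}:   true,
            {pod, "inventory"}: true,
        }
        // Desired state is empty: the pod was deleted from the API server.
        reconcileUnmounts(map[volumeKey]bool{}, actual, func(volumeKey) error { return nil })
    }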
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:25:20 crc kubenswrapper[4903]: I1202 23:25:20.430425 4903 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/270d4936-772f-40a2-8da3-f2651a216d6b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 23:25:20 crc kubenswrapper[4903]: I1202 23:25:20.430466 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kzl5\" (UniqueName: \"kubernetes.io/projected/270d4936-772f-40a2-8da3-f2651a216d6b-kube-api-access-6kzl5\") on node \"crc\" DevicePath \"\"" Dec 02 23:25:20 crc kubenswrapper[4903]: I1202 23:25:20.430479 4903 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/270d4936-772f-40a2-8da3-f2651a216d6b-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:25:20 crc kubenswrapper[4903]: I1202 23:25:20.430488 4903 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/270d4936-772f-40a2-8da3-f2651a216d6b-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 23:25:20 crc kubenswrapper[4903]: I1202 23:25:20.651409 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lhvgc" event={"ID":"270d4936-772f-40a2-8da3-f2651a216d6b","Type":"ContainerDied","Data":"fae1fdf90f254e1240150b87d6a53b194cf2c9c1356da85d1f3699a85976ed6d"} Dec 02 23:25:20 crc kubenswrapper[4903]: I1202 23:25:20.651456 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fae1fdf90f254e1240150b87d6a53b194cf2c9c1356da85d1f3699a85976ed6d" Dec 02 23:25:20 crc kubenswrapper[4903]: I1202 23:25:20.651483 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-lhvgc" Dec 02 23:25:20 crc kubenswrapper[4903]: I1202 23:25:20.786419 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2qpf7"] Dec 02 23:25:20 crc kubenswrapper[4903]: E1202 23:25:20.804240 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="270d4936-772f-40a2-8da3-f2651a216d6b" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 02 23:25:20 crc kubenswrapper[4903]: I1202 23:25:20.804291 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="270d4936-772f-40a2-8da3-f2651a216d6b" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 02 23:25:20 crc kubenswrapper[4903]: E1202 23:25:20.804337 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aecf00f9-823d-4cb0-a4b8-4e62df713014" containerName="extract-utilities" Dec 02 23:25:20 crc kubenswrapper[4903]: I1202 23:25:20.804348 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="aecf00f9-823d-4cb0-a4b8-4e62df713014" containerName="extract-utilities" Dec 02 23:25:20 crc kubenswrapper[4903]: E1202 23:25:20.804365 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aecf00f9-823d-4cb0-a4b8-4e62df713014" containerName="extract-content" Dec 02 23:25:20 crc kubenswrapper[4903]: I1202 23:25:20.804374 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="aecf00f9-823d-4cb0-a4b8-4e62df713014" containerName="extract-content" Dec 02 23:25:20 crc kubenswrapper[4903]: E1202 23:25:20.804387 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aecf00f9-823d-4cb0-a4b8-4e62df713014" containerName="registry-server" Dec 02 23:25:20 crc kubenswrapper[4903]: I1202 23:25:20.804396 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="aecf00f9-823d-4cb0-a4b8-4e62df713014" containerName="registry-server" Dec 02 23:25:20 crc kubenswrapper[4903]: E1202 23:25:20.804415 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db16a343-d9d6-4ea3-8426-fedba8348792" containerName="extract-content" Dec 02 23:25:20 crc kubenswrapper[4903]: I1202 23:25:20.804425 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="db16a343-d9d6-4ea3-8426-fedba8348792" containerName="extract-content" Dec 02 23:25:20 crc kubenswrapper[4903]: E1202 23:25:20.804447 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db16a343-d9d6-4ea3-8426-fedba8348792" containerName="registry-server" Dec 02 23:25:20 crc kubenswrapper[4903]: I1202 23:25:20.804455 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="db16a343-d9d6-4ea3-8426-fedba8348792" containerName="registry-server" Dec 02 23:25:20 crc kubenswrapper[4903]: E1202 23:25:20.804488 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db16a343-d9d6-4ea3-8426-fedba8348792" containerName="extract-utilities" Dec 02 23:25:20 crc kubenswrapper[4903]: I1202 23:25:20.804499 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="db16a343-d9d6-4ea3-8426-fedba8348792" containerName="extract-utilities" Dec 02 23:25:20 crc kubenswrapper[4903]: I1202 23:25:20.805309 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="aecf00f9-823d-4cb0-a4b8-4e62df713014" containerName="registry-server" Dec 02 23:25:20 crc kubenswrapper[4903]: I1202 23:25:20.805351 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="270d4936-772f-40a2-8da3-f2651a216d6b" 
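The cpu_manager/memory_manager RemoveStaleState burst above fires when a new pod (here the download-cache job) is admitted: each resource manager prunes checkpointed per-container assignments whose pod UID is no longer active, and the E-prefixed lines are loud logging of each removal rather than failures. The underlying pattern is a map prune keyed by pod UID, sketched here with stand-in types:

    package main

    import "fmt"

    type containerRef struct{ podUID, container string }

    // removeStaleState drops resource-state entries for containers whose pod
    // is no longer known to the kubelet, as cpu_manager and memory_manager do
    // before admitting a new pod.
    func removeStaleState(state map[containerRef]string, activePods map[string]bool) {
        for ref := range state {
            if activePods[ref.podUID] {
                continue
            }
            fmt.Printf("RemoveStaleState: removing container %v\n", ref)
            delete(state, ref) // "Deleted CPUSet assignment"
        }
    }

    func main() {
        state := map[containerRef]string{
            {"db16a343-d9d6-4ea3-8426-fedba8348792", "registry-server"}: "cpus 2-3",
            {"8dff062c-2479-4ea6-994e-fea352cdf518", "download-cache"}:  "cpus 0-1",
        }
        // Only the download-cache pod is still active.
        removeStaleState(state, map[string]bool{"8dff062c-2479-4ea6-994e-fea352cdf518": true})
        fmt.Println(len(state), "entries remain")
    }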
containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 02 23:25:20 crc kubenswrapper[4903]: I1202 23:25:20.805379 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="db16a343-d9d6-4ea3-8426-fedba8348792" containerName="registry-server" Dec 02 23:25:20 crc kubenswrapper[4903]: I1202 23:25:20.814708 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2qpf7" Dec 02 23:25:20 crc kubenswrapper[4903]: I1202 23:25:20.818679 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 23:25:20 crc kubenswrapper[4903]: I1202 23:25:20.819062 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 23:25:20 crc kubenswrapper[4903]: I1202 23:25:20.819102 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 23:25:20 crc kubenswrapper[4903]: I1202 23:25:20.819213 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9z6rx" Dec 02 23:25:20 crc kubenswrapper[4903]: I1202 23:25:20.838274 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2qpf7"] Dec 02 23:25:20 crc kubenswrapper[4903]: I1202 23:25:20.939732 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrjh5\" (UniqueName: \"kubernetes.io/projected/8dff062c-2479-4ea6-994e-fea352cdf518-kube-api-access-vrjh5\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2qpf7\" (UID: \"8dff062c-2479-4ea6-994e-fea352cdf518\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2qpf7" Dec 02 23:25:20 crc kubenswrapper[4903]: I1202 23:25:20.939940 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8dff062c-2479-4ea6-994e-fea352cdf518-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2qpf7\" (UID: \"8dff062c-2479-4ea6-994e-fea352cdf518\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2qpf7" Dec 02 23:25:20 crc kubenswrapper[4903]: I1202 23:25:20.940348 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8dff062c-2479-4ea6-994e-fea352cdf518-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2qpf7\" (UID: \"8dff062c-2479-4ea6-994e-fea352cdf518\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2qpf7" Dec 02 23:25:21 crc kubenswrapper[4903]: I1202 23:25:21.041706 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8dff062c-2479-4ea6-994e-fea352cdf518-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2qpf7\" (UID: \"8dff062c-2479-4ea6-994e-fea352cdf518\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2qpf7" Dec 02 23:25:21 crc kubenswrapper[4903]: I1202 23:25:21.041774 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrjh5\" (UniqueName: \"kubernetes.io/projected/8dff062c-2479-4ea6-994e-fea352cdf518-kube-api-access-vrjh5\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2qpf7\" (UID: 
\"8dff062c-2479-4ea6-994e-fea352cdf518\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2qpf7" Dec 02 23:25:21 crc kubenswrapper[4903]: I1202 23:25:21.041834 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8dff062c-2479-4ea6-994e-fea352cdf518-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2qpf7\" (UID: \"8dff062c-2479-4ea6-994e-fea352cdf518\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2qpf7" Dec 02 23:25:21 crc kubenswrapper[4903]: I1202 23:25:21.045902 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8dff062c-2479-4ea6-994e-fea352cdf518-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2qpf7\" (UID: \"8dff062c-2479-4ea6-994e-fea352cdf518\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2qpf7" Dec 02 23:25:21 crc kubenswrapper[4903]: I1202 23:25:21.049522 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8dff062c-2479-4ea6-994e-fea352cdf518-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2qpf7\" (UID: \"8dff062c-2479-4ea6-994e-fea352cdf518\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2qpf7" Dec 02 23:25:21 crc kubenswrapper[4903]: I1202 23:25:21.058379 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrjh5\" (UniqueName: \"kubernetes.io/projected/8dff062c-2479-4ea6-994e-fea352cdf518-kube-api-access-vrjh5\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2qpf7\" (UID: \"8dff062c-2479-4ea6-994e-fea352cdf518\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2qpf7" Dec 02 23:25:21 crc kubenswrapper[4903]: I1202 23:25:21.141381 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2qpf7" Dec 02 23:25:21 crc kubenswrapper[4903]: I1202 23:25:21.623992 4903 scope.go:117] "RemoveContainer" containerID="eea089a1153012030d0c185f960d3bfe8298bb2908765149f2079ec873b323eb" Dec 02 23:25:21 crc kubenswrapper[4903]: E1202 23:25:21.624328 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:25:21 crc kubenswrapper[4903]: I1202 23:25:21.752642 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2qpf7"] Dec 02 23:25:22 crc kubenswrapper[4903]: I1202 23:25:22.677750 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2qpf7" event={"ID":"8dff062c-2479-4ea6-994e-fea352cdf518","Type":"ContainerStarted","Data":"1844919c1da03475a8d60c160e902be69f720407730c0ae05c1fa811423508f9"} Dec 02 23:25:22 crc kubenswrapper[4903]: I1202 23:25:22.678118 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2qpf7" event={"ID":"8dff062c-2479-4ea6-994e-fea352cdf518","Type":"ContainerStarted","Data":"784b7f025507517e768c0f8df346ce6644e759cab1536605912fea93480df2d7"} Dec 02 23:25:22 crc kubenswrapper[4903]: I1202 23:25:22.703641 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2qpf7" podStartSLOduration=2.222712296 podStartE2EDuration="2.703622284s" podCreationTimestamp="2025-12-02 23:25:20 +0000 UTC" firstStartedPulling="2025-12-02 23:25:21.748186304 +0000 UTC m=+1660.456740587" lastFinishedPulling="2025-12-02 23:25:22.229096282 +0000 UTC m=+1660.937650575" observedRunningTime="2025-12-02 23:25:22.697523147 +0000 UTC m=+1661.406077450" watchObservedRunningTime="2025-12-02 23:25:22.703622284 +0000 UTC m=+1661.412176567" Dec 02 23:25:25 crc kubenswrapper[4903]: I1202 23:25:25.068191 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-create-mck9h"] Dec 02 23:25:25 crc kubenswrapper[4903]: I1202 23:25:25.078308 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-6742-account-create-update-8snbz"] Dec 02 23:25:25 crc kubenswrapper[4903]: I1202 23:25:25.087584 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-create-mck9h"] Dec 02 23:25:25 crc kubenswrapper[4903]: I1202 23:25:25.095643 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-6742-account-create-update-8snbz"] Dec 02 23:25:25 crc kubenswrapper[4903]: I1202 23:25:25.634272 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="435a1343-d272-407f-9329-a6d1f481a22a" path="/var/lib/kubelet/pods/435a1343-d272-407f-9329-a6d1f481a22a/volumes" Dec 02 23:25:25 crc kubenswrapper[4903]: I1202 23:25:25.636788 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebba9302-4b1e-4073-83f4-505b43e2309c" path="/var/lib/kubelet/pods/ebba9302-4b1e-4073-83f4-505b43e2309c/volumes" Dec 02 23:25:29 crc kubenswrapper[4903]: I1202 23:25:29.036336 4903 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-e73c-account-create-update-twcq7"] Dec 02 23:25:29 crc kubenswrapper[4903]: I1202 23:25:29.050451 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-e73c-account-create-update-twcq7"] Dec 02 23:25:29 crc kubenswrapper[4903]: I1202 23:25:29.631365 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02d56e27-883f-492d-bb5a-ddf83ea2c78e" path="/var/lib/kubelet/pods/02d56e27-883f-492d-bb5a-ddf83ea2c78e/volumes" Dec 02 23:25:30 crc kubenswrapper[4903]: I1202 23:25:30.049473 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-843c-account-create-update-kmlgq"] Dec 02 23:25:30 crc kubenswrapper[4903]: I1202 23:25:30.068858 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-mzmcv"] Dec 02 23:25:30 crc kubenswrapper[4903]: I1202 23:25:30.079743 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-843c-account-create-update-kmlgq"] Dec 02 23:25:30 crc kubenswrapper[4903]: I1202 23:25:30.088039 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-mzmcv"] Dec 02 23:25:30 crc kubenswrapper[4903]: I1202 23:25:30.097199 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-s22jn"] Dec 02 23:25:30 crc kubenswrapper[4903]: I1202 23:25:30.105780 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-s22jn"] Dec 02 23:25:31 crc kubenswrapper[4903]: I1202 23:25:31.638458 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="255fbd82-365f-4670-81d4-c173abf6c67b" path="/var/lib/kubelet/pods/255fbd82-365f-4670-81d4-c173abf6c67b/volumes" Dec 02 23:25:31 crc kubenswrapper[4903]: I1202 23:25:31.639745 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3afc5a61-8a6e-46a3-b593-7b26bcfa855e" path="/var/lib/kubelet/pods/3afc5a61-8a6e-46a3-b593-7b26bcfa855e/volumes" Dec 02 23:25:31 crc kubenswrapper[4903]: I1202 23:25:31.640896 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8653f5e9-f817-4962-a725-5acc5a161f29" path="/var/lib/kubelet/pods/8653f5e9-f817-4962-a725-5acc5a161f29/volumes" Dec 02 23:25:32 crc kubenswrapper[4903]: I1202 23:25:32.613339 4903 scope.go:117] "RemoveContainer" containerID="eea089a1153012030d0c185f960d3bfe8298bb2908765149f2079ec873b323eb" Dec 02 23:25:32 crc kubenswrapper[4903]: E1202 23:25:32.613629 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:25:45 crc kubenswrapper[4903]: I1202 23:25:45.613281 4903 scope.go:117] "RemoveContainer" containerID="eea089a1153012030d0c185f960d3bfe8298bb2908765149f2079ec873b323eb" Dec 02 23:25:45 crc kubenswrapper[4903]: E1202 23:25:45.614801 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" 
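The machine-config-daemon entries recurring through this window every ten-odd seconds are not repeated restart attempts: each is the sync loop re-evaluating the pod and skipping it because the container's CrashLoopBackOff delay has already grown to the cap quoted in the error ("back-off 5m0s"). The schedule itself is capped doubling; the sketch below assumes the conventional 10s initial delay (the 5m cap is the one visible in the log):

    package main

    import (
        "fmt"
        "time"
    )

    // nextBackoff doubles the restart delay up to a cap, the shape of the
    // kubelet's CrashLoopBackOff schedule ("back-off 5m0s" once the cap is hit).
    func nextBackoff(cur, limit time.Duration) time.Duration {
        if cur == 0 {
            return 10 * time.Second // assumed initial delay
        }
        if cur*2 > limit {
            return limit
        }
        return cur * 2
    }

    func main() {
        const limit = 5 * time.Minute
        d := time.Duration(0)
        for i := 1; i <= 8; i++ {
            d = nextBackoff(d, limit)
            fmt.Printf("restart %d: wait %v\n", i, d) // 10s 20s 40s ... 5m0s 5m0s
        }
    }

After five failed restarts the delay saturates at the cap, which is why the same "back-off 5m0s" message repeats unchanged for as long as the container keeps crashing.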
pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:25:57 crc kubenswrapper[4903]: I1202 23:25:57.614454 4903 scope.go:117] "RemoveContainer" containerID="eea089a1153012030d0c185f960d3bfe8298bb2908765149f2079ec873b323eb" Dec 02 23:25:57 crc kubenswrapper[4903]: E1202 23:25:57.617210 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:26:01 crc kubenswrapper[4903]: I1202 23:26:01.056799 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-c12e-account-create-update-jtpzh"] Dec 02 23:26:01 crc kubenswrapper[4903]: I1202 23:26:01.071979 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-c12e-account-create-update-jtpzh"] Dec 02 23:26:01 crc kubenswrapper[4903]: I1202 23:26:01.646606 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b051f79-dd17-4446-8316-8de5216d958f" path="/var/lib/kubelet/pods/1b051f79-dd17-4446-8316-8de5216d958f/volumes" Dec 02 23:26:01 crc kubenswrapper[4903]: I1202 23:26:01.934999 4903 scope.go:117] "RemoveContainer" containerID="1c45e13c204868184c4147582782723d0539383f7e1969913b4e4ae26849f05f" Dec 02 23:26:01 crc kubenswrapper[4903]: I1202 23:26:01.969667 4903 scope.go:117] "RemoveContainer" containerID="c21ada723047edb2d29285b3ab0e52418e1b17d40ed45803ee1d7bea6945eaa9" Dec 02 23:26:02 crc kubenswrapper[4903]: I1202 23:26:02.001025 4903 scope.go:117] "RemoveContainer" containerID="39799a47c43f342a80ba54c49df78b1316f477952a28bcb11161c5ecb996c541" Dec 02 23:26:02 crc kubenswrapper[4903]: I1202 23:26:02.034210 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-5jpq8"] Dec 02 23:26:02 crc kubenswrapper[4903]: I1202 23:26:02.050927 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-5jpq8"] Dec 02 23:26:02 crc kubenswrapper[4903]: I1202 23:26:02.055746 4903 scope.go:117] "RemoveContainer" containerID="f0be7d97cf17faf4250f3dac4cf157bafcbe8291fe459bbb3e3629a37c1f3b80" Dec 02 23:26:02 crc kubenswrapper[4903]: I1202 23:26:02.075885 4903 scope.go:117] "RemoveContainer" containerID="bc102bce8b7e6e0204c98f8462ed2631b352ac6bbfab1b9e754bcf308c08885f" Dec 02 23:26:02 crc kubenswrapper[4903]: I1202 23:26:02.122833 4903 scope.go:117] "RemoveContainer" containerID="f0f79efabf05b2c52cf1849f53aedfa9f23a87ddab1faf71a8b1e847c3b42862" Dec 02 23:26:02 crc kubenswrapper[4903]: I1202 23:26:02.181716 4903 scope.go:117] "RemoveContainer" containerID="e1ca970452c8cabb1d309d39879d7d7b5e54a2c7bdb1068231e23bf1771b321e" Dec 02 23:26:02 crc kubenswrapper[4903]: I1202 23:26:02.231056 4903 scope.go:117] "RemoveContainer" containerID="2c417e13ece4ddd066f0bbe8eacb021511a8d8f6d51ba126f6f651eb2dbdb85d" Dec 02 23:26:02 crc kubenswrapper[4903]: I1202 23:26:02.262086 4903 scope.go:117] "RemoveContainer" containerID="dbaaa350be87bf9a255c33156d8f6c761507fdc3f77a2883b6a3beb0d68c9e47" Dec 02 23:26:02 crc kubenswrapper[4903]: I1202 23:26:02.303922 4903 scope.go:117] "RemoveContainer" containerID="d2545e57af6266debf47fc4fcd1f2dae0a1d473c62ce84334fc9d25f7dd4a4c7" Dec 02 23:26:02 crc kubenswrapper[4903]: I1202 
23:26:02.329679 4903 scope.go:117] "RemoveContainer" containerID="136b4132531bbed69fd27f195113f8c40a08026447ce5fc23cbfa6d7cad84e03" Dec 02 23:26:03 crc kubenswrapper[4903]: I1202 23:26:03.626343 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9bc1f3f-1d84-4d5f-9e8f-bfec26bcd965" path="/var/lib/kubelet/pods/d9bc1f3f-1d84-4d5f-9e8f-bfec26bcd965/volumes" Dec 02 23:26:05 crc kubenswrapper[4903]: I1202 23:26:05.058318 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-mhrvb"] Dec 02 23:26:05 crc kubenswrapper[4903]: I1202 23:26:05.073632 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-9c87-account-create-update-n69dm"] Dec 02 23:26:05 crc kubenswrapper[4903]: I1202 23:26:05.089636 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-gxh6v"] Dec 02 23:26:05 crc kubenswrapper[4903]: I1202 23:26:05.101157 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-mhrvb"] Dec 02 23:26:05 crc kubenswrapper[4903]: I1202 23:26:05.110937 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-9c87-account-create-update-n69dm"] Dec 02 23:26:05 crc kubenswrapper[4903]: I1202 23:26:05.119786 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-gxh6v"] Dec 02 23:26:05 crc kubenswrapper[4903]: I1202 23:26:05.629937 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15f1bc13-509f-4bcb-85bd-3af265b8ef01" path="/var/lib/kubelet/pods/15f1bc13-509f-4bcb-85bd-3af265b8ef01/volumes" Dec 02 23:26:05 crc kubenswrapper[4903]: I1202 23:26:05.632489 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="769c6602-40ba-4f02-8f65-47ea4be08be4" path="/var/lib/kubelet/pods/769c6602-40ba-4f02-8f65-47ea4be08be4/volumes" Dec 02 23:26:05 crc kubenswrapper[4903]: I1202 23:26:05.634141 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b405c566-af8d-4196-ad1a-5a0dcc450e81" path="/var/lib/kubelet/pods/b405c566-af8d-4196-ad1a-5a0dcc450e81/volumes" Dec 02 23:26:06 crc kubenswrapper[4903]: I1202 23:26:06.031211 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-4eb0-account-create-update-rhh8z"] Dec 02 23:26:06 crc kubenswrapper[4903]: I1202 23:26:06.041099 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-4eb0-account-create-update-rhh8z"] Dec 02 23:26:07 crc kubenswrapper[4903]: I1202 23:26:07.624454 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9158c65-e3fe-4db4-9329-790edac952f1" path="/var/lib/kubelet/pods/b9158c65-e3fe-4db4-9329-790edac952f1/volumes" Dec 02 23:26:08 crc kubenswrapper[4903]: I1202 23:26:08.036465 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-trbs9"] Dec 02 23:26:08 crc kubenswrapper[4903]: I1202 23:26:08.045011 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-4154-account-create-update-6zrtz"] Dec 02 23:26:08 crc kubenswrapper[4903]: I1202 23:26:08.055060 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-4154-account-create-update-6zrtz"] Dec 02 23:26:08 crc kubenswrapper[4903]: I1202 23:26:08.064339 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-trbs9"] Dec 02 23:26:09 crc kubenswrapper[4903]: I1202 23:26:09.632528 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3ca7a411-286a-4af2-bb00-fda2b3323698" path="/var/lib/kubelet/pods/3ca7a411-286a-4af2-bb00-fda2b3323698/volumes" Dec 02 23:26:09 crc kubenswrapper[4903]: I1202 23:26:09.634422 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56e790b6-6e19-400a-a329-be2fd76c9e8f" path="/var/lib/kubelet/pods/56e790b6-6e19-400a-a329-be2fd76c9e8f/volumes" Dec 02 23:26:10 crc kubenswrapper[4903]: I1202 23:26:10.612817 4903 scope.go:117] "RemoveContainer" containerID="eea089a1153012030d0c185f960d3bfe8298bb2908765149f2079ec873b323eb" Dec 02 23:26:10 crc kubenswrapper[4903]: E1202 23:26:10.613399 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:26:15 crc kubenswrapper[4903]: I1202 23:26:15.033132 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-45qgr"] Dec 02 23:26:15 crc kubenswrapper[4903]: I1202 23:26:15.042490 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-45qgr"] Dec 02 23:26:15 crc kubenswrapper[4903]: I1202 23:26:15.630861 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4ba384c-5066-4a2d-a1d6-dbb7090b32c4" path="/var/lib/kubelet/pods/e4ba384c-5066-4a2d-a1d6-dbb7090b32c4/volumes" Dec 02 23:26:21 crc kubenswrapper[4903]: I1202 23:26:21.623962 4903 scope.go:117] "RemoveContainer" containerID="eea089a1153012030d0c185f960d3bfe8298bb2908765149f2079ec873b323eb" Dec 02 23:26:21 crc kubenswrapper[4903]: E1202 23:26:21.624913 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:26:36 crc kubenswrapper[4903]: I1202 23:26:36.613491 4903 scope.go:117] "RemoveContainer" containerID="eea089a1153012030d0c185f960d3bfe8298bb2908765149f2079ec873b323eb" Dec 02 23:26:36 crc kubenswrapper[4903]: E1202 23:26:36.614788 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:26:38 crc kubenswrapper[4903]: I1202 23:26:38.061996 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-sync-8sbng"] Dec 02 23:26:38 crc kubenswrapper[4903]: I1202 23:26:38.077973 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-sync-8sbng"] Dec 02 23:26:39 crc kubenswrapper[4903]: I1202 23:26:39.633300 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25b38561-c9d6-4223-85cd-c4516718cc5f" path="/var/lib/kubelet/pods/25b38561-c9d6-4223-85cd-c4516718cc5f/volumes" Dec 02 23:26:51 crc 
kubenswrapper[4903]: I1202 23:26:51.621786 4903 scope.go:117] "RemoveContainer" containerID="eea089a1153012030d0c185f960d3bfe8298bb2908765149f2079ec873b323eb" Dec 02 23:26:51 crc kubenswrapper[4903]: E1202 23:26:51.622831 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:26:57 crc kubenswrapper[4903]: I1202 23:26:57.033846 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-w9hq8"] Dec 02 23:26:57 crc kubenswrapper[4903]: I1202 23:26:57.046676 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-w9hq8"] Dec 02 23:26:57 crc kubenswrapper[4903]: I1202 23:26:57.629914 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f91dd36-7138-43f1-8091-cca7410520cf" path="/var/lib/kubelet/pods/9f91dd36-7138-43f1-8091-cca7410520cf/volumes" Dec 02 23:26:59 crc kubenswrapper[4903]: I1202 23:26:59.819486 4903 generic.go:334] "Generic (PLEG): container finished" podID="8dff062c-2479-4ea6-994e-fea352cdf518" containerID="1844919c1da03475a8d60c160e902be69f720407730c0ae05c1fa811423508f9" exitCode=0 Dec 02 23:26:59 crc kubenswrapper[4903]: I1202 23:26:59.819597 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2qpf7" event={"ID":"8dff062c-2479-4ea6-994e-fea352cdf518","Type":"ContainerDied","Data":"1844919c1da03475a8d60c160e902be69f720407730c0ae05c1fa811423508f9"} Dec 02 23:27:01 crc kubenswrapper[4903]: I1202 23:27:01.368962 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2qpf7" Dec 02 23:27:01 crc kubenswrapper[4903]: I1202 23:27:01.560247 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8dff062c-2479-4ea6-994e-fea352cdf518-inventory\") pod \"8dff062c-2479-4ea6-994e-fea352cdf518\" (UID: \"8dff062c-2479-4ea6-994e-fea352cdf518\") " Dec 02 23:27:01 crc kubenswrapper[4903]: I1202 23:27:01.560578 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrjh5\" (UniqueName: \"kubernetes.io/projected/8dff062c-2479-4ea6-994e-fea352cdf518-kube-api-access-vrjh5\") pod \"8dff062c-2479-4ea6-994e-fea352cdf518\" (UID: \"8dff062c-2479-4ea6-994e-fea352cdf518\") " Dec 02 23:27:01 crc kubenswrapper[4903]: I1202 23:27:01.560877 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8dff062c-2479-4ea6-994e-fea352cdf518-ssh-key\") pod \"8dff062c-2479-4ea6-994e-fea352cdf518\" (UID: \"8dff062c-2479-4ea6-994e-fea352cdf518\") " Dec 02 23:27:01 crc kubenswrapper[4903]: I1202 23:27:01.568412 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dff062c-2479-4ea6-994e-fea352cdf518-kube-api-access-vrjh5" (OuterVolumeSpecName: "kube-api-access-vrjh5") pod "8dff062c-2479-4ea6-994e-fea352cdf518" (UID: "8dff062c-2479-4ea6-994e-fea352cdf518"). InnerVolumeSpecName "kube-api-access-vrjh5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:27:01 crc kubenswrapper[4903]: I1202 23:27:01.606591 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dff062c-2479-4ea6-994e-fea352cdf518-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8dff062c-2479-4ea6-994e-fea352cdf518" (UID: "8dff062c-2479-4ea6-994e-fea352cdf518"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:27:01 crc kubenswrapper[4903]: I1202 23:27:01.614294 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dff062c-2479-4ea6-994e-fea352cdf518-inventory" (OuterVolumeSpecName: "inventory") pod "8dff062c-2479-4ea6-994e-fea352cdf518" (UID: "8dff062c-2479-4ea6-994e-fea352cdf518"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:27:01 crc kubenswrapper[4903]: I1202 23:27:01.664117 4903 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8dff062c-2479-4ea6-994e-fea352cdf518-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 23:27:01 crc kubenswrapper[4903]: I1202 23:27:01.664170 4903 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8dff062c-2479-4ea6-994e-fea352cdf518-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 23:27:01 crc kubenswrapper[4903]: I1202 23:27:01.664185 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrjh5\" (UniqueName: \"kubernetes.io/projected/8dff062c-2479-4ea6-994e-fea352cdf518-kube-api-access-vrjh5\") on node \"crc\" DevicePath \"\"" Dec 02 23:27:01 crc kubenswrapper[4903]: I1202 23:27:01.849209 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2qpf7" event={"ID":"8dff062c-2479-4ea6-994e-fea352cdf518","Type":"ContainerDied","Data":"784b7f025507517e768c0f8df346ce6644e759cab1536605912fea93480df2d7"} Dec 02 23:27:01 crc kubenswrapper[4903]: I1202 23:27:01.849256 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="784b7f025507517e768c0f8df346ce6644e759cab1536605912fea93480df2d7" Dec 02 23:27:01 crc kubenswrapper[4903]: I1202 23:27:01.849274 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2qpf7" Dec 02 23:27:01 crc kubenswrapper[4903]: I1202 23:27:01.967088 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9k7lq"] Dec 02 23:27:01 crc kubenswrapper[4903]: E1202 23:27:01.967950 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dff062c-2479-4ea6-994e-fea352cdf518" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 02 23:27:01 crc kubenswrapper[4903]: I1202 23:27:01.967982 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dff062c-2479-4ea6-994e-fea352cdf518" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 02 23:27:01 crc kubenswrapper[4903]: I1202 23:27:01.968339 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dff062c-2479-4ea6-994e-fea352cdf518" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 02 23:27:01 crc kubenswrapper[4903]: I1202 23:27:01.969452 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9k7lq" Dec 02 23:27:01 crc kubenswrapper[4903]: I1202 23:27:01.972877 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 23:27:01 crc kubenswrapper[4903]: I1202 23:27:01.973082 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9z6rx" Dec 02 23:27:01 crc kubenswrapper[4903]: I1202 23:27:01.973189 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 23:27:01 crc kubenswrapper[4903]: I1202 23:27:01.976802 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 23:27:01 crc kubenswrapper[4903]: I1202 23:27:01.982535 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9k7lq"] Dec 02 23:27:02 crc kubenswrapper[4903]: I1202 23:27:02.071462 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6f7512f-83fe-4921-9ccf-17a76752819f-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9k7lq\" (UID: \"c6f7512f-83fe-4921-9ccf-17a76752819f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9k7lq" Dec 02 23:27:02 crc kubenswrapper[4903]: I1202 23:27:02.071620 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd9c7\" (UniqueName: \"kubernetes.io/projected/c6f7512f-83fe-4921-9ccf-17a76752819f-kube-api-access-qd9c7\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9k7lq\" (UID: \"c6f7512f-83fe-4921-9ccf-17a76752819f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9k7lq" Dec 02 23:27:02 crc kubenswrapper[4903]: I1202 23:27:02.071883 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6f7512f-83fe-4921-9ccf-17a76752819f-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9k7lq\" (UID: \"c6f7512f-83fe-4921-9ccf-17a76752819f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9k7lq" Dec 02 23:27:02 crc kubenswrapper[4903]: I1202 23:27:02.173912 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qd9c7\" (UniqueName: \"kubernetes.io/projected/c6f7512f-83fe-4921-9ccf-17a76752819f-kube-api-access-qd9c7\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9k7lq\" (UID: \"c6f7512f-83fe-4921-9ccf-17a76752819f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9k7lq" Dec 02 23:27:02 crc kubenswrapper[4903]: I1202 23:27:02.174096 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6f7512f-83fe-4921-9ccf-17a76752819f-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9k7lq\" (UID: \"c6f7512f-83fe-4921-9ccf-17a76752819f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9k7lq" Dec 02 23:27:02 crc kubenswrapper[4903]: I1202 23:27:02.174282 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6f7512f-83fe-4921-9ccf-17a76752819f-ssh-key\") 
pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9k7lq\" (UID: \"c6f7512f-83fe-4921-9ccf-17a76752819f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9k7lq" Dec 02 23:27:02 crc kubenswrapper[4903]: I1202 23:27:02.180205 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6f7512f-83fe-4921-9ccf-17a76752819f-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9k7lq\" (UID: \"c6f7512f-83fe-4921-9ccf-17a76752819f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9k7lq" Dec 02 23:27:02 crc kubenswrapper[4903]: I1202 23:27:02.180280 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6f7512f-83fe-4921-9ccf-17a76752819f-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9k7lq\" (UID: \"c6f7512f-83fe-4921-9ccf-17a76752819f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9k7lq" Dec 02 23:27:02 crc kubenswrapper[4903]: I1202 23:27:02.192524 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd9c7\" (UniqueName: \"kubernetes.io/projected/c6f7512f-83fe-4921-9ccf-17a76752819f-kube-api-access-qd9c7\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9k7lq\" (UID: \"c6f7512f-83fe-4921-9ccf-17a76752819f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9k7lq" Dec 02 23:27:02 crc kubenswrapper[4903]: I1202 23:27:02.302954 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9k7lq" Dec 02 23:27:02 crc kubenswrapper[4903]: I1202 23:27:02.524554 4903 scope.go:117] "RemoveContainer" containerID="d01c168e227c05710487f7f276542a0866b6bf62f0b03840df6f4017561e84e2" Dec 02 23:27:02 crc kubenswrapper[4903]: I1202 23:27:02.553066 4903 scope.go:117] "RemoveContainer" containerID="e9ea438e24ec640f3d149d889d46ed9462572372648a7ece103ebb81299d3930" Dec 02 23:27:02 crc kubenswrapper[4903]: I1202 23:27:02.605754 4903 scope.go:117] "RemoveContainer" containerID="14caea3a8661d1cfc5c08135390951b40fb5db2ea7986fa05939c640753ede9e" Dec 02 23:27:02 crc kubenswrapper[4903]: I1202 23:27:02.626969 4903 scope.go:117] "RemoveContainer" containerID="f728e0e9e11a13ca8bc676a82f9feb3be8e39805d37ae2c6c6c0a51d3004b3bd" Dec 02 23:27:02 crc kubenswrapper[4903]: I1202 23:27:02.670118 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9k7lq"] Dec 02 23:27:02 crc kubenswrapper[4903]: W1202 23:27:02.680148 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6f7512f_83fe_4921_9ccf_17a76752819f.slice/crio-6d1d997466b7e5ae5e90bfa24e4eaf4c9429cad6a19b7f7c72387454d6e2e87f WatchSource:0}: Error finding container 6d1d997466b7e5ae5e90bfa24e4eaf4c9429cad6a19b7f7c72387454d6e2e87f: Status 404 returned error can't find the container with id 6d1d997466b7e5ae5e90bfa24e4eaf4c9429cad6a19b7f7c72387454d6e2e87f Dec 02 23:27:02 crc kubenswrapper[4903]: I1202 23:27:02.688518 4903 scope.go:117] "RemoveContainer" containerID="bfd7b651f5f338bcab8b4712e40a72ea822356a4cb1e17d5a849d804c0837109" Dec 02 23:27:02 crc kubenswrapper[4903]: I1202 23:27:02.711250 4903 scope.go:117] "RemoveContainer" containerID="e290219c57410296cd68e48c28aca458c961ccb2c70ae4ac7b1796c8298f6661" Dec 02 23:27:02 crc 
kubenswrapper[4903]: I1202 23:27:02.730005 4903 scope.go:117] "RemoveContainer" containerID="bb0bfd6db04574e89201628c4c0ee4c20689dcd215f6e9727884504600318436" Dec 02 23:27:02 crc kubenswrapper[4903]: I1202 23:27:02.763361 4903 scope.go:117] "RemoveContainer" containerID="db1bfd3257b5f1e05d3b208cdffd9b7e1e712fd70ba4924a81d8649ed5c89b20" Dec 02 23:27:02 crc kubenswrapper[4903]: I1202 23:27:02.796382 4903 scope.go:117] "RemoveContainer" containerID="bf1e237745787a5004a76ca5374a049393eb696d258fc3c4bcc72c251cd8d9f6" Dec 02 23:27:02 crc kubenswrapper[4903]: I1202 23:27:02.825748 4903 scope.go:117] "RemoveContainer" containerID="2775ec20d7c40c5afb4676e825260170ba46746f893cda55d798477539d26ddd" Dec 02 23:27:02 crc kubenswrapper[4903]: I1202 23:27:02.858927 4903 scope.go:117] "RemoveContainer" containerID="dfea6a6a7493885dee015f1130f08f7899b8736900a063101bd2c4227798af70" Dec 02 23:27:02 crc kubenswrapper[4903]: I1202 23:27:02.892942 4903 scope.go:117] "RemoveContainer" containerID="b1e2d008ad53752077e45a6af27982f4a516161fb2dd9d442aa386d308ca3f1d" Dec 02 23:27:02 crc kubenswrapper[4903]: I1202 23:27:02.904736 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9k7lq" event={"ID":"c6f7512f-83fe-4921-9ccf-17a76752819f","Type":"ContainerStarted","Data":"6d1d997466b7e5ae5e90bfa24e4eaf4c9429cad6a19b7f7c72387454d6e2e87f"} Dec 02 23:27:03 crc kubenswrapper[4903]: I1202 23:27:03.947714 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9k7lq" event={"ID":"c6f7512f-83fe-4921-9ccf-17a76752819f","Type":"ContainerStarted","Data":"7209f6adac303dc44f6b15e7e8d73329434fd4728dc54cd0fa4ed1da3a11d8bc"} Dec 02 23:27:03 crc kubenswrapper[4903]: I1202 23:27:03.971149 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9k7lq" podStartSLOduration=2.462020587 podStartE2EDuration="2.971120206s" podCreationTimestamp="2025-12-02 23:27:01 +0000 UTC" firstStartedPulling="2025-12-02 23:27:02.687400755 +0000 UTC m=+1761.395955038" lastFinishedPulling="2025-12-02 23:27:03.196500344 +0000 UTC m=+1761.905054657" observedRunningTime="2025-12-02 23:27:03.969480827 +0000 UTC m=+1762.678035140" watchObservedRunningTime="2025-12-02 23:27:03.971120206 +0000 UTC m=+1762.679674529" Dec 02 23:27:05 crc kubenswrapper[4903]: I1202 23:27:05.612918 4903 scope.go:117] "RemoveContainer" containerID="eea089a1153012030d0c185f960d3bfe8298bb2908765149f2079ec873b323eb" Dec 02 23:27:05 crc kubenswrapper[4903]: E1202 23:27:05.613729 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:27:09 crc kubenswrapper[4903]: I1202 23:27:09.069232 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-bbljr"] Dec 02 23:27:09 crc kubenswrapper[4903]: I1202 23:27:09.079906 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-bbljr"] Dec 02 23:27:09 crc kubenswrapper[4903]: I1202 23:27:09.625318 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="92c0a321-b591-49c3-a9b4-bc6b8bf30820" path="/var/lib/kubelet/pods/92c0a321-b591-49c3-a9b4-bc6b8bf30820/volumes" Dec 02 23:27:10 crc kubenswrapper[4903]: I1202 23:27:10.040648 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-bhns8"] Dec 02 23:27:10 crc kubenswrapper[4903]: I1202 23:27:10.053048 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-bhns8"] Dec 02 23:27:11 crc kubenswrapper[4903]: I1202 23:27:11.628390 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="976c6f69-733c-4046-85e0-d10c9d902a22" path="/var/lib/kubelet/pods/976c6f69-733c-4046-85e0-d10c9d902a22/volumes" Dec 02 23:27:20 crc kubenswrapper[4903]: I1202 23:27:20.613120 4903 scope.go:117] "RemoveContainer" containerID="eea089a1153012030d0c185f960d3bfe8298bb2908765149f2079ec873b323eb" Dec 02 23:27:20 crc kubenswrapper[4903]: E1202 23:27:20.613989 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:27:21 crc kubenswrapper[4903]: I1202 23:27:21.049152 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-tv6h2"] Dec 02 23:27:21 crc kubenswrapper[4903]: I1202 23:27:21.061938 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-tv6h2"] Dec 02 23:27:21 crc kubenswrapper[4903]: I1202 23:27:21.631176 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98900f75-26e7-46cb-a70e-537fa0486fe8" path="/var/lib/kubelet/pods/98900f75-26e7-46cb-a70e-537fa0486fe8/volumes" Dec 02 23:27:23 crc kubenswrapper[4903]: I1202 23:27:23.060223 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-flhtr"] Dec 02 23:27:23 crc kubenswrapper[4903]: I1202 23:27:23.081958 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-nsw82"] Dec 02 23:27:23 crc kubenswrapper[4903]: I1202 23:27:23.097900 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-flhtr"] Dec 02 23:27:23 crc kubenswrapper[4903]: I1202 23:27:23.112710 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-nsw82"] Dec 02 23:27:23 crc kubenswrapper[4903]: I1202 23:27:23.625830 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fdf589e-17a2-4b53-b68c-f90e884b0080" path="/var/lib/kubelet/pods/5fdf589e-17a2-4b53-b68c-f90e884b0080/volumes" Dec 02 23:27:23 crc kubenswrapper[4903]: I1202 23:27:23.628763 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecdb8e0b-8b04-4dc5-b532-8e68e8206122" path="/var/lib/kubelet/pods/ecdb8e0b-8b04-4dc5-b532-8e68e8206122/volumes" Dec 02 23:27:34 crc kubenswrapper[4903]: I1202 23:27:34.622469 4903 scope.go:117] "RemoveContainer" containerID="eea089a1153012030d0c185f960d3bfe8298bb2908765149f2079ec873b323eb" Dec 02 23:27:34 crc kubenswrapper[4903]: E1202 23:27:34.634214 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:27:45 crc kubenswrapper[4903]: I1202 23:27:45.613216 4903 scope.go:117] "RemoveContainer" containerID="eea089a1153012030d0c185f960d3bfe8298bb2908765149f2079ec873b323eb" Dec 02 23:27:45 crc kubenswrapper[4903]: E1202 23:27:45.614157 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:28:00 crc kubenswrapper[4903]: I1202 23:28:00.613332 4903 scope.go:117] "RemoveContainer" containerID="eea089a1153012030d0c185f960d3bfe8298bb2908765149f2079ec873b323eb" Dec 02 23:28:00 crc kubenswrapper[4903]: E1202 23:28:00.614496 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:28:03 crc kubenswrapper[4903]: I1202 23:28:03.178748 4903 scope.go:117] "RemoveContainer" containerID="5c3d7d10b0a60c8b39f1ec55f86abf1ac102b636038fcd4ff5a22eedb548f872" Dec 02 23:28:03 crc kubenswrapper[4903]: I1202 23:28:03.217080 4903 scope.go:117] "RemoveContainer" containerID="1bce8edcc376cc44be897d0c5ea65e04b8d457cc31894966db288539082aa040" Dec 02 23:28:03 crc kubenswrapper[4903]: I1202 23:28:03.266912 4903 scope.go:117] "RemoveContainer" containerID="6cdff083596655b0abf2505b6ad88953f08c3bb522280a97d7f3c46d2298e096" Dec 02 23:28:03 crc kubenswrapper[4903]: I1202 23:28:03.315963 4903 scope.go:117] "RemoveContainer" containerID="466c9dfeffe342ed9c339eb847516a6e79456aa45aa2732dc102bda92052c254" Dec 02 23:28:03 crc kubenswrapper[4903]: I1202 23:28:03.346399 4903 scope.go:117] "RemoveContainer" containerID="d179cc0a9b1b2a14d63276f5a6fb227b062e6d5252e8870e3f259cdeadee12f5" Dec 02 23:28:15 crc kubenswrapper[4903]: I1202 23:28:15.613265 4903 scope.go:117] "RemoveContainer" containerID="eea089a1153012030d0c185f960d3bfe8298bb2908765149f2079ec873b323eb" Dec 02 23:28:15 crc kubenswrapper[4903]: E1202 23:28:15.614891 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:28:16 crc kubenswrapper[4903]: I1202 23:28:16.049567 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-tmcbs"] Dec 02 23:28:16 crc kubenswrapper[4903]: I1202 23:28:16.063581 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-tmcbs"] Dec 02 23:28:17 crc kubenswrapper[4903]: I1202 23:28:17.033985 4903 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/nova-api-b933-account-create-update-ds9hn"] Dec 02 23:28:17 crc kubenswrapper[4903]: I1202 23:28:17.043003 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-bd18-account-create-update-hzb44"] Dec 02 23:28:17 crc kubenswrapper[4903]: I1202 23:28:17.053989 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-8krqs"] Dec 02 23:28:17 crc kubenswrapper[4903]: I1202 23:28:17.063054 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-7e77-account-create-update-gvv2z"] Dec 02 23:28:17 crc kubenswrapper[4903]: I1202 23:28:17.073309 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-b933-account-create-update-ds9hn"] Dec 02 23:28:17 crc kubenswrapper[4903]: I1202 23:28:17.083402 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-bd18-account-create-update-hzb44"] Dec 02 23:28:17 crc kubenswrapper[4903]: I1202 23:28:17.092822 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-8krqs"] Dec 02 23:28:17 crc kubenswrapper[4903]: I1202 23:28:17.101817 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-7e77-account-create-update-gvv2z"] Dec 02 23:28:17 crc kubenswrapper[4903]: I1202 23:28:17.109446 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-dbf6h"] Dec 02 23:28:17 crc kubenswrapper[4903]: I1202 23:28:17.117293 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-dbf6h"] Dec 02 23:28:17 crc kubenswrapper[4903]: I1202 23:28:17.630171 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d426b48-454f-49a9-8be0-2fea7237ff7c" path="/var/lib/kubelet/pods/1d426b48-454f-49a9-8be0-2fea7237ff7c/volumes" Dec 02 23:28:17 crc kubenswrapper[4903]: I1202 23:28:17.631625 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e227b00-37a8-409c-916d-3f6d49661795" path="/var/lib/kubelet/pods/4e227b00-37a8-409c-916d-3f6d49661795/volumes" Dec 02 23:28:17 crc kubenswrapper[4903]: I1202 23:28:17.632560 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5159b672-4f54-4a9d-a658-bee025a03797" path="/var/lib/kubelet/pods/5159b672-4f54-4a9d-a658-bee025a03797/volumes" Dec 02 23:28:17 crc kubenswrapper[4903]: I1202 23:28:17.633802 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68e431bf-fdd1-48de-bccc-ea5b78e37e1a" path="/var/lib/kubelet/pods/68e431bf-fdd1-48de-bccc-ea5b78e37e1a/volumes" Dec 02 23:28:17 crc kubenswrapper[4903]: I1202 23:28:17.636320 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2acd090-402b-4468-ab99-f0c41a763812" path="/var/lib/kubelet/pods/e2acd090-402b-4468-ab99-f0c41a763812/volumes" Dec 02 23:28:17 crc kubenswrapper[4903]: I1202 23:28:17.637071 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb7516ea-94a3-4d0c-bea5-cd7c0ee303e8" path="/var/lib/kubelet/pods/eb7516ea-94a3-4d0c-bea5-cd7c0ee303e8/volumes" Dec 02 23:28:17 crc kubenswrapper[4903]: I1202 23:28:17.778196 4903 generic.go:334] "Generic (PLEG): container finished" podID="c6f7512f-83fe-4921-9ccf-17a76752819f" containerID="7209f6adac303dc44f6b15e7e8d73329434fd4728dc54cd0fa4ed1da3a11d8bc" exitCode=0 Dec 02 23:28:17 crc kubenswrapper[4903]: I1202 23:28:17.778250 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9k7lq" event={"ID":"c6f7512f-83fe-4921-9ccf-17a76752819f","Type":"ContainerDied","Data":"7209f6adac303dc44f6b15e7e8d73329434fd4728dc54cd0fa4ed1da3a11d8bc"} Dec 02 23:28:19 crc kubenswrapper[4903]: I1202 23:28:19.362761 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9k7lq" Dec 02 23:28:19 crc kubenswrapper[4903]: I1202 23:28:19.533784 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qd9c7\" (UniqueName: \"kubernetes.io/projected/c6f7512f-83fe-4921-9ccf-17a76752819f-kube-api-access-qd9c7\") pod \"c6f7512f-83fe-4921-9ccf-17a76752819f\" (UID: \"c6f7512f-83fe-4921-9ccf-17a76752819f\") " Dec 02 23:28:19 crc kubenswrapper[4903]: I1202 23:28:19.534134 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6f7512f-83fe-4921-9ccf-17a76752819f-ssh-key\") pod \"c6f7512f-83fe-4921-9ccf-17a76752819f\" (UID: \"c6f7512f-83fe-4921-9ccf-17a76752819f\") " Dec 02 23:28:19 crc kubenswrapper[4903]: I1202 23:28:19.534496 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6f7512f-83fe-4921-9ccf-17a76752819f-inventory\") pod \"c6f7512f-83fe-4921-9ccf-17a76752819f\" (UID: \"c6f7512f-83fe-4921-9ccf-17a76752819f\") " Dec 02 23:28:19 crc kubenswrapper[4903]: I1202 23:28:19.543240 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6f7512f-83fe-4921-9ccf-17a76752819f-kube-api-access-qd9c7" (OuterVolumeSpecName: "kube-api-access-qd9c7") pod "c6f7512f-83fe-4921-9ccf-17a76752819f" (UID: "c6f7512f-83fe-4921-9ccf-17a76752819f"). InnerVolumeSpecName "kube-api-access-qd9c7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:28:19 crc kubenswrapper[4903]: I1202 23:28:19.561255 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6f7512f-83fe-4921-9ccf-17a76752819f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c6f7512f-83fe-4921-9ccf-17a76752819f" (UID: "c6f7512f-83fe-4921-9ccf-17a76752819f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:28:19 crc kubenswrapper[4903]: I1202 23:28:19.571180 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6f7512f-83fe-4921-9ccf-17a76752819f-inventory" (OuterVolumeSpecName: "inventory") pod "c6f7512f-83fe-4921-9ccf-17a76752819f" (UID: "c6f7512f-83fe-4921-9ccf-17a76752819f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:28:19 crc kubenswrapper[4903]: I1202 23:28:19.636522 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qd9c7\" (UniqueName: \"kubernetes.io/projected/c6f7512f-83fe-4921-9ccf-17a76752819f-kube-api-access-qd9c7\") on node \"crc\" DevicePath \"\"" Dec 02 23:28:19 crc kubenswrapper[4903]: I1202 23:28:19.636556 4903 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6f7512f-83fe-4921-9ccf-17a76752819f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 23:28:19 crc kubenswrapper[4903]: I1202 23:28:19.636664 4903 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6f7512f-83fe-4921-9ccf-17a76752819f-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 23:28:19 crc kubenswrapper[4903]: I1202 23:28:19.799490 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9k7lq" event={"ID":"c6f7512f-83fe-4921-9ccf-17a76752819f","Type":"ContainerDied","Data":"6d1d997466b7e5ae5e90bfa24e4eaf4c9429cad6a19b7f7c72387454d6e2e87f"} Dec 02 23:28:19 crc kubenswrapper[4903]: I1202 23:28:19.799531 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d1d997466b7e5ae5e90bfa24e4eaf4c9429cad6a19b7f7c72387454d6e2e87f" Dec 02 23:28:19 crc kubenswrapper[4903]: I1202 23:28:19.799571 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9k7lq" Dec 02 23:28:19 crc kubenswrapper[4903]: I1202 23:28:19.898863 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xf62w"] Dec 02 23:28:19 crc kubenswrapper[4903]: E1202 23:28:19.899391 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6f7512f-83fe-4921-9ccf-17a76752819f" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 02 23:28:19 crc kubenswrapper[4903]: I1202 23:28:19.899413 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6f7512f-83fe-4921-9ccf-17a76752819f" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 02 23:28:19 crc kubenswrapper[4903]: I1202 23:28:19.899700 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6f7512f-83fe-4921-9ccf-17a76752819f" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 02 23:28:19 crc kubenswrapper[4903]: I1202 23:28:19.900534 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xf62w" Dec 02 23:28:19 crc kubenswrapper[4903]: I1202 23:28:19.903168 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9z6rx" Dec 02 23:28:19 crc kubenswrapper[4903]: I1202 23:28:19.903442 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 23:28:19 crc kubenswrapper[4903]: I1202 23:28:19.906786 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 23:28:19 crc kubenswrapper[4903]: I1202 23:28:19.907425 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 23:28:19 crc kubenswrapper[4903]: I1202 23:28:19.920544 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xf62w"] Dec 02 23:28:20 crc kubenswrapper[4903]: I1202 23:28:20.046642 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xzg7\" (UniqueName: \"kubernetes.io/projected/d727ee19-e1d6-4421-9be6-94f429f93494-kube-api-access-4xzg7\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xf62w\" (UID: \"d727ee19-e1d6-4421-9be6-94f429f93494\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xf62w" Dec 02 23:28:20 crc kubenswrapper[4903]: I1202 23:28:20.046709 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d727ee19-e1d6-4421-9be6-94f429f93494-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xf62w\" (UID: \"d727ee19-e1d6-4421-9be6-94f429f93494\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xf62w" Dec 02 23:28:20 crc kubenswrapper[4903]: I1202 23:28:20.046833 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d727ee19-e1d6-4421-9be6-94f429f93494-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xf62w\" (UID: \"d727ee19-e1d6-4421-9be6-94f429f93494\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xf62w" Dec 02 23:28:20 crc kubenswrapper[4903]: I1202 23:28:20.148160 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xzg7\" (UniqueName: \"kubernetes.io/projected/d727ee19-e1d6-4421-9be6-94f429f93494-kube-api-access-4xzg7\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xf62w\" (UID: \"d727ee19-e1d6-4421-9be6-94f429f93494\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xf62w" Dec 02 23:28:20 crc kubenswrapper[4903]: I1202 23:28:20.148428 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d727ee19-e1d6-4421-9be6-94f429f93494-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xf62w\" (UID: \"d727ee19-e1d6-4421-9be6-94f429f93494\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xf62w" Dec 02 23:28:20 crc kubenswrapper[4903]: I1202 23:28:20.148480 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d727ee19-e1d6-4421-9be6-94f429f93494-ssh-key\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-xf62w\" (UID: \"d727ee19-e1d6-4421-9be6-94f429f93494\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xf62w" Dec 02 23:28:20 crc kubenswrapper[4903]: I1202 23:28:20.152371 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d727ee19-e1d6-4421-9be6-94f429f93494-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xf62w\" (UID: \"d727ee19-e1d6-4421-9be6-94f429f93494\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xf62w" Dec 02 23:28:20 crc kubenswrapper[4903]: I1202 23:28:20.159475 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d727ee19-e1d6-4421-9be6-94f429f93494-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xf62w\" (UID: \"d727ee19-e1d6-4421-9be6-94f429f93494\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xf62w" Dec 02 23:28:20 crc kubenswrapper[4903]: I1202 23:28:20.166579 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xzg7\" (UniqueName: \"kubernetes.io/projected/d727ee19-e1d6-4421-9be6-94f429f93494-kube-api-access-4xzg7\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xf62w\" (UID: \"d727ee19-e1d6-4421-9be6-94f429f93494\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xf62w" Dec 02 23:28:20 crc kubenswrapper[4903]: I1202 23:28:20.223106 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xf62w" Dec 02 23:28:20 crc kubenswrapper[4903]: I1202 23:28:20.563849 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xf62w"] Dec 02 23:28:20 crc kubenswrapper[4903]: W1202 23:28:20.564589 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd727ee19_e1d6_4421_9be6_94f429f93494.slice/crio-7abdc3a2249068c31cdb7918ba356ddf0c877d36376dd7db1b8ba89e650e5afa WatchSource:0}: Error finding container 7abdc3a2249068c31cdb7918ba356ddf0c877d36376dd7db1b8ba89e650e5afa: Status 404 returned error can't find the container with id 7abdc3a2249068c31cdb7918ba356ddf0c877d36376dd7db1b8ba89e650e5afa Dec 02 23:28:20 crc kubenswrapper[4903]: I1202 23:28:20.811708 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xf62w" event={"ID":"d727ee19-e1d6-4421-9be6-94f429f93494","Type":"ContainerStarted","Data":"7abdc3a2249068c31cdb7918ba356ddf0c877d36376dd7db1b8ba89e650e5afa"} Dec 02 23:28:21 crc kubenswrapper[4903]: I1202 23:28:21.821086 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xf62w" event={"ID":"d727ee19-e1d6-4421-9be6-94f429f93494","Type":"ContainerStarted","Data":"5b17cd7ffb3723bedee5a10a74e891a1969385b63c90f2120480938d880eaa84"} Dec 02 23:28:21 crc kubenswrapper[4903]: I1202 23:28:21.850603 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xf62w" podStartSLOduration=2.218738495 podStartE2EDuration="2.85058075s" podCreationTimestamp="2025-12-02 23:28:19 +0000 UTC" firstStartedPulling="2025-12-02 23:28:20.567537114 +0000 UTC m=+1839.276091397" 
lastFinishedPulling="2025-12-02 23:28:21.199379329 +0000 UTC m=+1839.907933652" observedRunningTime="2025-12-02 23:28:21.834401159 +0000 UTC m=+1840.542955442" watchObservedRunningTime="2025-12-02 23:28:21.85058075 +0000 UTC m=+1840.559135043" Dec 02 23:28:26 crc kubenswrapper[4903]: I1202 23:28:26.879923 4903 generic.go:334] "Generic (PLEG): container finished" podID="d727ee19-e1d6-4421-9be6-94f429f93494" containerID="5b17cd7ffb3723bedee5a10a74e891a1969385b63c90f2120480938d880eaa84" exitCode=0 Dec 02 23:28:26 crc kubenswrapper[4903]: I1202 23:28:26.880047 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xf62w" event={"ID":"d727ee19-e1d6-4421-9be6-94f429f93494","Type":"ContainerDied","Data":"5b17cd7ffb3723bedee5a10a74e891a1969385b63c90f2120480938d880eaa84"} Dec 02 23:28:28 crc kubenswrapper[4903]: I1202 23:28:28.469888 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xf62w" Dec 02 23:28:28 crc kubenswrapper[4903]: I1202 23:28:28.613568 4903 scope.go:117] "RemoveContainer" containerID="eea089a1153012030d0c185f960d3bfe8298bb2908765149f2079ec873b323eb" Dec 02 23:28:28 crc kubenswrapper[4903]: I1202 23:28:28.644552 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d727ee19-e1d6-4421-9be6-94f429f93494-inventory\") pod \"d727ee19-e1d6-4421-9be6-94f429f93494\" (UID: \"d727ee19-e1d6-4421-9be6-94f429f93494\") " Dec 02 23:28:28 crc kubenswrapper[4903]: I1202 23:28:28.644921 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d727ee19-e1d6-4421-9be6-94f429f93494-ssh-key\") pod \"d727ee19-e1d6-4421-9be6-94f429f93494\" (UID: \"d727ee19-e1d6-4421-9be6-94f429f93494\") " Dec 02 23:28:28 crc kubenswrapper[4903]: I1202 23:28:28.645093 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xzg7\" (UniqueName: \"kubernetes.io/projected/d727ee19-e1d6-4421-9be6-94f429f93494-kube-api-access-4xzg7\") pod \"d727ee19-e1d6-4421-9be6-94f429f93494\" (UID: \"d727ee19-e1d6-4421-9be6-94f429f93494\") " Dec 02 23:28:28 crc kubenswrapper[4903]: I1202 23:28:28.653097 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d727ee19-e1d6-4421-9be6-94f429f93494-kube-api-access-4xzg7" (OuterVolumeSpecName: "kube-api-access-4xzg7") pod "d727ee19-e1d6-4421-9be6-94f429f93494" (UID: "d727ee19-e1d6-4421-9be6-94f429f93494"). InnerVolumeSpecName "kube-api-access-4xzg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:28:28 crc kubenswrapper[4903]: I1202 23:28:28.688057 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d727ee19-e1d6-4421-9be6-94f429f93494-inventory" (OuterVolumeSpecName: "inventory") pod "d727ee19-e1d6-4421-9be6-94f429f93494" (UID: "d727ee19-e1d6-4421-9be6-94f429f93494"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:28:28 crc kubenswrapper[4903]: I1202 23:28:28.710052 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d727ee19-e1d6-4421-9be6-94f429f93494-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d727ee19-e1d6-4421-9be6-94f429f93494" (UID: "d727ee19-e1d6-4421-9be6-94f429f93494"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:28:28 crc kubenswrapper[4903]: I1202 23:28:28.747941 4903 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d727ee19-e1d6-4421-9be6-94f429f93494-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 23:28:28 crc kubenswrapper[4903]: I1202 23:28:28.748700 4903 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d727ee19-e1d6-4421-9be6-94f429f93494-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 23:28:28 crc kubenswrapper[4903]: I1202 23:28:28.748770 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xzg7\" (UniqueName: \"kubernetes.io/projected/d727ee19-e1d6-4421-9be6-94f429f93494-kube-api-access-4xzg7\") on node \"crc\" DevicePath \"\"" Dec 02 23:28:28 crc kubenswrapper[4903]: I1202 23:28:28.935262 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xf62w" Dec 02 23:28:28 crc kubenswrapper[4903]: I1202 23:28:28.935940 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xf62w" event={"ID":"d727ee19-e1d6-4421-9be6-94f429f93494","Type":"ContainerDied","Data":"7abdc3a2249068c31cdb7918ba356ddf0c877d36376dd7db1b8ba89e650e5afa"} Dec 02 23:28:28 crc kubenswrapper[4903]: I1202 23:28:28.936006 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7abdc3a2249068c31cdb7918ba356ddf0c877d36376dd7db1b8ba89e650e5afa" Dec 02 23:28:29 crc kubenswrapper[4903]: I1202 23:28:29.005435 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-p8wmx"] Dec 02 23:28:29 crc kubenswrapper[4903]: E1202 23:28:29.006115 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d727ee19-e1d6-4421-9be6-94f429f93494" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 02 23:28:29 crc kubenswrapper[4903]: I1202 23:28:29.006138 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="d727ee19-e1d6-4421-9be6-94f429f93494" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 02 23:28:29 crc kubenswrapper[4903]: I1202 23:28:29.006505 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="d727ee19-e1d6-4421-9be6-94f429f93494" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 02 23:28:29 crc kubenswrapper[4903]: I1202 23:28:29.007679 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p8wmx" Dec 02 23:28:29 crc kubenswrapper[4903]: I1202 23:28:29.015033 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-p8wmx"] Dec 02 23:28:29 crc kubenswrapper[4903]: I1202 23:28:29.042731 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9z6rx" Dec 02 23:28:29 crc kubenswrapper[4903]: I1202 23:28:29.042978 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 23:28:29 crc kubenswrapper[4903]: I1202 23:28:29.045177 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 23:28:29 crc kubenswrapper[4903]: I1202 23:28:29.046129 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 23:28:29 crc kubenswrapper[4903]: I1202 23:28:29.156617 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/346ac594-16d6-478e-9ce4-4d4acb116a99-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-p8wmx\" (UID: \"346ac594-16d6-478e-9ce4-4d4acb116a99\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p8wmx" Dec 02 23:28:29 crc kubenswrapper[4903]: I1202 23:28:29.156733 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kssgd\" (UniqueName: \"kubernetes.io/projected/346ac594-16d6-478e-9ce4-4d4acb116a99-kube-api-access-kssgd\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-p8wmx\" (UID: \"346ac594-16d6-478e-9ce4-4d4acb116a99\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p8wmx" Dec 02 23:28:29 crc kubenswrapper[4903]: I1202 23:28:29.156768 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/346ac594-16d6-478e-9ce4-4d4acb116a99-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-p8wmx\" (UID: \"346ac594-16d6-478e-9ce4-4d4acb116a99\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p8wmx" Dec 02 23:28:29 crc kubenswrapper[4903]: I1202 23:28:29.258408 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/346ac594-16d6-478e-9ce4-4d4acb116a99-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-p8wmx\" (UID: \"346ac594-16d6-478e-9ce4-4d4acb116a99\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p8wmx" Dec 02 23:28:29 crc kubenswrapper[4903]: I1202 23:28:29.258745 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kssgd\" (UniqueName: \"kubernetes.io/projected/346ac594-16d6-478e-9ce4-4d4acb116a99-kube-api-access-kssgd\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-p8wmx\" (UID: \"346ac594-16d6-478e-9ce4-4d4acb116a99\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p8wmx" Dec 02 23:28:29 crc kubenswrapper[4903]: I1202 23:28:29.258772 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/346ac594-16d6-478e-9ce4-4d4acb116a99-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-p8wmx\" (UID: 
\"346ac594-16d6-478e-9ce4-4d4acb116a99\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p8wmx" Dec 02 23:28:29 crc kubenswrapper[4903]: I1202 23:28:29.262978 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/346ac594-16d6-478e-9ce4-4d4acb116a99-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-p8wmx\" (UID: \"346ac594-16d6-478e-9ce4-4d4acb116a99\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p8wmx" Dec 02 23:28:29 crc kubenswrapper[4903]: I1202 23:28:29.263156 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/346ac594-16d6-478e-9ce4-4d4acb116a99-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-p8wmx\" (UID: \"346ac594-16d6-478e-9ce4-4d4acb116a99\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p8wmx" Dec 02 23:28:29 crc kubenswrapper[4903]: I1202 23:28:29.280384 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kssgd\" (UniqueName: \"kubernetes.io/projected/346ac594-16d6-478e-9ce4-4d4acb116a99-kube-api-access-kssgd\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-p8wmx\" (UID: \"346ac594-16d6-478e-9ce4-4d4acb116a99\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p8wmx" Dec 02 23:28:29 crc kubenswrapper[4903]: I1202 23:28:29.361158 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p8wmx" Dec 02 23:28:29 crc kubenswrapper[4903]: I1202 23:28:29.735685 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-p8wmx"] Dec 02 23:28:29 crc kubenswrapper[4903]: W1202 23:28:29.738194 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod346ac594_16d6_478e_9ce4_4d4acb116a99.slice/crio-4a72c4c976aa0a120195aa8ae36efa031e760cdeb657649d8ddf18e252511c4a WatchSource:0}: Error finding container 4a72c4c976aa0a120195aa8ae36efa031e760cdeb657649d8ddf18e252511c4a: Status 404 returned error can't find the container with id 4a72c4c976aa0a120195aa8ae36efa031e760cdeb657649d8ddf18e252511c4a Dec 02 23:28:29 crc kubenswrapper[4903]: I1202 23:28:29.947977 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerStarted","Data":"2cb2324e92342142e01e709d222456907de7cfe5f6624d7d43f5bea9f5edfe1a"} Dec 02 23:28:29 crc kubenswrapper[4903]: I1202 23:28:29.950285 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p8wmx" event={"ID":"346ac594-16d6-478e-9ce4-4d4acb116a99","Type":"ContainerStarted","Data":"4a72c4c976aa0a120195aa8ae36efa031e760cdeb657649d8ddf18e252511c4a"} Dec 02 23:28:30 crc kubenswrapper[4903]: I1202 23:28:30.961666 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p8wmx" event={"ID":"346ac594-16d6-478e-9ce4-4d4acb116a99","Type":"ContainerStarted","Data":"82a0b670bf2b9ff8429ee125a2db1d00f8e4df644582613f5b4c3507c449956a"} Dec 02 23:28:30 crc kubenswrapper[4903]: I1202 23:28:30.987930 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p8wmx" 
podStartSLOduration=2.529990171 podStartE2EDuration="2.987909954s" podCreationTimestamp="2025-12-02 23:28:28 +0000 UTC" firstStartedPulling="2025-12-02 23:28:29.741908243 +0000 UTC m=+1848.450462546" lastFinishedPulling="2025-12-02 23:28:30.199828046 +0000 UTC m=+1848.908382329" observedRunningTime="2025-12-02 23:28:30.981753814 +0000 UTC m=+1849.690308097" watchObservedRunningTime="2025-12-02 23:28:30.987909954 +0000 UTC m=+1849.696464237" Dec 02 23:28:59 crc kubenswrapper[4903]: I1202 23:28:59.056252 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hjz85"] Dec 02 23:28:59 crc kubenswrapper[4903]: I1202 23:28:59.067806 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hjz85"] Dec 02 23:28:59 crc kubenswrapper[4903]: I1202 23:28:59.625774 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6af62260-60e9-49b0-84b9-3f9cf7361c79" path="/var/lib/kubelet/pods/6af62260-60e9-49b0-84b9-3f9cf7361c79/volumes" Dec 02 23:29:03 crc kubenswrapper[4903]: I1202 23:29:03.520394 4903 scope.go:117] "RemoveContainer" containerID="fa1c51ff21fc1b98f42c9d5dfa0bd48272f3369c37eb8fd63c01cc20e31852e7" Dec 02 23:29:03 crc kubenswrapper[4903]: I1202 23:29:03.550697 4903 scope.go:117] "RemoveContainer" containerID="c61fcdcc06287d81c4c5fe20e7579184599004e4db4b0c1426445f36df8ea73f" Dec 02 23:29:03 crc kubenswrapper[4903]: I1202 23:29:03.620144 4903 scope.go:117] "RemoveContainer" containerID="a1a8ac2e73f6ddeab0eecb16ff1ff5a24d2e1c62c7b288a33df9f8022e629187" Dec 02 23:29:03 crc kubenswrapper[4903]: I1202 23:29:03.675096 4903 scope.go:117] "RemoveContainer" containerID="09f85e7fba69c9268740ef51063c02d51d631cddd70a7cf69eece3db5c7dcbe8" Dec 02 23:29:03 crc kubenswrapper[4903]: I1202 23:29:03.726134 4903 scope.go:117] "RemoveContainer" containerID="d386d1ae453579c83224d3f8ba3eb5222349c9b28c054ea10002429664d26afd" Dec 02 23:29:03 crc kubenswrapper[4903]: I1202 23:29:03.781637 4903 scope.go:117] "RemoveContainer" containerID="cc0155c9d4cbbc86352c6be117bc554a3e26d566a6928fa8e85b3dad046a270c" Dec 02 23:29:03 crc kubenswrapper[4903]: I1202 23:29:03.834814 4903 scope.go:117] "RemoveContainer" containerID="d1518b53fd06b9129dac97a57713df93493c07251bdf5f6b59b90c7a977c583b" Dec 02 23:29:12 crc kubenswrapper[4903]: I1202 23:29:12.411130 4903 generic.go:334] "Generic (PLEG): container finished" podID="346ac594-16d6-478e-9ce4-4d4acb116a99" containerID="82a0b670bf2b9ff8429ee125a2db1d00f8e4df644582613f5b4c3507c449956a" exitCode=0 Dec 02 23:29:12 crc kubenswrapper[4903]: I1202 23:29:12.411237 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p8wmx" event={"ID":"346ac594-16d6-478e-9ce4-4d4acb116a99","Type":"ContainerDied","Data":"82a0b670bf2b9ff8429ee125a2db1d00f8e4df644582613f5b4c3507c449956a"} Dec 02 23:29:13 crc kubenswrapper[4903]: I1202 23:29:13.869253 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p8wmx" Dec 02 23:29:13 crc kubenswrapper[4903]: I1202 23:29:13.977340 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/346ac594-16d6-478e-9ce4-4d4acb116a99-ssh-key\") pod \"346ac594-16d6-478e-9ce4-4d4acb116a99\" (UID: \"346ac594-16d6-478e-9ce4-4d4acb116a99\") " Dec 02 23:29:13 crc kubenswrapper[4903]: I1202 23:29:13.977506 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kssgd\" (UniqueName: \"kubernetes.io/projected/346ac594-16d6-478e-9ce4-4d4acb116a99-kube-api-access-kssgd\") pod \"346ac594-16d6-478e-9ce4-4d4acb116a99\" (UID: \"346ac594-16d6-478e-9ce4-4d4acb116a99\") " Dec 02 23:29:13 crc kubenswrapper[4903]: I1202 23:29:13.977586 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/346ac594-16d6-478e-9ce4-4d4acb116a99-inventory\") pod \"346ac594-16d6-478e-9ce4-4d4acb116a99\" (UID: \"346ac594-16d6-478e-9ce4-4d4acb116a99\") " Dec 02 23:29:13 crc kubenswrapper[4903]: I1202 23:29:13.986092 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/346ac594-16d6-478e-9ce4-4d4acb116a99-kube-api-access-kssgd" (OuterVolumeSpecName: "kube-api-access-kssgd") pod "346ac594-16d6-478e-9ce4-4d4acb116a99" (UID: "346ac594-16d6-478e-9ce4-4d4acb116a99"). InnerVolumeSpecName "kube-api-access-kssgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:29:14 crc kubenswrapper[4903]: I1202 23:29:14.008474 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/346ac594-16d6-478e-9ce4-4d4acb116a99-inventory" (OuterVolumeSpecName: "inventory") pod "346ac594-16d6-478e-9ce4-4d4acb116a99" (UID: "346ac594-16d6-478e-9ce4-4d4acb116a99"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:29:14 crc kubenswrapper[4903]: I1202 23:29:14.022166 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/346ac594-16d6-478e-9ce4-4d4acb116a99-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "346ac594-16d6-478e-9ce4-4d4acb116a99" (UID: "346ac594-16d6-478e-9ce4-4d4acb116a99"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:29:14 crc kubenswrapper[4903]: I1202 23:29:14.083868 4903 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/346ac594-16d6-478e-9ce4-4d4acb116a99-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 23:29:14 crc kubenswrapper[4903]: I1202 23:29:14.083952 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kssgd\" (UniqueName: \"kubernetes.io/projected/346ac594-16d6-478e-9ce4-4d4acb116a99-kube-api-access-kssgd\") on node \"crc\" DevicePath \"\"" Dec 02 23:29:14 crc kubenswrapper[4903]: I1202 23:29:14.083975 4903 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/346ac594-16d6-478e-9ce4-4d4acb116a99-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 23:29:14 crc kubenswrapper[4903]: I1202 23:29:14.441044 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p8wmx" event={"ID":"346ac594-16d6-478e-9ce4-4d4acb116a99","Type":"ContainerDied","Data":"4a72c4c976aa0a120195aa8ae36efa031e760cdeb657649d8ddf18e252511c4a"} Dec 02 23:29:14 crc kubenswrapper[4903]: I1202 23:29:14.441125 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a72c4c976aa0a120195aa8ae36efa031e760cdeb657649d8ddf18e252511c4a" Dec 02 23:29:14 crc kubenswrapper[4903]: I1202 23:29:14.441845 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p8wmx" Dec 02 23:29:14 crc kubenswrapper[4903]: I1202 23:29:14.553472 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n7zlv"] Dec 02 23:29:14 crc kubenswrapper[4903]: E1202 23:29:14.554253 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="346ac594-16d6-478e-9ce4-4d4acb116a99" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 02 23:29:14 crc kubenswrapper[4903]: I1202 23:29:14.554286 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="346ac594-16d6-478e-9ce4-4d4acb116a99" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 02 23:29:14 crc kubenswrapper[4903]: I1202 23:29:14.554715 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="346ac594-16d6-478e-9ce4-4d4acb116a99" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 02 23:29:14 crc kubenswrapper[4903]: I1202 23:29:14.555987 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n7zlv" Dec 02 23:29:14 crc kubenswrapper[4903]: I1202 23:29:14.558178 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 23:29:14 crc kubenswrapper[4903]: I1202 23:29:14.562375 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 23:29:14 crc kubenswrapper[4903]: I1202 23:29:14.562777 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9z6rx" Dec 02 23:29:14 crc kubenswrapper[4903]: I1202 23:29:14.563155 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 23:29:14 crc kubenswrapper[4903]: I1202 23:29:14.564931 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n7zlv"] Dec 02 23:29:14 crc kubenswrapper[4903]: I1202 23:29:14.592560 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6700daa-2dac-4779-a463-6aea7ae0d54a-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n7zlv\" (UID: \"d6700daa-2dac-4779-a463-6aea7ae0d54a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n7zlv" Dec 02 23:29:14 crc kubenswrapper[4903]: I1202 23:29:14.592899 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d6700daa-2dac-4779-a463-6aea7ae0d54a-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n7zlv\" (UID: \"d6700daa-2dac-4779-a463-6aea7ae0d54a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n7zlv" Dec 02 23:29:14 crc kubenswrapper[4903]: I1202 23:29:14.593051 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jnsq\" (UniqueName: \"kubernetes.io/projected/d6700daa-2dac-4779-a463-6aea7ae0d54a-kube-api-access-5jnsq\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n7zlv\" (UID: \"d6700daa-2dac-4779-a463-6aea7ae0d54a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n7zlv" Dec 02 23:29:14 crc kubenswrapper[4903]: I1202 23:29:14.695343 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6700daa-2dac-4779-a463-6aea7ae0d54a-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n7zlv\" (UID: \"d6700daa-2dac-4779-a463-6aea7ae0d54a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n7zlv" Dec 02 23:29:14 crc kubenswrapper[4903]: I1202 23:29:14.695569 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d6700daa-2dac-4779-a463-6aea7ae0d54a-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n7zlv\" (UID: \"d6700daa-2dac-4779-a463-6aea7ae0d54a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n7zlv" Dec 02 23:29:14 crc kubenswrapper[4903]: I1202 23:29:14.695805 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jnsq\" (UniqueName: \"kubernetes.io/projected/d6700daa-2dac-4779-a463-6aea7ae0d54a-kube-api-access-5jnsq\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n7zlv\" 
(UID: \"d6700daa-2dac-4779-a463-6aea7ae0d54a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n7zlv" Dec 02 23:29:14 crc kubenswrapper[4903]: I1202 23:29:14.702138 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d6700daa-2dac-4779-a463-6aea7ae0d54a-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n7zlv\" (UID: \"d6700daa-2dac-4779-a463-6aea7ae0d54a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n7zlv" Dec 02 23:29:14 crc kubenswrapper[4903]: I1202 23:29:14.703021 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6700daa-2dac-4779-a463-6aea7ae0d54a-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n7zlv\" (UID: \"d6700daa-2dac-4779-a463-6aea7ae0d54a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n7zlv" Dec 02 23:29:14 crc kubenswrapper[4903]: I1202 23:29:14.717927 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jnsq\" (UniqueName: \"kubernetes.io/projected/d6700daa-2dac-4779-a463-6aea7ae0d54a-kube-api-access-5jnsq\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n7zlv\" (UID: \"d6700daa-2dac-4779-a463-6aea7ae0d54a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n7zlv" Dec 02 23:29:14 crc kubenswrapper[4903]: I1202 23:29:14.885030 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n7zlv" Dec 02 23:29:15 crc kubenswrapper[4903]: I1202 23:29:15.323506 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n7zlv"] Dec 02 23:29:15 crc kubenswrapper[4903]: W1202 23:29:15.329411 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6700daa_2dac_4779_a463_6aea7ae0d54a.slice/crio-aebc453f05e97caec363e7143eda0185ec1f4e3e70c8ef9f7c00926553955ab7 WatchSource:0}: Error finding container aebc453f05e97caec363e7143eda0185ec1f4e3e70c8ef9f7c00926553955ab7: Status 404 returned error can't find the container with id aebc453f05e97caec363e7143eda0185ec1f4e3e70c8ef9f7c00926553955ab7 Dec 02 23:29:15 crc kubenswrapper[4903]: I1202 23:29:15.332974 4903 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 23:29:15 crc kubenswrapper[4903]: I1202 23:29:15.450728 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n7zlv" event={"ID":"d6700daa-2dac-4779-a463-6aea7ae0d54a","Type":"ContainerStarted","Data":"aebc453f05e97caec363e7143eda0185ec1f4e3e70c8ef9f7c00926553955ab7"} Dec 02 23:29:16 crc kubenswrapper[4903]: I1202 23:29:16.466734 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n7zlv" event={"ID":"d6700daa-2dac-4779-a463-6aea7ae0d54a","Type":"ContainerStarted","Data":"d9ca801d5e663eb7cf6f197b774e0687932ebac81e507ebf77da4c43d862ade7"} Dec 02 23:29:16 crc kubenswrapper[4903]: I1202 23:29:16.506628 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n7zlv" podStartSLOduration=2.062139389 podStartE2EDuration="2.506603347s" podCreationTimestamp="2025-12-02 23:29:14 +0000 UTC" firstStartedPulling="2025-12-02 
23:29:15.332737754 +0000 UTC m=+1894.041292037" lastFinishedPulling="2025-12-02 23:29:15.777201712 +0000 UTC m=+1894.485755995" observedRunningTime="2025-12-02 23:29:16.488437729 +0000 UTC m=+1895.196992062" watchObservedRunningTime="2025-12-02 23:29:16.506603347 +0000 UTC m=+1895.215157640" Dec 02 23:29:18 crc kubenswrapper[4903]: I1202 23:29:18.700677 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s85mr"] Dec 02 23:29:18 crc kubenswrapper[4903]: I1202 23:29:18.704026 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s85mr" Dec 02 23:29:18 crc kubenswrapper[4903]: I1202 23:29:18.728143 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s85mr"] Dec 02 23:29:18 crc kubenswrapper[4903]: I1202 23:29:18.888645 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b37f872-d5b5-4f1c-811d-aeaedb638024-utilities\") pod \"certified-operators-s85mr\" (UID: \"3b37f872-d5b5-4f1c-811d-aeaedb638024\") " pod="openshift-marketplace/certified-operators-s85mr" Dec 02 23:29:18 crc kubenswrapper[4903]: I1202 23:29:18.888720 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b37f872-d5b5-4f1c-811d-aeaedb638024-catalog-content\") pod \"certified-operators-s85mr\" (UID: \"3b37f872-d5b5-4f1c-811d-aeaedb638024\") " pod="openshift-marketplace/certified-operators-s85mr" Dec 02 23:29:18 crc kubenswrapper[4903]: I1202 23:29:18.888825 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d79x8\" (UniqueName: \"kubernetes.io/projected/3b37f872-d5b5-4f1c-811d-aeaedb638024-kube-api-access-d79x8\") pod \"certified-operators-s85mr\" (UID: \"3b37f872-d5b5-4f1c-811d-aeaedb638024\") " pod="openshift-marketplace/certified-operators-s85mr" Dec 02 23:29:18 crc kubenswrapper[4903]: I1202 23:29:18.992022 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b37f872-d5b5-4f1c-811d-aeaedb638024-catalog-content\") pod \"certified-operators-s85mr\" (UID: \"3b37f872-d5b5-4f1c-811d-aeaedb638024\") " pod="openshift-marketplace/certified-operators-s85mr" Dec 02 23:29:18 crc kubenswrapper[4903]: I1202 23:29:18.992286 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d79x8\" (UniqueName: \"kubernetes.io/projected/3b37f872-d5b5-4f1c-811d-aeaedb638024-kube-api-access-d79x8\") pod \"certified-operators-s85mr\" (UID: \"3b37f872-d5b5-4f1c-811d-aeaedb638024\") " pod="openshift-marketplace/certified-operators-s85mr" Dec 02 23:29:18 crc kubenswrapper[4903]: I1202 23:29:18.992531 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b37f872-d5b5-4f1c-811d-aeaedb638024-catalog-content\") pod \"certified-operators-s85mr\" (UID: \"3b37f872-d5b5-4f1c-811d-aeaedb638024\") " pod="openshift-marketplace/certified-operators-s85mr" Dec 02 23:29:18 crc kubenswrapper[4903]: I1202 23:29:18.992545 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b37f872-d5b5-4f1c-811d-aeaedb638024-utilities\") pod \"certified-operators-s85mr\" 
(UID: \"3b37f872-d5b5-4f1c-811d-aeaedb638024\") " pod="openshift-marketplace/certified-operators-s85mr" Dec 02 23:29:18 crc kubenswrapper[4903]: I1202 23:29:18.992987 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b37f872-d5b5-4f1c-811d-aeaedb638024-utilities\") pod \"certified-operators-s85mr\" (UID: \"3b37f872-d5b5-4f1c-811d-aeaedb638024\") " pod="openshift-marketplace/certified-operators-s85mr" Dec 02 23:29:19 crc kubenswrapper[4903]: I1202 23:29:19.013319 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d79x8\" (UniqueName: \"kubernetes.io/projected/3b37f872-d5b5-4f1c-811d-aeaedb638024-kube-api-access-d79x8\") pod \"certified-operators-s85mr\" (UID: \"3b37f872-d5b5-4f1c-811d-aeaedb638024\") " pod="openshift-marketplace/certified-operators-s85mr" Dec 02 23:29:19 crc kubenswrapper[4903]: I1202 23:29:19.045583 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s85mr" Dec 02 23:29:19 crc kubenswrapper[4903]: I1202 23:29:19.571081 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s85mr"] Dec 02 23:29:19 crc kubenswrapper[4903]: W1202 23:29:19.576705 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b37f872_d5b5_4f1c_811d_aeaedb638024.slice/crio-75c0aef76bc3cf42866139bc82c3951b0f77c50d9e955211186a3e3f869c73ca WatchSource:0}: Error finding container 75c0aef76bc3cf42866139bc82c3951b0f77c50d9e955211186a3e3f869c73ca: Status 404 returned error can't find the container with id 75c0aef76bc3cf42866139bc82c3951b0f77c50d9e955211186a3e3f869c73ca Dec 02 23:29:20 crc kubenswrapper[4903]: I1202 23:29:20.514574 4903 generic.go:334] "Generic (PLEG): container finished" podID="3b37f872-d5b5-4f1c-811d-aeaedb638024" containerID="d5f62517d615963c81187aa90b01848609c4409a63f7a01bb8dc0df312ce77d6" exitCode=0 Dec 02 23:29:20 crc kubenswrapper[4903]: I1202 23:29:20.514730 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s85mr" event={"ID":"3b37f872-d5b5-4f1c-811d-aeaedb638024","Type":"ContainerDied","Data":"d5f62517d615963c81187aa90b01848609c4409a63f7a01bb8dc0df312ce77d6"} Dec 02 23:29:20 crc kubenswrapper[4903]: I1202 23:29:20.514953 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s85mr" event={"ID":"3b37f872-d5b5-4f1c-811d-aeaedb638024","Type":"ContainerStarted","Data":"75c0aef76bc3cf42866139bc82c3951b0f77c50d9e955211186a3e3f869c73ca"} Dec 02 23:29:21 crc kubenswrapper[4903]: I1202 23:29:21.527788 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s85mr" event={"ID":"3b37f872-d5b5-4f1c-811d-aeaedb638024","Type":"ContainerStarted","Data":"32b7327f8c58c018faf42b51affecba3459ca53b5a9843a55630757c1ce34ad2"} Dec 02 23:29:23 crc kubenswrapper[4903]: I1202 23:29:23.052956 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-lltzt"] Dec 02 23:29:23 crc kubenswrapper[4903]: I1202 23:29:23.068919 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-lltzt"] Dec 02 23:29:23 crc kubenswrapper[4903]: I1202 23:29:23.549875 4903 generic.go:334] "Generic (PLEG): container finished" podID="3b37f872-d5b5-4f1c-811d-aeaedb638024" 
containerID="32b7327f8c58c018faf42b51affecba3459ca53b5a9843a55630757c1ce34ad2" exitCode=0 Dec 02 23:29:23 crc kubenswrapper[4903]: I1202 23:29:23.549937 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s85mr" event={"ID":"3b37f872-d5b5-4f1c-811d-aeaedb638024","Type":"ContainerDied","Data":"32b7327f8c58c018faf42b51affecba3459ca53b5a9843a55630757c1ce34ad2"} Dec 02 23:29:23 crc kubenswrapper[4903]: I1202 23:29:23.633609 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2607960d-5ee6-4c49-9c3c-3b8083b4bb9e" path="/var/lib/kubelet/pods/2607960d-5ee6-4c49-9c3c-3b8083b4bb9e/volumes" Dec 02 23:29:24 crc kubenswrapper[4903]: I1202 23:29:24.033899 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-t5jc4"] Dec 02 23:29:24 crc kubenswrapper[4903]: I1202 23:29:24.046196 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-t5jc4"] Dec 02 23:29:24 crc kubenswrapper[4903]: I1202 23:29:24.560795 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s85mr" event={"ID":"3b37f872-d5b5-4f1c-811d-aeaedb638024","Type":"ContainerStarted","Data":"71a4043c01c0819e4d25ece8e9c4a823f422cfe320424827f05a981c7dc39a51"} Dec 02 23:29:24 crc kubenswrapper[4903]: I1202 23:29:24.582445 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s85mr" podStartSLOduration=3.094823629 podStartE2EDuration="6.582420816s" podCreationTimestamp="2025-12-02 23:29:18 +0000 UTC" firstStartedPulling="2025-12-02 23:29:20.516763317 +0000 UTC m=+1899.225317600" lastFinishedPulling="2025-12-02 23:29:24.004360504 +0000 UTC m=+1902.712914787" observedRunningTime="2025-12-02 23:29:24.579994047 +0000 UTC m=+1903.288548330" watchObservedRunningTime="2025-12-02 23:29:24.582420816 +0000 UTC m=+1903.290975119" Dec 02 23:29:25 crc kubenswrapper[4903]: I1202 23:29:25.624301 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7768c2b-8cda-4ff7-b845-d2762445cb9e" path="/var/lib/kubelet/pods/b7768c2b-8cda-4ff7-b845-d2762445cb9e/volumes" Dec 02 23:29:29 crc kubenswrapper[4903]: I1202 23:29:29.046602 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s85mr" Dec 02 23:29:29 crc kubenswrapper[4903]: I1202 23:29:29.046932 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s85mr" Dec 02 23:29:29 crc kubenswrapper[4903]: I1202 23:29:29.133248 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s85mr" Dec 02 23:29:29 crc kubenswrapper[4903]: I1202 23:29:29.711105 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s85mr" Dec 02 23:29:32 crc kubenswrapper[4903]: I1202 23:29:32.695033 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s85mr"] Dec 02 23:29:32 crc kubenswrapper[4903]: I1202 23:29:32.696212 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s85mr" podUID="3b37f872-d5b5-4f1c-811d-aeaedb638024" containerName="registry-server" containerID="cri-o://71a4043c01c0819e4d25ece8e9c4a823f422cfe320424827f05a981c7dc39a51" gracePeriod=2 Dec 02 23:29:33 crc 
kubenswrapper[4903]: I1202 23:29:33.155550 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s85mr" Dec 02 23:29:33 crc kubenswrapper[4903]: I1202 23:29:33.307491 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d79x8\" (UniqueName: \"kubernetes.io/projected/3b37f872-d5b5-4f1c-811d-aeaedb638024-kube-api-access-d79x8\") pod \"3b37f872-d5b5-4f1c-811d-aeaedb638024\" (UID: \"3b37f872-d5b5-4f1c-811d-aeaedb638024\") " Dec 02 23:29:33 crc kubenswrapper[4903]: I1202 23:29:33.307766 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b37f872-d5b5-4f1c-811d-aeaedb638024-utilities\") pod \"3b37f872-d5b5-4f1c-811d-aeaedb638024\" (UID: \"3b37f872-d5b5-4f1c-811d-aeaedb638024\") " Dec 02 23:29:33 crc kubenswrapper[4903]: I1202 23:29:33.307825 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b37f872-d5b5-4f1c-811d-aeaedb638024-catalog-content\") pod \"3b37f872-d5b5-4f1c-811d-aeaedb638024\" (UID: \"3b37f872-d5b5-4f1c-811d-aeaedb638024\") " Dec 02 23:29:33 crc kubenswrapper[4903]: I1202 23:29:33.308857 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b37f872-d5b5-4f1c-811d-aeaedb638024-utilities" (OuterVolumeSpecName: "utilities") pod "3b37f872-d5b5-4f1c-811d-aeaedb638024" (UID: "3b37f872-d5b5-4f1c-811d-aeaedb638024"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:29:33 crc kubenswrapper[4903]: I1202 23:29:33.315463 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b37f872-d5b5-4f1c-811d-aeaedb638024-kube-api-access-d79x8" (OuterVolumeSpecName: "kube-api-access-d79x8") pod "3b37f872-d5b5-4f1c-811d-aeaedb638024" (UID: "3b37f872-d5b5-4f1c-811d-aeaedb638024"). InnerVolumeSpecName "kube-api-access-d79x8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:29:33 crc kubenswrapper[4903]: I1202 23:29:33.366471 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b37f872-d5b5-4f1c-811d-aeaedb638024-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3b37f872-d5b5-4f1c-811d-aeaedb638024" (UID: "3b37f872-d5b5-4f1c-811d-aeaedb638024"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:29:33 crc kubenswrapper[4903]: I1202 23:29:33.412146 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b37f872-d5b5-4f1c-811d-aeaedb638024-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:29:33 crc kubenswrapper[4903]: I1202 23:29:33.412232 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b37f872-d5b5-4f1c-811d-aeaedb638024-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:29:33 crc kubenswrapper[4903]: I1202 23:29:33.412256 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d79x8\" (UniqueName: \"kubernetes.io/projected/3b37f872-d5b5-4f1c-811d-aeaedb638024-kube-api-access-d79x8\") on node \"crc\" DevicePath \"\"" Dec 02 23:29:33 crc kubenswrapper[4903]: I1202 23:29:33.700369 4903 generic.go:334] "Generic (PLEG): container finished" podID="3b37f872-d5b5-4f1c-811d-aeaedb638024" containerID="71a4043c01c0819e4d25ece8e9c4a823f422cfe320424827f05a981c7dc39a51" exitCode=0 Dec 02 23:29:33 crc kubenswrapper[4903]: I1202 23:29:33.700417 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s85mr" event={"ID":"3b37f872-d5b5-4f1c-811d-aeaedb638024","Type":"ContainerDied","Data":"71a4043c01c0819e4d25ece8e9c4a823f422cfe320424827f05a981c7dc39a51"} Dec 02 23:29:33 crc kubenswrapper[4903]: I1202 23:29:33.700429 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s85mr" Dec 02 23:29:33 crc kubenswrapper[4903]: I1202 23:29:33.700453 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s85mr" event={"ID":"3b37f872-d5b5-4f1c-811d-aeaedb638024","Type":"ContainerDied","Data":"75c0aef76bc3cf42866139bc82c3951b0f77c50d9e955211186a3e3f869c73ca"} Dec 02 23:29:33 crc kubenswrapper[4903]: I1202 23:29:33.700479 4903 scope.go:117] "RemoveContainer" containerID="71a4043c01c0819e4d25ece8e9c4a823f422cfe320424827f05a981c7dc39a51" Dec 02 23:29:33 crc kubenswrapper[4903]: I1202 23:29:33.730754 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s85mr"] Dec 02 23:29:33 crc kubenswrapper[4903]: I1202 23:29:33.739058 4903 scope.go:117] "RemoveContainer" containerID="32b7327f8c58c018faf42b51affecba3459ca53b5a9843a55630757c1ce34ad2" Dec 02 23:29:33 crc kubenswrapper[4903]: I1202 23:29:33.740767 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s85mr"] Dec 02 23:29:33 crc kubenswrapper[4903]: I1202 23:29:33.767840 4903 scope.go:117] "RemoveContainer" containerID="d5f62517d615963c81187aa90b01848609c4409a63f7a01bb8dc0df312ce77d6" Dec 02 23:29:33 crc kubenswrapper[4903]: I1202 23:29:33.837002 4903 scope.go:117] "RemoveContainer" containerID="71a4043c01c0819e4d25ece8e9c4a823f422cfe320424827f05a981c7dc39a51" Dec 02 23:29:33 crc kubenswrapper[4903]: E1202 23:29:33.837867 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71a4043c01c0819e4d25ece8e9c4a823f422cfe320424827f05a981c7dc39a51\": container with ID starting with 71a4043c01c0819e4d25ece8e9c4a823f422cfe320424827f05a981c7dc39a51 not found: ID does not exist" containerID="71a4043c01c0819e4d25ece8e9c4a823f422cfe320424827f05a981c7dc39a51" Dec 02 23:29:33 crc kubenswrapper[4903]: I1202 23:29:33.837914 
4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71a4043c01c0819e4d25ece8e9c4a823f422cfe320424827f05a981c7dc39a51"} err="failed to get container status \"71a4043c01c0819e4d25ece8e9c4a823f422cfe320424827f05a981c7dc39a51\": rpc error: code = NotFound desc = could not find container \"71a4043c01c0819e4d25ece8e9c4a823f422cfe320424827f05a981c7dc39a51\": container with ID starting with 71a4043c01c0819e4d25ece8e9c4a823f422cfe320424827f05a981c7dc39a51 not found: ID does not exist" Dec 02 23:29:33 crc kubenswrapper[4903]: I1202 23:29:33.837945 4903 scope.go:117] "RemoveContainer" containerID="32b7327f8c58c018faf42b51affecba3459ca53b5a9843a55630757c1ce34ad2" Dec 02 23:29:33 crc kubenswrapper[4903]: E1202 23:29:33.838720 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32b7327f8c58c018faf42b51affecba3459ca53b5a9843a55630757c1ce34ad2\": container with ID starting with 32b7327f8c58c018faf42b51affecba3459ca53b5a9843a55630757c1ce34ad2 not found: ID does not exist" containerID="32b7327f8c58c018faf42b51affecba3459ca53b5a9843a55630757c1ce34ad2" Dec 02 23:29:33 crc kubenswrapper[4903]: I1202 23:29:33.838880 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32b7327f8c58c018faf42b51affecba3459ca53b5a9843a55630757c1ce34ad2"} err="failed to get container status \"32b7327f8c58c018faf42b51affecba3459ca53b5a9843a55630757c1ce34ad2\": rpc error: code = NotFound desc = could not find container \"32b7327f8c58c018faf42b51affecba3459ca53b5a9843a55630757c1ce34ad2\": container with ID starting with 32b7327f8c58c018faf42b51affecba3459ca53b5a9843a55630757c1ce34ad2 not found: ID does not exist" Dec 02 23:29:33 crc kubenswrapper[4903]: I1202 23:29:33.838957 4903 scope.go:117] "RemoveContainer" containerID="d5f62517d615963c81187aa90b01848609c4409a63f7a01bb8dc0df312ce77d6" Dec 02 23:29:33 crc kubenswrapper[4903]: E1202 23:29:33.839387 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5f62517d615963c81187aa90b01848609c4409a63f7a01bb8dc0df312ce77d6\": container with ID starting with d5f62517d615963c81187aa90b01848609c4409a63f7a01bb8dc0df312ce77d6 not found: ID does not exist" containerID="d5f62517d615963c81187aa90b01848609c4409a63f7a01bb8dc0df312ce77d6" Dec 02 23:29:33 crc kubenswrapper[4903]: I1202 23:29:33.839534 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5f62517d615963c81187aa90b01848609c4409a63f7a01bb8dc0df312ce77d6"} err="failed to get container status \"d5f62517d615963c81187aa90b01848609c4409a63f7a01bb8dc0df312ce77d6\": rpc error: code = NotFound desc = could not find container \"d5f62517d615963c81187aa90b01848609c4409a63f7a01bb8dc0df312ce77d6\": container with ID starting with d5f62517d615963c81187aa90b01848609c4409a63f7a01bb8dc0df312ce77d6 not found: ID does not exist" Dec 02 23:29:35 crc kubenswrapper[4903]: I1202 23:29:35.633362 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b37f872-d5b5-4f1c-811d-aeaedb638024" path="/var/lib/kubelet/pods/3b37f872-d5b5-4f1c-811d-aeaedb638024/volumes" Dec 02 23:30:00 crc kubenswrapper[4903]: I1202 23:30:00.154405 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411970-crlbb"] Dec 02 23:30:00 crc kubenswrapper[4903]: E1202 23:30:00.155490 4903 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="3b37f872-d5b5-4f1c-811d-aeaedb638024" containerName="registry-server" Dec 02 23:30:00 crc kubenswrapper[4903]: I1202 23:30:00.155507 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b37f872-d5b5-4f1c-811d-aeaedb638024" containerName="registry-server" Dec 02 23:30:00 crc kubenswrapper[4903]: E1202 23:30:00.155543 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b37f872-d5b5-4f1c-811d-aeaedb638024" containerName="extract-content" Dec 02 23:30:00 crc kubenswrapper[4903]: I1202 23:30:00.155551 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b37f872-d5b5-4f1c-811d-aeaedb638024" containerName="extract-content" Dec 02 23:30:00 crc kubenswrapper[4903]: E1202 23:30:00.155570 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b37f872-d5b5-4f1c-811d-aeaedb638024" containerName="extract-utilities" Dec 02 23:30:00 crc kubenswrapper[4903]: I1202 23:30:00.155578 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b37f872-d5b5-4f1c-811d-aeaedb638024" containerName="extract-utilities" Dec 02 23:30:00 crc kubenswrapper[4903]: I1202 23:30:00.155864 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b37f872-d5b5-4f1c-811d-aeaedb638024" containerName="registry-server" Dec 02 23:30:00 crc kubenswrapper[4903]: I1202 23:30:00.156751 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411970-crlbb" Dec 02 23:30:00 crc kubenswrapper[4903]: I1202 23:30:00.159048 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 23:30:00 crc kubenswrapper[4903]: I1202 23:30:00.162336 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411970-crlbb"] Dec 02 23:30:00 crc kubenswrapper[4903]: I1202 23:30:00.164299 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 23:30:00 crc kubenswrapper[4903]: I1202 23:30:00.287478 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptdzd\" (UniqueName: \"kubernetes.io/projected/3cb6ab4e-3770-47d1-8796-6a58ea453293-kube-api-access-ptdzd\") pod \"collect-profiles-29411970-crlbb\" (UID: \"3cb6ab4e-3770-47d1-8796-6a58ea453293\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411970-crlbb" Dec 02 23:30:00 crc kubenswrapper[4903]: I1202 23:30:00.287578 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3cb6ab4e-3770-47d1-8796-6a58ea453293-secret-volume\") pod \"collect-profiles-29411970-crlbb\" (UID: \"3cb6ab4e-3770-47d1-8796-6a58ea453293\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411970-crlbb" Dec 02 23:30:00 crc kubenswrapper[4903]: I1202 23:30:00.287752 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3cb6ab4e-3770-47d1-8796-6a58ea453293-config-volume\") pod \"collect-profiles-29411970-crlbb\" (UID: \"3cb6ab4e-3770-47d1-8796-6a58ea453293\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411970-crlbb" Dec 02 23:30:00 crc kubenswrapper[4903]: I1202 23:30:00.389882 4903 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3cb6ab4e-3770-47d1-8796-6a58ea453293-config-volume\") pod \"collect-profiles-29411970-crlbb\" (UID: \"3cb6ab4e-3770-47d1-8796-6a58ea453293\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411970-crlbb" Dec 02 23:30:00 crc kubenswrapper[4903]: I1202 23:30:00.389950 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptdzd\" (UniqueName: \"kubernetes.io/projected/3cb6ab4e-3770-47d1-8796-6a58ea453293-kube-api-access-ptdzd\") pod \"collect-profiles-29411970-crlbb\" (UID: \"3cb6ab4e-3770-47d1-8796-6a58ea453293\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411970-crlbb" Dec 02 23:30:00 crc kubenswrapper[4903]: I1202 23:30:00.389998 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3cb6ab4e-3770-47d1-8796-6a58ea453293-secret-volume\") pod \"collect-profiles-29411970-crlbb\" (UID: \"3cb6ab4e-3770-47d1-8796-6a58ea453293\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411970-crlbb" Dec 02 23:30:00 crc kubenswrapper[4903]: I1202 23:30:00.390894 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3cb6ab4e-3770-47d1-8796-6a58ea453293-config-volume\") pod \"collect-profiles-29411970-crlbb\" (UID: \"3cb6ab4e-3770-47d1-8796-6a58ea453293\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411970-crlbb" Dec 02 23:30:00 crc kubenswrapper[4903]: I1202 23:30:00.399517 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3cb6ab4e-3770-47d1-8796-6a58ea453293-secret-volume\") pod \"collect-profiles-29411970-crlbb\" (UID: \"3cb6ab4e-3770-47d1-8796-6a58ea453293\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411970-crlbb" Dec 02 23:30:00 crc kubenswrapper[4903]: I1202 23:30:00.408555 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptdzd\" (UniqueName: \"kubernetes.io/projected/3cb6ab4e-3770-47d1-8796-6a58ea453293-kube-api-access-ptdzd\") pod \"collect-profiles-29411970-crlbb\" (UID: \"3cb6ab4e-3770-47d1-8796-6a58ea453293\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411970-crlbb" Dec 02 23:30:00 crc kubenswrapper[4903]: I1202 23:30:00.483286 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411970-crlbb" Dec 02 23:30:00 crc kubenswrapper[4903]: I1202 23:30:00.996712 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411970-crlbb"] Dec 02 23:30:01 crc kubenswrapper[4903]: W1202 23:30:01.002102 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cb6ab4e_3770_47d1_8796_6a58ea453293.slice/crio-379f8875b347864997b01635aa7e3820e5e9b127ef0e127bccf37c534715fa6a WatchSource:0}: Error finding container 379f8875b347864997b01635aa7e3820e5e9b127ef0e127bccf37c534715fa6a: Status 404 returned error can't find the container with id 379f8875b347864997b01635aa7e3820e5e9b127ef0e127bccf37c534715fa6a Dec 02 23:30:02 crc kubenswrapper[4903]: I1202 23:30:02.011847 4903 generic.go:334] "Generic (PLEG): container finished" podID="3cb6ab4e-3770-47d1-8796-6a58ea453293" containerID="631ec00f5442f7c8ac19f024c1902e27f55e9ddc055a4c772966fd86d18c3f1f" exitCode=0 Dec 02 23:30:02 crc kubenswrapper[4903]: I1202 23:30:02.011942 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411970-crlbb" event={"ID":"3cb6ab4e-3770-47d1-8796-6a58ea453293","Type":"ContainerDied","Data":"631ec00f5442f7c8ac19f024c1902e27f55e9ddc055a4c772966fd86d18c3f1f"} Dec 02 23:30:02 crc kubenswrapper[4903]: I1202 23:30:02.011996 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411970-crlbb" event={"ID":"3cb6ab4e-3770-47d1-8796-6a58ea453293","Type":"ContainerStarted","Data":"379f8875b347864997b01635aa7e3820e5e9b127ef0e127bccf37c534715fa6a"} Dec 02 23:30:03 crc kubenswrapper[4903]: I1202 23:30:03.437851 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411970-crlbb" Dec 02 23:30:03 crc kubenswrapper[4903]: I1202 23:30:03.476374 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptdzd\" (UniqueName: \"kubernetes.io/projected/3cb6ab4e-3770-47d1-8796-6a58ea453293-kube-api-access-ptdzd\") pod \"3cb6ab4e-3770-47d1-8796-6a58ea453293\" (UID: \"3cb6ab4e-3770-47d1-8796-6a58ea453293\") " Dec 02 23:30:03 crc kubenswrapper[4903]: I1202 23:30:03.476460 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3cb6ab4e-3770-47d1-8796-6a58ea453293-secret-volume\") pod \"3cb6ab4e-3770-47d1-8796-6a58ea453293\" (UID: \"3cb6ab4e-3770-47d1-8796-6a58ea453293\") " Dec 02 23:30:03 crc kubenswrapper[4903]: I1202 23:30:03.476505 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3cb6ab4e-3770-47d1-8796-6a58ea453293-config-volume\") pod \"3cb6ab4e-3770-47d1-8796-6a58ea453293\" (UID: \"3cb6ab4e-3770-47d1-8796-6a58ea453293\") " Dec 02 23:30:03 crc kubenswrapper[4903]: I1202 23:30:03.477242 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb6ab4e-3770-47d1-8796-6a58ea453293-config-volume" (OuterVolumeSpecName: "config-volume") pod "3cb6ab4e-3770-47d1-8796-6a58ea453293" (UID: "3cb6ab4e-3770-47d1-8796-6a58ea453293"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:30:03 crc kubenswrapper[4903]: I1202 23:30:03.482642 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cb6ab4e-3770-47d1-8796-6a58ea453293-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3cb6ab4e-3770-47d1-8796-6a58ea453293" (UID: "3cb6ab4e-3770-47d1-8796-6a58ea453293"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:30:03 crc kubenswrapper[4903]: I1202 23:30:03.483283 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb6ab4e-3770-47d1-8796-6a58ea453293-kube-api-access-ptdzd" (OuterVolumeSpecName: "kube-api-access-ptdzd") pod "3cb6ab4e-3770-47d1-8796-6a58ea453293" (UID: "3cb6ab4e-3770-47d1-8796-6a58ea453293"). InnerVolumeSpecName "kube-api-access-ptdzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:30:03 crc kubenswrapper[4903]: I1202 23:30:03.578264 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptdzd\" (UniqueName: \"kubernetes.io/projected/3cb6ab4e-3770-47d1-8796-6a58ea453293-kube-api-access-ptdzd\") on node \"crc\" DevicePath \"\"" Dec 02 23:30:03 crc kubenswrapper[4903]: I1202 23:30:03.578296 4903 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3cb6ab4e-3770-47d1-8796-6a58ea453293-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 23:30:03 crc kubenswrapper[4903]: I1202 23:30:03.578307 4903 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3cb6ab4e-3770-47d1-8796-6a58ea453293-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 23:30:04 crc kubenswrapper[4903]: I1202 23:30:04.024422 4903 scope.go:117] "RemoveContainer" containerID="0430dcc5aa8a1e41c2dfa5b2c7b415e87fce339154a8388b1654d601156c82f7" Dec 02 23:30:04 crc kubenswrapper[4903]: I1202 23:30:04.033034 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411970-crlbb" event={"ID":"3cb6ab4e-3770-47d1-8796-6a58ea453293","Type":"ContainerDied","Data":"379f8875b347864997b01635aa7e3820e5e9b127ef0e127bccf37c534715fa6a"} Dec 02 23:30:04 crc kubenswrapper[4903]: I1202 23:30:04.033100 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="379f8875b347864997b01635aa7e3820e5e9b127ef0e127bccf37c534715fa6a" Dec 02 23:30:04 crc kubenswrapper[4903]: I1202 23:30:04.033068 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411970-crlbb" Dec 02 23:30:04 crc kubenswrapper[4903]: I1202 23:30:04.075456 4903 scope.go:117] "RemoveContainer" containerID="af32ab0adc21ad36d3f3f69152af8e2bd8d59cb9db0dc0caff8d04b157f830b4" Dec 02 23:30:06 crc kubenswrapper[4903]: I1202 23:30:06.043137 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-prs2t"] Dec 02 23:30:06 crc kubenswrapper[4903]: I1202 23:30:06.056757 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-prs2t"] Dec 02 23:30:07 crc kubenswrapper[4903]: I1202 23:30:07.628987 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3ad3f2c-68c0-426e-94b6-999ea0629dcd" path="/var/lib/kubelet/pods/b3ad3f2c-68c0-426e-94b6-999ea0629dcd/volumes" Dec 02 23:30:16 crc kubenswrapper[4903]: I1202 23:30:16.155958 4903 generic.go:334] "Generic (PLEG): container finished" podID="d6700daa-2dac-4779-a463-6aea7ae0d54a" containerID="d9ca801d5e663eb7cf6f197b774e0687932ebac81e507ebf77da4c43d862ade7" exitCode=0 Dec 02 23:30:16 crc kubenswrapper[4903]: I1202 23:30:16.156082 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n7zlv" event={"ID":"d6700daa-2dac-4779-a463-6aea7ae0d54a","Type":"ContainerDied","Data":"d9ca801d5e663eb7cf6f197b774e0687932ebac81e507ebf77da4c43d862ade7"} Dec 02 23:30:17 crc kubenswrapper[4903]: I1202 23:30:17.603851 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n7zlv" Dec 02 23:30:17 crc kubenswrapper[4903]: I1202 23:30:17.775011 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6700daa-2dac-4779-a463-6aea7ae0d54a-inventory\") pod \"d6700daa-2dac-4779-a463-6aea7ae0d54a\" (UID: \"d6700daa-2dac-4779-a463-6aea7ae0d54a\") " Dec 02 23:30:17 crc kubenswrapper[4903]: I1202 23:30:17.775434 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d6700daa-2dac-4779-a463-6aea7ae0d54a-ssh-key\") pod \"d6700daa-2dac-4779-a463-6aea7ae0d54a\" (UID: \"d6700daa-2dac-4779-a463-6aea7ae0d54a\") " Dec 02 23:30:17 crc kubenswrapper[4903]: I1202 23:30:17.775581 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jnsq\" (UniqueName: \"kubernetes.io/projected/d6700daa-2dac-4779-a463-6aea7ae0d54a-kube-api-access-5jnsq\") pod \"d6700daa-2dac-4779-a463-6aea7ae0d54a\" (UID: \"d6700daa-2dac-4779-a463-6aea7ae0d54a\") " Dec 02 23:30:17 crc kubenswrapper[4903]: I1202 23:30:17.782402 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6700daa-2dac-4779-a463-6aea7ae0d54a-kube-api-access-5jnsq" (OuterVolumeSpecName: "kube-api-access-5jnsq") pod "d6700daa-2dac-4779-a463-6aea7ae0d54a" (UID: "d6700daa-2dac-4779-a463-6aea7ae0d54a"). InnerVolumeSpecName "kube-api-access-5jnsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:30:17 crc kubenswrapper[4903]: I1202 23:30:17.804024 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6700daa-2dac-4779-a463-6aea7ae0d54a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d6700daa-2dac-4779-a463-6aea7ae0d54a" (UID: "d6700daa-2dac-4779-a463-6aea7ae0d54a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:30:17 crc kubenswrapper[4903]: I1202 23:30:17.831279 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6700daa-2dac-4779-a463-6aea7ae0d54a-inventory" (OuterVolumeSpecName: "inventory") pod "d6700daa-2dac-4779-a463-6aea7ae0d54a" (UID: "d6700daa-2dac-4779-a463-6aea7ae0d54a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:30:17 crc kubenswrapper[4903]: I1202 23:30:17.878165 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jnsq\" (UniqueName: \"kubernetes.io/projected/d6700daa-2dac-4779-a463-6aea7ae0d54a-kube-api-access-5jnsq\") on node \"crc\" DevicePath \"\"" Dec 02 23:30:17 crc kubenswrapper[4903]: I1202 23:30:17.878221 4903 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6700daa-2dac-4779-a463-6aea7ae0d54a-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 23:30:17 crc kubenswrapper[4903]: I1202 23:30:17.878234 4903 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d6700daa-2dac-4779-a463-6aea7ae0d54a-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 23:30:18 crc kubenswrapper[4903]: I1202 23:30:18.176467 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n7zlv" event={"ID":"d6700daa-2dac-4779-a463-6aea7ae0d54a","Type":"ContainerDied","Data":"aebc453f05e97caec363e7143eda0185ec1f4e3e70c8ef9f7c00926553955ab7"} Dec 02 23:30:18 crc kubenswrapper[4903]: I1202 23:30:18.176542 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aebc453f05e97caec363e7143eda0185ec1f4e3e70c8ef9f7c00926553955ab7" Dec 02 23:30:18 crc kubenswrapper[4903]: I1202 23:30:18.176491 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n7zlv" Dec 02 23:30:18 crc kubenswrapper[4903]: I1202 23:30:18.296747 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-5s77w"] Dec 02 23:30:18 crc kubenswrapper[4903]: E1202 23:30:18.297464 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cb6ab4e-3770-47d1-8796-6a58ea453293" containerName="collect-profiles" Dec 02 23:30:18 crc kubenswrapper[4903]: I1202 23:30:18.297494 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cb6ab4e-3770-47d1-8796-6a58ea453293" containerName="collect-profiles" Dec 02 23:30:18 crc kubenswrapper[4903]: E1202 23:30:18.297515 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6700daa-2dac-4779-a463-6aea7ae0d54a" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 02 23:30:18 crc kubenswrapper[4903]: I1202 23:30:18.297530 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6700daa-2dac-4779-a463-6aea7ae0d54a" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 02 23:30:18 crc kubenswrapper[4903]: I1202 23:30:18.297957 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cb6ab4e-3770-47d1-8796-6a58ea453293" containerName="collect-profiles" Dec 02 23:30:18 crc kubenswrapper[4903]: I1202 23:30:18.298005 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6700daa-2dac-4779-a463-6aea7ae0d54a" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 02 23:30:18 crc kubenswrapper[4903]: I1202 23:30:18.299120 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-5s77w" Dec 02 23:30:18 crc kubenswrapper[4903]: I1202 23:30:18.301251 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 23:30:18 crc kubenswrapper[4903]: I1202 23:30:18.302101 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 23:30:18 crc kubenswrapper[4903]: I1202 23:30:18.302234 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 23:30:18 crc kubenswrapper[4903]: I1202 23:30:18.302349 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9z6rx" Dec 02 23:30:18 crc kubenswrapper[4903]: I1202 23:30:18.341944 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-5s77w"] Dec 02 23:30:18 crc kubenswrapper[4903]: I1202 23:30:18.390437 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/70d917fc-dbd8-499d-bcae-b5f324de77cb-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-5s77w\" (UID: \"70d917fc-dbd8-499d-bcae-b5f324de77cb\") " pod="openstack/ssh-known-hosts-edpm-deployment-5s77w" Dec 02 23:30:18 crc kubenswrapper[4903]: I1202 23:30:18.390909 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/70d917fc-dbd8-499d-bcae-b5f324de77cb-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-5s77w\" (UID: \"70d917fc-dbd8-499d-bcae-b5f324de77cb\") " pod="openstack/ssh-known-hosts-edpm-deployment-5s77w" Dec 02 23:30:18 crc 
kubenswrapper[4903]: I1202 23:30:18.391081 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnw6w\" (UniqueName: \"kubernetes.io/projected/70d917fc-dbd8-499d-bcae-b5f324de77cb-kube-api-access-bnw6w\") pod \"ssh-known-hosts-edpm-deployment-5s77w\" (UID: \"70d917fc-dbd8-499d-bcae-b5f324de77cb\") " pod="openstack/ssh-known-hosts-edpm-deployment-5s77w" Dec 02 23:30:18 crc kubenswrapper[4903]: I1202 23:30:18.492833 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/70d917fc-dbd8-499d-bcae-b5f324de77cb-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-5s77w\" (UID: \"70d917fc-dbd8-499d-bcae-b5f324de77cb\") " pod="openstack/ssh-known-hosts-edpm-deployment-5s77w" Dec 02 23:30:18 crc kubenswrapper[4903]: I1202 23:30:18.492963 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/70d917fc-dbd8-499d-bcae-b5f324de77cb-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-5s77w\" (UID: \"70d917fc-dbd8-499d-bcae-b5f324de77cb\") " pod="openstack/ssh-known-hosts-edpm-deployment-5s77w" Dec 02 23:30:18 crc kubenswrapper[4903]: I1202 23:30:18.492989 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnw6w\" (UniqueName: \"kubernetes.io/projected/70d917fc-dbd8-499d-bcae-b5f324de77cb-kube-api-access-bnw6w\") pod \"ssh-known-hosts-edpm-deployment-5s77w\" (UID: \"70d917fc-dbd8-499d-bcae-b5f324de77cb\") " pod="openstack/ssh-known-hosts-edpm-deployment-5s77w" Dec 02 23:30:18 crc kubenswrapper[4903]: I1202 23:30:18.499151 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/70d917fc-dbd8-499d-bcae-b5f324de77cb-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-5s77w\" (UID: \"70d917fc-dbd8-499d-bcae-b5f324de77cb\") " pod="openstack/ssh-known-hosts-edpm-deployment-5s77w" Dec 02 23:30:18 crc kubenswrapper[4903]: I1202 23:30:18.501047 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/70d917fc-dbd8-499d-bcae-b5f324de77cb-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-5s77w\" (UID: \"70d917fc-dbd8-499d-bcae-b5f324de77cb\") " pod="openstack/ssh-known-hosts-edpm-deployment-5s77w" Dec 02 23:30:18 crc kubenswrapper[4903]: I1202 23:30:18.510314 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnw6w\" (UniqueName: \"kubernetes.io/projected/70d917fc-dbd8-499d-bcae-b5f324de77cb-kube-api-access-bnw6w\") pod \"ssh-known-hosts-edpm-deployment-5s77w\" (UID: \"70d917fc-dbd8-499d-bcae-b5f324de77cb\") " pod="openstack/ssh-known-hosts-edpm-deployment-5s77w" Dec 02 23:30:18 crc kubenswrapper[4903]: I1202 23:30:18.634594 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-5s77w" Dec 02 23:30:19 crc kubenswrapper[4903]: I1202 23:30:19.280375 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-5s77w"] Dec 02 23:30:20 crc kubenswrapper[4903]: I1202 23:30:20.200620 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-5s77w" event={"ID":"70d917fc-dbd8-499d-bcae-b5f324de77cb","Type":"ContainerStarted","Data":"54664eb520b45796df686f988dcdb6c4370e05eae70cfaa7c274a33da85ba439"} Dec 02 23:30:20 crc kubenswrapper[4903]: I1202 23:30:20.200977 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-5s77w" event={"ID":"70d917fc-dbd8-499d-bcae-b5f324de77cb","Type":"ContainerStarted","Data":"3c048b2930fec275d726726d2dd12e4f589d80e7e19484a3a6a7e68cb94fdd13"} Dec 02 23:30:20 crc kubenswrapper[4903]: I1202 23:30:20.227925 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-5s77w" podStartSLOduration=1.704186091 podStartE2EDuration="2.22785516s" podCreationTimestamp="2025-12-02 23:30:18 +0000 UTC" firstStartedPulling="2025-12-02 23:30:19.296570683 +0000 UTC m=+1958.005124976" lastFinishedPulling="2025-12-02 23:30:19.820239762 +0000 UTC m=+1958.528794045" observedRunningTime="2025-12-02 23:30:20.223491905 +0000 UTC m=+1958.932046218" watchObservedRunningTime="2025-12-02 23:30:20.22785516 +0000 UTC m=+1958.936409483" Dec 02 23:30:28 crc kubenswrapper[4903]: I1202 23:30:28.290173 4903 generic.go:334] "Generic (PLEG): container finished" podID="70d917fc-dbd8-499d-bcae-b5f324de77cb" containerID="54664eb520b45796df686f988dcdb6c4370e05eae70cfaa7c274a33da85ba439" exitCode=0 Dec 02 23:30:28 crc kubenswrapper[4903]: I1202 23:30:28.290315 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-5s77w" event={"ID":"70d917fc-dbd8-499d-bcae-b5f324de77cb","Type":"ContainerDied","Data":"54664eb520b45796df686f988dcdb6c4370e05eae70cfaa7c274a33da85ba439"} Dec 02 23:30:29 crc kubenswrapper[4903]: I1202 23:30:29.818278 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-5s77w" Dec 02 23:30:29 crc kubenswrapper[4903]: I1202 23:30:29.929127 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/70d917fc-dbd8-499d-bcae-b5f324de77cb-inventory-0\") pod \"70d917fc-dbd8-499d-bcae-b5f324de77cb\" (UID: \"70d917fc-dbd8-499d-bcae-b5f324de77cb\") " Dec 02 23:30:29 crc kubenswrapper[4903]: I1202 23:30:29.929223 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/70d917fc-dbd8-499d-bcae-b5f324de77cb-ssh-key-openstack-edpm-ipam\") pod \"70d917fc-dbd8-499d-bcae-b5f324de77cb\" (UID: \"70d917fc-dbd8-499d-bcae-b5f324de77cb\") " Dec 02 23:30:29 crc kubenswrapper[4903]: I1202 23:30:29.929340 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnw6w\" (UniqueName: \"kubernetes.io/projected/70d917fc-dbd8-499d-bcae-b5f324de77cb-kube-api-access-bnw6w\") pod \"70d917fc-dbd8-499d-bcae-b5f324de77cb\" (UID: \"70d917fc-dbd8-499d-bcae-b5f324de77cb\") " Dec 02 23:30:29 crc kubenswrapper[4903]: I1202 23:30:29.947857 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70d917fc-dbd8-499d-bcae-b5f324de77cb-kube-api-access-bnw6w" (OuterVolumeSpecName: "kube-api-access-bnw6w") pod "70d917fc-dbd8-499d-bcae-b5f324de77cb" (UID: "70d917fc-dbd8-499d-bcae-b5f324de77cb"). InnerVolumeSpecName "kube-api-access-bnw6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:30:29 crc kubenswrapper[4903]: I1202 23:30:29.965263 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70d917fc-dbd8-499d-bcae-b5f324de77cb-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "70d917fc-dbd8-499d-bcae-b5f324de77cb" (UID: "70d917fc-dbd8-499d-bcae-b5f324de77cb"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:30:29 crc kubenswrapper[4903]: I1202 23:30:29.974993 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70d917fc-dbd8-499d-bcae-b5f324de77cb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "70d917fc-dbd8-499d-bcae-b5f324de77cb" (UID: "70d917fc-dbd8-499d-bcae-b5f324de77cb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:30:30 crc kubenswrapper[4903]: I1202 23:30:30.032640 4903 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/70d917fc-dbd8-499d-bcae-b5f324de77cb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 02 23:30:30 crc kubenswrapper[4903]: I1202 23:30:30.032703 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnw6w\" (UniqueName: \"kubernetes.io/projected/70d917fc-dbd8-499d-bcae-b5f324de77cb-kube-api-access-bnw6w\") on node \"crc\" DevicePath \"\"" Dec 02 23:30:30 crc kubenswrapper[4903]: I1202 23:30:30.032716 4903 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/70d917fc-dbd8-499d-bcae-b5f324de77cb-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:30:30 crc kubenswrapper[4903]: I1202 23:30:30.323641 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-5s77w" event={"ID":"70d917fc-dbd8-499d-bcae-b5f324de77cb","Type":"ContainerDied","Data":"3c048b2930fec275d726726d2dd12e4f589d80e7e19484a3a6a7e68cb94fdd13"} Dec 02 23:30:30 crc kubenswrapper[4903]: I1202 23:30:30.323711 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c048b2930fec275d726726d2dd12e4f589d80e7e19484a3a6a7e68cb94fdd13" Dec 02 23:30:30 crc kubenswrapper[4903]: I1202 23:30:30.323763 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-5s77w" Dec 02 23:30:30 crc kubenswrapper[4903]: I1202 23:30:30.390459 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-jhx2v"] Dec 02 23:30:30 crc kubenswrapper[4903]: E1202 23:30:30.390952 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70d917fc-dbd8-499d-bcae-b5f324de77cb" containerName="ssh-known-hosts-edpm-deployment" Dec 02 23:30:30 crc kubenswrapper[4903]: I1202 23:30:30.390969 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="70d917fc-dbd8-499d-bcae-b5f324de77cb" containerName="ssh-known-hosts-edpm-deployment" Dec 02 23:30:30 crc kubenswrapper[4903]: I1202 23:30:30.391151 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="70d917fc-dbd8-499d-bcae-b5f324de77cb" containerName="ssh-known-hosts-edpm-deployment" Dec 02 23:30:30 crc kubenswrapper[4903]: I1202 23:30:30.392092 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jhx2v" Dec 02 23:30:30 crc kubenswrapper[4903]: I1202 23:30:30.394242 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 23:30:30 crc kubenswrapper[4903]: I1202 23:30:30.394513 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9z6rx" Dec 02 23:30:30 crc kubenswrapper[4903]: I1202 23:30:30.394826 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 23:30:30 crc kubenswrapper[4903]: I1202 23:30:30.395588 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 23:30:30 crc kubenswrapper[4903]: I1202 23:30:30.400253 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-jhx2v"] Dec 02 23:30:30 crc kubenswrapper[4903]: I1202 23:30:30.440550 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b801338-6fdb-42ad-b3f8-67b296c04efd-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jhx2v\" (UID: \"0b801338-6fdb-42ad-b3f8-67b296c04efd\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jhx2v" Dec 02 23:30:30 crc kubenswrapper[4903]: I1202 23:30:30.440605 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b801338-6fdb-42ad-b3f8-67b296c04efd-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jhx2v\" (UID: \"0b801338-6fdb-42ad-b3f8-67b296c04efd\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jhx2v" Dec 02 23:30:30 crc kubenswrapper[4903]: I1202 23:30:30.440817 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bznz5\" (UniqueName: \"kubernetes.io/projected/0b801338-6fdb-42ad-b3f8-67b296c04efd-kube-api-access-bznz5\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jhx2v\" (UID: \"0b801338-6fdb-42ad-b3f8-67b296c04efd\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jhx2v" Dec 02 23:30:30 crc kubenswrapper[4903]: I1202 23:30:30.542976 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b801338-6fdb-42ad-b3f8-67b296c04efd-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jhx2v\" (UID: \"0b801338-6fdb-42ad-b3f8-67b296c04efd\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jhx2v" Dec 02 23:30:30 crc kubenswrapper[4903]: I1202 23:30:30.543035 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b801338-6fdb-42ad-b3f8-67b296c04efd-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jhx2v\" (UID: \"0b801338-6fdb-42ad-b3f8-67b296c04efd\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jhx2v" Dec 02 23:30:30 crc kubenswrapper[4903]: I1202 23:30:30.543111 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bznz5\" (UniqueName: \"kubernetes.io/projected/0b801338-6fdb-42ad-b3f8-67b296c04efd-kube-api-access-bznz5\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jhx2v\" (UID: \"0b801338-6fdb-42ad-b3f8-67b296c04efd\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jhx2v" Dec 02 23:30:30 crc kubenswrapper[4903]: I1202 23:30:30.548332 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b801338-6fdb-42ad-b3f8-67b296c04efd-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jhx2v\" (UID: \"0b801338-6fdb-42ad-b3f8-67b296c04efd\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jhx2v" Dec 02 23:30:30 crc kubenswrapper[4903]: I1202 23:30:30.560095 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bznz5\" (UniqueName: \"kubernetes.io/projected/0b801338-6fdb-42ad-b3f8-67b296c04efd-kube-api-access-bznz5\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jhx2v\" (UID: \"0b801338-6fdb-42ad-b3f8-67b296c04efd\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jhx2v" Dec 02 23:30:30 crc kubenswrapper[4903]: I1202 23:30:30.560155 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b801338-6fdb-42ad-b3f8-67b296c04efd-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jhx2v\" (UID: \"0b801338-6fdb-42ad-b3f8-67b296c04efd\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jhx2v" Dec 02 23:30:30 crc kubenswrapper[4903]: I1202 23:30:30.714165 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jhx2v" Dec 02 23:30:31 crc kubenswrapper[4903]: I1202 23:30:31.325620 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-jhx2v"] Dec 02 23:30:31 crc kubenswrapper[4903]: W1202 23:30:31.330288 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b801338_6fdb_42ad_b3f8_67b296c04efd.slice/crio-3a9cc4524c9eb97eab7c1b562bf916af5782013848aca701d12b0f38833006ec WatchSource:0}: Error finding container 3a9cc4524c9eb97eab7c1b562bf916af5782013848aca701d12b0f38833006ec: Status 404 returned error can't find the container with id 3a9cc4524c9eb97eab7c1b562bf916af5782013848aca701d12b0f38833006ec Dec 02 23:30:32 crc kubenswrapper[4903]: I1202 23:30:32.345990 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jhx2v" event={"ID":"0b801338-6fdb-42ad-b3f8-67b296c04efd","Type":"ContainerStarted","Data":"15da5a67893b9925fb60fb552d9aaaa172a8489828910a4828b7f869aabd7f0f"} Dec 02 23:30:32 crc kubenswrapper[4903]: I1202 23:30:32.346311 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jhx2v" event={"ID":"0b801338-6fdb-42ad-b3f8-67b296c04efd","Type":"ContainerStarted","Data":"3a9cc4524c9eb97eab7c1b562bf916af5782013848aca701d12b0f38833006ec"} Dec 02 23:30:32 crc kubenswrapper[4903]: I1202 23:30:32.384392 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jhx2v" podStartSLOduration=1.837359057 podStartE2EDuration="2.38437045s" podCreationTimestamp="2025-12-02 23:30:30 +0000 UTC" firstStartedPulling="2025-12-02 23:30:31.332944313 +0000 UTC m=+1970.041498636" lastFinishedPulling="2025-12-02 23:30:31.879955746 +0000 UTC m=+1970.588510029" observedRunningTime="2025-12-02 23:30:32.371174951 +0000 UTC m=+1971.079729264" watchObservedRunningTime="2025-12-02 23:30:32.38437045 +0000 UTC m=+1971.092924753" 
Dec 02 23:30:40 crc kubenswrapper[4903]: I1202 23:30:40.439382 4903 generic.go:334] "Generic (PLEG): container finished" podID="0b801338-6fdb-42ad-b3f8-67b296c04efd" containerID="15da5a67893b9925fb60fb552d9aaaa172a8489828910a4828b7f869aabd7f0f" exitCode=0
Dec 02 23:30:40 crc kubenswrapper[4903]: I1202 23:30:40.439484 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jhx2v" event={"ID":"0b801338-6fdb-42ad-b3f8-67b296c04efd","Type":"ContainerDied","Data":"15da5a67893b9925fb60fb552d9aaaa172a8489828910a4828b7f869aabd7f0f"}
Dec 02 23:30:41 crc kubenswrapper[4903]: I1202 23:30:41.893132 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jhx2v"
Dec 02 23:30:42 crc kubenswrapper[4903]: I1202 23:30:42.053841 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bznz5\" (UniqueName: \"kubernetes.io/projected/0b801338-6fdb-42ad-b3f8-67b296c04efd-kube-api-access-bznz5\") pod \"0b801338-6fdb-42ad-b3f8-67b296c04efd\" (UID: \"0b801338-6fdb-42ad-b3f8-67b296c04efd\") "
Dec 02 23:30:42 crc kubenswrapper[4903]: I1202 23:30:42.054130 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b801338-6fdb-42ad-b3f8-67b296c04efd-ssh-key\") pod \"0b801338-6fdb-42ad-b3f8-67b296c04efd\" (UID: \"0b801338-6fdb-42ad-b3f8-67b296c04efd\") "
Dec 02 23:30:42 crc kubenswrapper[4903]: I1202 23:30:42.054172 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b801338-6fdb-42ad-b3f8-67b296c04efd-inventory\") pod \"0b801338-6fdb-42ad-b3f8-67b296c04efd\" (UID: \"0b801338-6fdb-42ad-b3f8-67b296c04efd\") "
Dec 02 23:30:42 crc kubenswrapper[4903]: I1202 23:30:42.059592 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b801338-6fdb-42ad-b3f8-67b296c04efd-kube-api-access-bznz5" (OuterVolumeSpecName: "kube-api-access-bznz5") pod "0b801338-6fdb-42ad-b3f8-67b296c04efd" (UID: "0b801338-6fdb-42ad-b3f8-67b296c04efd"). InnerVolumeSpecName "kube-api-access-bznz5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 23:30:42 crc kubenswrapper[4903]: I1202 23:30:42.091072 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b801338-6fdb-42ad-b3f8-67b296c04efd-inventory" (OuterVolumeSpecName: "inventory") pod "0b801338-6fdb-42ad-b3f8-67b296c04efd" (UID: "0b801338-6fdb-42ad-b3f8-67b296c04efd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:30:42 crc kubenswrapper[4903]: I1202 23:30:42.091534 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b801338-6fdb-42ad-b3f8-67b296c04efd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0b801338-6fdb-42ad-b3f8-67b296c04efd" (UID: "0b801338-6fdb-42ad-b3f8-67b296c04efd"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:30:42 crc kubenswrapper[4903]: I1202 23:30:42.157229 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bznz5\" (UniqueName: \"kubernetes.io/projected/0b801338-6fdb-42ad-b3f8-67b296c04efd-kube-api-access-bznz5\") on node \"crc\" DevicePath \"\""
Dec 02 23:30:42 crc kubenswrapper[4903]: I1202 23:30:42.157274 4903 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b801338-6fdb-42ad-b3f8-67b296c04efd-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 02 23:30:42 crc kubenswrapper[4903]: I1202 23:30:42.157287 4903 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b801338-6fdb-42ad-b3f8-67b296c04efd-inventory\") on node \"crc\" DevicePath \"\""
Dec 02 23:30:42 crc kubenswrapper[4903]: I1202 23:30:42.460101 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jhx2v" event={"ID":"0b801338-6fdb-42ad-b3f8-67b296c04efd","Type":"ContainerDied","Data":"3a9cc4524c9eb97eab7c1b562bf916af5782013848aca701d12b0f38833006ec"}
Dec 02 23:30:42 crc kubenswrapper[4903]: I1202 23:30:42.460150 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a9cc4524c9eb97eab7c1b562bf916af5782013848aca701d12b0f38833006ec"
Dec 02 23:30:42 crc kubenswrapper[4903]: I1202 23:30:42.460164 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jhx2v"
Dec 02 23:30:42 crc kubenswrapper[4903]: I1202 23:30:42.532924 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vw6h8"]
Dec 02 23:30:42 crc kubenswrapper[4903]: E1202 23:30:42.533603 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b801338-6fdb-42ad-b3f8-67b296c04efd" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Dec 02 23:30:42 crc kubenswrapper[4903]: I1202 23:30:42.533633 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b801338-6fdb-42ad-b3f8-67b296c04efd" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Dec 02 23:30:42 crc kubenswrapper[4903]: I1202 23:30:42.534008 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b801338-6fdb-42ad-b3f8-67b296c04efd" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Dec 02 23:30:42 crc kubenswrapper[4903]: I1202 23:30:42.535084 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vw6h8"
Dec 02 23:30:42 crc kubenswrapper[4903]: I1202 23:30:42.538172 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 02 23:30:42 crc kubenswrapper[4903]: I1202 23:30:42.538551 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 02 23:30:42 crc kubenswrapper[4903]: I1202 23:30:42.538729 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 02 23:30:42 crc kubenswrapper[4903]: I1202 23:30:42.539131 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9z6rx"
Dec 02 23:30:42 crc kubenswrapper[4903]: I1202 23:30:42.543017 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vw6h8"]
Dec 02 23:30:42 crc kubenswrapper[4903]: I1202 23:30:42.567318 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88d8ef39-c7d5-45d4-bd56-fbb4a23d0678-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-vw6h8\" (UID: \"88d8ef39-c7d5-45d4-bd56-fbb4a23d0678\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vw6h8"
Dec 02 23:30:42 crc kubenswrapper[4903]: I1202 23:30:42.567409 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88d8ef39-c7d5-45d4-bd56-fbb4a23d0678-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-vw6h8\" (UID: \"88d8ef39-c7d5-45d4-bd56-fbb4a23d0678\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vw6h8"
Dec 02 23:30:42 crc kubenswrapper[4903]: I1202 23:30:42.567483 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl9bb\" (UniqueName: \"kubernetes.io/projected/88d8ef39-c7d5-45d4-bd56-fbb4a23d0678-kube-api-access-wl9bb\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-vw6h8\" (UID: \"88d8ef39-c7d5-45d4-bd56-fbb4a23d0678\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vw6h8"
Dec 02 23:30:42 crc kubenswrapper[4903]: I1202 23:30:42.670383 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88d8ef39-c7d5-45d4-bd56-fbb4a23d0678-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-vw6h8\" (UID: \"88d8ef39-c7d5-45d4-bd56-fbb4a23d0678\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vw6h8"
Dec 02 23:30:42 crc kubenswrapper[4903]: I1202 23:30:42.670468 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88d8ef39-c7d5-45d4-bd56-fbb4a23d0678-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-vw6h8\" (UID: \"88d8ef39-c7d5-45d4-bd56-fbb4a23d0678\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vw6h8"
Dec 02 23:30:42 crc kubenswrapper[4903]: I1202 23:30:42.670512 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl9bb\" (UniqueName: \"kubernetes.io/projected/88d8ef39-c7d5-45d4-bd56-fbb4a23d0678-kube-api-access-wl9bb\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-vw6h8\" (UID: \"88d8ef39-c7d5-45d4-bd56-fbb4a23d0678\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vw6h8"
Dec 02 23:30:42 crc kubenswrapper[4903]: I1202 23:30:42.683705 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88d8ef39-c7d5-45d4-bd56-fbb4a23d0678-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-vw6h8\" (UID: \"88d8ef39-c7d5-45d4-bd56-fbb4a23d0678\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vw6h8"
Dec 02 23:30:42 crc kubenswrapper[4903]: I1202 23:30:42.685483 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88d8ef39-c7d5-45d4-bd56-fbb4a23d0678-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-vw6h8\" (UID: \"88d8ef39-c7d5-45d4-bd56-fbb4a23d0678\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vw6h8"
Dec 02 23:30:42 crc kubenswrapper[4903]: I1202 23:30:42.698351 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl9bb\" (UniqueName: \"kubernetes.io/projected/88d8ef39-c7d5-45d4-bd56-fbb4a23d0678-kube-api-access-wl9bb\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-vw6h8\" (UID: \"88d8ef39-c7d5-45d4-bd56-fbb4a23d0678\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vw6h8"
Dec 02 23:30:42 crc kubenswrapper[4903]: I1202 23:30:42.856125 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vw6h8"
Dec 02 23:30:43 crc kubenswrapper[4903]: I1202 23:30:43.435762 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vw6h8"]
Dec 02 23:30:43 crc kubenswrapper[4903]: W1202 23:30:43.440958 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88d8ef39_c7d5_45d4_bd56_fbb4a23d0678.slice/crio-c52568d017ece5976835e8c3b12636652ad9495e94a724065ad40428a1a554e7 WatchSource:0}: Error finding container c52568d017ece5976835e8c3b12636652ad9495e94a724065ad40428a1a554e7: Status 404 returned error can't find the container with id c52568d017ece5976835e8c3b12636652ad9495e94a724065ad40428a1a554e7
Dec 02 23:30:43 crc kubenswrapper[4903]: I1202 23:30:43.474164 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vw6h8" event={"ID":"88d8ef39-c7d5-45d4-bd56-fbb4a23d0678","Type":"ContainerStarted","Data":"c52568d017ece5976835e8c3b12636652ad9495e94a724065ad40428a1a554e7"}
Dec 02 23:30:44 crc kubenswrapper[4903]: I1202 23:30:44.485975 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vw6h8" event={"ID":"88d8ef39-c7d5-45d4-bd56-fbb4a23d0678","Type":"ContainerStarted","Data":"701366bf9a56f66b07c255635919ca51869cf52a5da4a11b220e96c93eb92f2a"}
Dec 02 23:30:44 crc kubenswrapper[4903]: I1202 23:30:44.519331 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vw6h8" podStartSLOduration=1.994574564 podStartE2EDuration="2.519303509s" podCreationTimestamp="2025-12-02 23:30:42 +0000 UTC" firstStartedPulling="2025-12-02 23:30:43.444061337 +0000 UTC m=+1982.152615660" lastFinishedPulling="2025-12-02 23:30:43.968790282 +0000 UTC m=+1982.677344605" observedRunningTime="2025-12-02 23:30:44.508461367 +0000 UTC m=+1983.217015690" watchObservedRunningTime="2025-12-02 23:30:44.519303509 +0000 UTC m=+1983.227857832"
Dec 02 23:30:53 crc kubenswrapper[4903]: I1202 23:30:53.069504 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 23:30:53 crc kubenswrapper[4903]: I1202 23:30:53.070262 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 23:30:54 crc kubenswrapper[4903]: I1202 23:30:54.579596 4903 generic.go:334] "Generic (PLEG): container finished" podID="88d8ef39-c7d5-45d4-bd56-fbb4a23d0678" containerID="701366bf9a56f66b07c255635919ca51869cf52a5da4a11b220e96c93eb92f2a" exitCode=0
Dec 02 23:30:54 crc kubenswrapper[4903]: I1202 23:30:54.579707 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vw6h8" event={"ID":"88d8ef39-c7d5-45d4-bd56-fbb4a23d0678","Type":"ContainerDied","Data":"701366bf9a56f66b07c255635919ca51869cf52a5da4a11b220e96c93eb92f2a"}
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.159148 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vw6h8"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.191247 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wl9bb\" (UniqueName: \"kubernetes.io/projected/88d8ef39-c7d5-45d4-bd56-fbb4a23d0678-kube-api-access-wl9bb\") pod \"88d8ef39-c7d5-45d4-bd56-fbb4a23d0678\" (UID: \"88d8ef39-c7d5-45d4-bd56-fbb4a23d0678\") "
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.191305 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88d8ef39-c7d5-45d4-bd56-fbb4a23d0678-ssh-key\") pod \"88d8ef39-c7d5-45d4-bd56-fbb4a23d0678\" (UID: \"88d8ef39-c7d5-45d4-bd56-fbb4a23d0678\") "
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.191455 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88d8ef39-c7d5-45d4-bd56-fbb4a23d0678-inventory\") pod \"88d8ef39-c7d5-45d4-bd56-fbb4a23d0678\" (UID: \"88d8ef39-c7d5-45d4-bd56-fbb4a23d0678\") "
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.201058 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88d8ef39-c7d5-45d4-bd56-fbb4a23d0678-kube-api-access-wl9bb" (OuterVolumeSpecName: "kube-api-access-wl9bb") pod "88d8ef39-c7d5-45d4-bd56-fbb4a23d0678" (UID: "88d8ef39-c7d5-45d4-bd56-fbb4a23d0678"). InnerVolumeSpecName "kube-api-access-wl9bb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.243739 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88d8ef39-c7d5-45d4-bd56-fbb4a23d0678-inventory" (OuterVolumeSpecName: "inventory") pod "88d8ef39-c7d5-45d4-bd56-fbb4a23d0678" (UID: "88d8ef39-c7d5-45d4-bd56-fbb4a23d0678"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.253624 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88d8ef39-c7d5-45d4-bd56-fbb4a23d0678-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "88d8ef39-c7d5-45d4-bd56-fbb4a23d0678" (UID: "88d8ef39-c7d5-45d4-bd56-fbb4a23d0678"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.292958 4903 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88d8ef39-c7d5-45d4-bd56-fbb4a23d0678-inventory\") on node \"crc\" DevicePath \"\""
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.293008 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wl9bb\" (UniqueName: \"kubernetes.io/projected/88d8ef39-c7d5-45d4-bd56-fbb4a23d0678-kube-api-access-wl9bb\") on node \"crc\" DevicePath \"\""
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.293019 4903 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88d8ef39-c7d5-45d4-bd56-fbb4a23d0678-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.602530 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vw6h8" event={"ID":"88d8ef39-c7d5-45d4-bd56-fbb4a23d0678","Type":"ContainerDied","Data":"c52568d017ece5976835e8c3b12636652ad9495e94a724065ad40428a1a554e7"}
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.602580 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c52568d017ece5976835e8c3b12636652ad9495e94a724065ad40428a1a554e7"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.602637 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-vw6h8"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.723152 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"]
Dec 02 23:30:56 crc kubenswrapper[4903]: E1202 23:30:56.723528 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88d8ef39-c7d5-45d4-bd56-fbb4a23d0678" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.723545 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="88d8ef39-c7d5-45d4-bd56-fbb4a23d0678" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.723773 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="88d8ef39-c7d5-45d4-bd56-fbb4a23d0678" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.724390 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.726517 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.726896 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.727342 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.727385 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.727558 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.727578 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.727435 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9z6rx"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.729160 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.751444 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"]
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.802378 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.802447 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.802508 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.802629 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.802670 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.802692 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b8a8af95-c502-4b50-a90e-682b039c6e58-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.802729 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.802766 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b8a8af95-c502-4b50-a90e-682b039c6e58-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.802808 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.802829 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b8a8af95-c502-4b50-a90e-682b039c6e58-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.802892 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.802981 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b8a8af95-c502-4b50-a90e-682b039c6e58-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.803023 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.803059 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlpcj\" (UniqueName: \"kubernetes.io/projected/b8a8af95-c502-4b50-a90e-682b039c6e58-kube-api-access-vlpcj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.904722 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b8a8af95-c502-4b50-a90e-682b039c6e58-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.904810 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.904859 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlpcj\" (UniqueName: \"kubernetes.io/projected/b8a8af95-c502-4b50-a90e-682b039c6e58-kube-api-access-vlpcj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.904953 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.904989 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.905028 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.905133 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.905174 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.905211 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b8a8af95-c502-4b50-a90e-682b039c6e58-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.905260 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.905310 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b8a8af95-c502-4b50-a90e-682b039c6e58-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.905349 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.905394 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b8a8af95-c502-4b50-a90e-682b039c6e58-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.905472 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.911761 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b8a8af95-c502-4b50-a90e-682b039c6e58-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.911973 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.912483 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.913094 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.913563 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b8a8af95-c502-4b50-a90e-682b039c6e58-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.914410 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.916106 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.917018 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.917624 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.918216 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b8a8af95-c502-4b50-a90e-682b039c6e58-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.918275 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b8a8af95-c502-4b50-a90e-682b039c6e58-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.918395 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.924161 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:56 crc kubenswrapper[4903]: I1202 23:30:56.933898 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlpcj\" (UniqueName: \"kubernetes.io/projected/b8a8af95-c502-4b50-a90e-682b039c6e58-kube-api-access-vlpcj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pc895\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:57 crc kubenswrapper[4903]: I1202 23:30:57.048052 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:30:57 crc kubenswrapper[4903]: I1202 23:30:57.665154 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"]
Dec 02 23:30:58 crc kubenswrapper[4903]: I1202 23:30:58.622779 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895" event={"ID":"b8a8af95-c502-4b50-a90e-682b039c6e58","Type":"ContainerStarted","Data":"80425a5f1191ba178edf245717db1d810053e2f1294e072289b35eca288b7452"}
Dec 02 23:30:58 crc kubenswrapper[4903]: I1202 23:30:58.623157 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895" event={"ID":"b8a8af95-c502-4b50-a90e-682b039c6e58","Type":"ContainerStarted","Data":"f92118e2ef1b766a3c972b5e57c6254514a773a43015a8115ac2bcc588da55b8"}
Dec 02 23:30:58 crc kubenswrapper[4903]: I1202 23:30:58.669409 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895" podStartSLOduration=2.1034105849999998 podStartE2EDuration="2.669373495s" podCreationTimestamp="2025-12-02 23:30:56 +0000 UTC" firstStartedPulling="2025-12-02 23:30:57.672072754 +0000 UTC m=+1996.380627047" lastFinishedPulling="2025-12-02 23:30:58.238035664 +0000 UTC m=+1996.946589957" observedRunningTime="2025-12-02 23:30:58.649906625 +0000 UTC m=+1997.358460948" watchObservedRunningTime="2025-12-02 23:30:58.669373495 +0000 UTC m=+1997.377927828"
Dec 02 23:31:04 crc kubenswrapper[4903]: I1202 23:31:04.222325 4903 scope.go:117] "RemoveContainer" containerID="a8eef5c453b8d96bd3fb8a2612d283b49ee55cd45a6259efe3efae85c2dc3ab4"
Dec 02 23:31:23 crc kubenswrapper[4903]: I1202 23:31:23.069989 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 23:31:23 crc kubenswrapper[4903]: I1202 23:31:23.070726 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 23:31:41 crc kubenswrapper[4903]: I1202 23:31:41.081581 4903 generic.go:334] "Generic (PLEG): container finished" podID="b8a8af95-c502-4b50-a90e-682b039c6e58" containerID="80425a5f1191ba178edf245717db1d810053e2f1294e072289b35eca288b7452" exitCode=0
Dec 02 23:31:41 crc kubenswrapper[4903]: I1202 23:31:41.081710 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895" event={"ID":"b8a8af95-c502-4b50-a90e-682b039c6e58","Type":"ContainerDied","Data":"80425a5f1191ba178edf245717db1d810053e2f1294e072289b35eca288b7452"}
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.559775 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.589636 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b8a8af95-c502-4b50-a90e-682b039c6e58-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"b8a8af95-c502-4b50-a90e-682b039c6e58\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") "
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.589693 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlpcj\" (UniqueName: \"kubernetes.io/projected/b8a8af95-c502-4b50-a90e-682b039c6e58-kube-api-access-vlpcj\") pod \"b8a8af95-c502-4b50-a90e-682b039c6e58\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") "
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.589726 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-ovn-combined-ca-bundle\") pod \"b8a8af95-c502-4b50-a90e-682b039c6e58\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") "
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.589760 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-ssh-key\") pod \"b8a8af95-c502-4b50-a90e-682b039c6e58\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") "
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.589779 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b8a8af95-c502-4b50-a90e-682b039c6e58-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"b8a8af95-c502-4b50-a90e-682b039c6e58\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") "
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.589798 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b8a8af95-c502-4b50-a90e-682b039c6e58-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"b8a8af95-c502-4b50-a90e-682b039c6e58\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") "
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.589821 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-bootstrap-combined-ca-bundle\") pod \"b8a8af95-c502-4b50-a90e-682b039c6e58\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") "
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.589854 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-repo-setup-combined-ca-bundle\") pod \"b8a8af95-c502-4b50-a90e-682b039c6e58\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") "
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.589884 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-neutron-metadata-combined-ca-bundle\") pod \"b8a8af95-c502-4b50-a90e-682b039c6e58\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") "
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.589923 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-telemetry-combined-ca-bundle\") pod \"b8a8af95-c502-4b50-a90e-682b039c6e58\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") "
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.589961 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-inventory\") pod \"b8a8af95-c502-4b50-a90e-682b039c6e58\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") "
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.590007 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-nova-combined-ca-bundle\") pod \"b8a8af95-c502-4b50-a90e-682b039c6e58\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") "
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.590152 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b8a8af95-c502-4b50-a90e-682b039c6e58-openstack-edpm-ipam-ovn-default-certs-0\") pod \"b8a8af95-c502-4b50-a90e-682b039c6e58\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") "
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.590188 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-libvirt-combined-ca-bundle\") pod \"b8a8af95-c502-4b50-a90e-682b039c6e58\" (UID: \"b8a8af95-c502-4b50-a90e-682b039c6e58\") "
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.611122 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "b8a8af95-c502-4b50-a90e-682b039c6e58" (UID: "b8a8af95-c502-4b50-a90e-682b039c6e58"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.611142 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "b8a8af95-c502-4b50-a90e-682b039c6e58" (UID: "b8a8af95-c502-4b50-a90e-682b039c6e58"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.611241 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "b8a8af95-c502-4b50-a90e-682b039c6e58" (UID: "b8a8af95-c502-4b50-a90e-682b039c6e58"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.611273 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "b8a8af95-c502-4b50-a90e-682b039c6e58" (UID: "b8a8af95-c502-4b50-a90e-682b039c6e58"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.611324 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "b8a8af95-c502-4b50-a90e-682b039c6e58" (UID: "b8a8af95-c502-4b50-a90e-682b039c6e58"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.611341 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "b8a8af95-c502-4b50-a90e-682b039c6e58" (UID: "b8a8af95-c502-4b50-a90e-682b039c6e58"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.611790 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8a8af95-c502-4b50-a90e-682b039c6e58-kube-api-access-vlpcj" (OuterVolumeSpecName: "kube-api-access-vlpcj") pod "b8a8af95-c502-4b50-a90e-682b039c6e58" (UID: "b8a8af95-c502-4b50-a90e-682b039c6e58"). InnerVolumeSpecName "kube-api-access-vlpcj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.611899 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8a8af95-c502-4b50-a90e-682b039c6e58-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "b8a8af95-c502-4b50-a90e-682b039c6e58" (UID: "b8a8af95-c502-4b50-a90e-682b039c6e58"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.611841 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8a8af95-c502-4b50-a90e-682b039c6e58-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "b8a8af95-c502-4b50-a90e-682b039c6e58" (UID: "b8a8af95-c502-4b50-a90e-682b039c6e58"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.611970 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8a8af95-c502-4b50-a90e-682b039c6e58-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "b8a8af95-c502-4b50-a90e-682b039c6e58" (UID: "b8a8af95-c502-4b50-a90e-682b039c6e58"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.617855 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "b8a8af95-c502-4b50-a90e-682b039c6e58" (UID: "b8a8af95-c502-4b50-a90e-682b039c6e58"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.620966 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8a8af95-c502-4b50-a90e-682b039c6e58-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "b8a8af95-c502-4b50-a90e-682b039c6e58" (UID: "b8a8af95-c502-4b50-a90e-682b039c6e58"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.628147 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-inventory" (OuterVolumeSpecName: "inventory") pod "b8a8af95-c502-4b50-a90e-682b039c6e58" (UID: "b8a8af95-c502-4b50-a90e-682b039c6e58"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.634212 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b8a8af95-c502-4b50-a90e-682b039c6e58" (UID: "b8a8af95-c502-4b50-a90e-682b039c6e58"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.693864 4903 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b8a8af95-c502-4b50-a90e-682b039c6e58-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\""
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.693918 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlpcj\" (UniqueName: \"kubernetes.io/projected/b8a8af95-c502-4b50-a90e-682b039c6e58-kube-api-access-vlpcj\") on node \"crc\" DevicePath \"\""
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.693971 4903 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.693991 4903 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b8a8af95-c502-4b50-a90e-682b039c6e58-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\""
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.694011 4903 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.694031 4903 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b8a8af95-c502-4b50-a90e-682b039c6e58-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\""
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.694050 4903 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.694197 4903 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.694219 4903 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.694237 4903 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.694255 4903 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-inventory\") on node \"crc\" DevicePath \"\""
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.694271 4903 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.694288 4903 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b8a8af95-c502-4b50-a90e-682b039c6e58-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\""
Dec 02 23:31:42 crc kubenswrapper[4903]: I1202 23:31:42.694305 4903 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a8af95-c502-4b50-a90e-682b039c6e58-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 23:31:43 crc kubenswrapper[4903]: I1202 23:31:43.121465 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895" event={"ID":"b8a8af95-c502-4b50-a90e-682b039c6e58","Type":"ContainerDied","Data":"f92118e2ef1b766a3c972b5e57c6254514a773a43015a8115ac2bcc588da55b8"}
Dec 02 23:31:43 crc kubenswrapper[4903]: I1202 23:31:43.121508 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f92118e2ef1b766a3c972b5e57c6254514a773a43015a8115ac2bcc588da55b8"
Dec 02 23:31:43 crc kubenswrapper[4903]: I1202 23:31:43.121549 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pc895"
Dec 02 23:31:43 crc kubenswrapper[4903]: I1202 23:31:43.247302 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbltm"]
Dec 02 23:31:43 crc kubenswrapper[4903]: E1202 23:31:43.247782 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8a8af95-c502-4b50-a90e-682b039c6e58" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Dec 02 23:31:43 crc kubenswrapper[4903]: I1202 23:31:43.247806 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8a8af95-c502-4b50-a90e-682b039c6e58" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Dec 02 23:31:43 crc kubenswrapper[4903]: I1202 23:31:43.248092 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8a8af95-c502-4b50-a90e-682b039c6e58" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Dec 02 23:31:43 crc kubenswrapper[4903]: I1202 23:31:43.249014 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbltm"
Dec 02 23:31:43 crc kubenswrapper[4903]: I1202 23:31:43.253535 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config"
Dec 02 23:31:43 crc kubenswrapper[4903]: I1202 23:31:43.253569 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9z6rx"
Dec 02 23:31:43 crc kubenswrapper[4903]: I1202 23:31:43.255222 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 02 23:31:43 crc kubenswrapper[4903]: I1202 23:31:43.255401 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 02 23:31:43 crc kubenswrapper[4903]: I1202 23:31:43.257413 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 02 23:31:43 crc kubenswrapper[4903]: I1202 23:31:43.273140 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbltm"]
Dec 02 23:31:43 crc kubenswrapper[4903]: I1202 23:31:43.306426 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08be3078-8019-4472-8260-d24032d74b39-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hbltm\" (UID: \"08be3078-8019-4472-8260-d24032d74b39\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbltm"
Dec 02 23:31:43 crc kubenswrapper[4903]: I1202 23:31:43.306497 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/08be3078-8019-4472-8260-d24032d74b39-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hbltm\" (UID: \"08be3078-8019-4472-8260-d24032d74b39\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbltm"
Dec 02 23:31:43 crc kubenswrapper[4903]: I1202 23:31:43.306608 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/08be3078-8019-4472-8260-d24032d74b39-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hbltm\" (UID: \"08be3078-8019-4472-8260-d24032d74b39\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbltm"
Dec 02 23:31:43 crc kubenswrapper[4903]: I1202 23:31:43.306751 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzlfw\" (UniqueName: \"kubernetes.io/projected/08be3078-8019-4472-8260-d24032d74b39-kube-api-access-xzlfw\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hbltm\" (UID: \"08be3078-8019-4472-8260-d24032d74b39\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbltm"
Dec 02 23:31:43 crc kubenswrapper[4903]: I1202 23:31:43.306791 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08be3078-8019-4472-8260-d24032d74b39-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hbltm\" (UID: \"08be3078-8019-4472-8260-d24032d74b39\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbltm"
Dec 02 23:31:43 crc kubenswrapper[4903]: I1202 23:31:43.409145 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzlfw\"
(UniqueName: \"kubernetes.io/projected/08be3078-8019-4472-8260-d24032d74b39-kube-api-access-xzlfw\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hbltm\" (UID: \"08be3078-8019-4472-8260-d24032d74b39\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbltm"
Dec 02 23:31:43 crc kubenswrapper[4903]: I1202 23:31:43.409423 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08be3078-8019-4472-8260-d24032d74b39-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hbltm\" (UID: \"08be3078-8019-4472-8260-d24032d74b39\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbltm"
Dec 02 23:31:43 crc kubenswrapper[4903]: I1202 23:31:43.409590 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08be3078-8019-4472-8260-d24032d74b39-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hbltm\" (UID: \"08be3078-8019-4472-8260-d24032d74b39\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbltm"
Dec 02 23:31:43 crc kubenswrapper[4903]: I1202 23:31:43.409629 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/08be3078-8019-4472-8260-d24032d74b39-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hbltm\" (UID: \"08be3078-8019-4472-8260-d24032d74b39\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbltm"
Dec 02 23:31:43 crc kubenswrapper[4903]: I1202 23:31:43.409685 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/08be3078-8019-4472-8260-d24032d74b39-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hbltm\" (UID: \"08be3078-8019-4472-8260-d24032d74b39\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbltm"
Dec 02 23:31:43 crc kubenswrapper[4903]: I1202 23:31:43.410717 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/08be3078-8019-4472-8260-d24032d74b39-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hbltm\" (UID: \"08be3078-8019-4472-8260-d24032d74b39\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbltm"
Dec 02 23:31:43 crc kubenswrapper[4903]: I1202 23:31:43.415059 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08be3078-8019-4472-8260-d24032d74b39-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hbltm\" (UID: \"08be3078-8019-4472-8260-d24032d74b39\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbltm"
Dec 02 23:31:43 crc kubenswrapper[4903]: I1202 23:31:43.415110 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/08be3078-8019-4472-8260-d24032d74b39-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hbltm\" (UID: \"08be3078-8019-4472-8260-d24032d74b39\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbltm"
Dec 02 23:31:43 crc kubenswrapper[4903]: I1202 23:31:43.415116 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08be3078-8019-4472-8260-d24032d74b39-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hbltm\" (UID: \"08be3078-8019-4472-8260-d24032d74b39\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbltm"
Dec 02 23:31:43 crc kubenswrapper[4903]: I1202 23:31:43.431424 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzlfw\" (UniqueName: \"kubernetes.io/projected/08be3078-8019-4472-8260-d24032d74b39-kube-api-access-xzlfw\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hbltm\" (UID: \"08be3078-8019-4472-8260-d24032d74b39\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbltm"
Dec 02 23:31:43 crc kubenswrapper[4903]: I1202 23:31:43.582018 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbltm"
Dec 02 23:31:44 crc kubenswrapper[4903]: I1202 23:31:44.199174 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbltm"]
Dec 02 23:31:45 crc kubenswrapper[4903]: I1202 23:31:45.145547 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbltm" event={"ID":"08be3078-8019-4472-8260-d24032d74b39","Type":"ContainerStarted","Data":"380ded192e0c91ae524b9589e4d8f97581af69d52a2c963f3d3e0fb0ce7fee30"}
Dec 02 23:31:45 crc kubenswrapper[4903]: I1202 23:31:45.145914 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbltm" event={"ID":"08be3078-8019-4472-8260-d24032d74b39","Type":"ContainerStarted","Data":"51903cef337d4fe153892001af30925afcd7f33cc50645dd367c3b1577efd06e"}
Dec 02 23:31:45 crc kubenswrapper[4903]: I1202 23:31:45.184004 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbltm" podStartSLOduration=1.679248858 podStartE2EDuration="2.183981158s" podCreationTimestamp="2025-12-02 23:31:43 +0000 UTC" firstStartedPulling="2025-12-02 23:31:44.211689566 +0000 UTC m=+2042.920243849" lastFinishedPulling="2025-12-02 23:31:44.716421836 +0000 UTC m=+2043.424976149" observedRunningTime="2025-12-02 23:31:45.170222255 +0000 UTC m=+2043.878776548" watchObservedRunningTime="2025-12-02 23:31:45.183981158 +0000 UTC m=+2043.892535451"
Dec 02 23:31:53 crc kubenswrapper[4903]: I1202 23:31:53.069532 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 23:31:53 crc kubenswrapper[4903]: I1202 23:31:53.070376 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 23:31:53 crc kubenswrapper[4903]: I1202 23:31:53.070425 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-snl4q"
Dec 02 23:31:53 crc kubenswrapper[4903]: I1202 23:31:53.071307 4903 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2cb2324e92342142e01e709d222456907de7cfe5f6624d7d43f5bea9f5edfe1a"} pod="openshift-machine-config-operator/machine-config-daemon-snl4q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 02 23:31:53 crc kubenswrapper[4903]: I1202 23:31:53.071354 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" containerID="cri-o://2cb2324e92342142e01e709d222456907de7cfe5f6624d7d43f5bea9f5edfe1a" gracePeriod=600
Dec 02 23:31:53 crc kubenswrapper[4903]: I1202 23:31:53.249500 4903 generic.go:334] "Generic (PLEG): container finished" podID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerID="2cb2324e92342142e01e709d222456907de7cfe5f6624d7d43f5bea9f5edfe1a" exitCode=0
Dec 02 23:31:53 crc kubenswrapper[4903]: I1202 23:31:53.249579 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerDied","Data":"2cb2324e92342142e01e709d222456907de7cfe5f6624d7d43f5bea9f5edfe1a"}
Dec 02 23:31:53 crc kubenswrapper[4903]: I1202 23:31:53.249994 4903 scope.go:117] "RemoveContainer" containerID="eea089a1153012030d0c185f960d3bfe8298bb2908765149f2079ec873b323eb"
Dec 02 23:31:54 crc kubenswrapper[4903]: I1202 23:31:54.261322 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerStarted","Data":"cb019dd8555c680c4295d44d0c399ca95f9121e180d4cfb927832eb9bf96512d"}
Dec 02 23:32:59 crc kubenswrapper[4903]: I1202 23:32:59.023626 4903 generic.go:334] "Generic (PLEG): container finished" podID="08be3078-8019-4472-8260-d24032d74b39" containerID="380ded192e0c91ae524b9589e4d8f97581af69d52a2c963f3d3e0fb0ce7fee30" exitCode=0
Dec 02 23:32:59 crc kubenswrapper[4903]: I1202 23:32:59.023715 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbltm" event={"ID":"08be3078-8019-4472-8260-d24032d74b39","Type":"ContainerDied","Data":"380ded192e0c91ae524b9589e4d8f97581af69d52a2c963f3d3e0fb0ce7fee30"}
Dec 02 23:33:00 crc kubenswrapper[4903]: I1202 23:33:00.583391 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbltm"
Dec 02 23:33:00 crc kubenswrapper[4903]: I1202 23:33:00.707204 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/08be3078-8019-4472-8260-d24032d74b39-ssh-key\") pod \"08be3078-8019-4472-8260-d24032d74b39\" (UID: \"08be3078-8019-4472-8260-d24032d74b39\") "
Dec 02 23:33:00 crc kubenswrapper[4903]: I1202 23:33:00.707610 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzlfw\" (UniqueName: \"kubernetes.io/projected/08be3078-8019-4472-8260-d24032d74b39-kube-api-access-xzlfw\") pod \"08be3078-8019-4472-8260-d24032d74b39\" (UID: \"08be3078-8019-4472-8260-d24032d74b39\") "
Dec 02 23:33:00 crc kubenswrapper[4903]: I1202 23:33:00.707829 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08be3078-8019-4472-8260-d24032d74b39-ovn-combined-ca-bundle\") pod \"08be3078-8019-4472-8260-d24032d74b39\" (UID: \"08be3078-8019-4472-8260-d24032d74b39\") "
Dec 02 23:33:00 crc kubenswrapper[4903]: I1202 23:33:00.708271 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/08be3078-8019-4472-8260-d24032d74b39-ovncontroller-config-0\") pod \"08be3078-8019-4472-8260-d24032d74b39\" (UID: \"08be3078-8019-4472-8260-d24032d74b39\") "
Dec 02 23:33:00 crc kubenswrapper[4903]: I1202 23:33:00.708502 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08be3078-8019-4472-8260-d24032d74b39-inventory\") pod \"08be3078-8019-4472-8260-d24032d74b39\" (UID: \"08be3078-8019-4472-8260-d24032d74b39\") "
Dec 02 23:33:00 crc kubenswrapper[4903]: I1202 23:33:00.713601 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08be3078-8019-4472-8260-d24032d74b39-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "08be3078-8019-4472-8260-d24032d74b39" (UID: "08be3078-8019-4472-8260-d24032d74b39"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:33:00 crc kubenswrapper[4903]: I1202 23:33:00.714143 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08be3078-8019-4472-8260-d24032d74b39-kube-api-access-xzlfw" (OuterVolumeSpecName: "kube-api-access-xzlfw") pod "08be3078-8019-4472-8260-d24032d74b39" (UID: "08be3078-8019-4472-8260-d24032d74b39"). InnerVolumeSpecName "kube-api-access-xzlfw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 23:33:00 crc kubenswrapper[4903]: I1202 23:33:00.734173 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08be3078-8019-4472-8260-d24032d74b39-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "08be3078-8019-4472-8260-d24032d74b39" (UID: "08be3078-8019-4472-8260-d24032d74b39"). InnerVolumeSpecName "ovncontroller-config-0".
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 23:33:00 crc kubenswrapper[4903]: I1202 23:33:00.739572 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08be3078-8019-4472-8260-d24032d74b39-inventory" (OuterVolumeSpecName: "inventory") pod "08be3078-8019-4472-8260-d24032d74b39" (UID: "08be3078-8019-4472-8260-d24032d74b39"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:33:00 crc kubenswrapper[4903]: I1202 23:33:00.741150 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08be3078-8019-4472-8260-d24032d74b39-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "08be3078-8019-4472-8260-d24032d74b39" (UID: "08be3078-8019-4472-8260-d24032d74b39"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:33:00 crc kubenswrapper[4903]: I1202 23:33:00.812133 4903 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/08be3078-8019-4472-8260-d24032d74b39-ovncontroller-config-0\") on node \"crc\" DevicePath \"\""
Dec 02 23:33:00 crc kubenswrapper[4903]: I1202 23:33:00.812190 4903 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08be3078-8019-4472-8260-d24032d74b39-inventory\") on node \"crc\" DevicePath \"\""
Dec 02 23:33:00 crc kubenswrapper[4903]: I1202 23:33:00.812208 4903 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/08be3078-8019-4472-8260-d24032d74b39-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 02 23:33:00 crc kubenswrapper[4903]: I1202 23:33:00.812225 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzlfw\" (UniqueName: \"kubernetes.io/projected/08be3078-8019-4472-8260-d24032d74b39-kube-api-access-xzlfw\") on node \"crc\" DevicePath \"\""
Dec 02 23:33:00 crc kubenswrapper[4903]: I1202 23:33:00.812301 4903 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08be3078-8019-4472-8260-d24032d74b39-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 23:33:01 crc kubenswrapper[4903]: I1202 23:33:01.050634 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbltm" event={"ID":"08be3078-8019-4472-8260-d24032d74b39","Type":"ContainerDied","Data":"51903cef337d4fe153892001af30925afcd7f33cc50645dd367c3b1577efd06e"}
Dec 02 23:33:01 crc kubenswrapper[4903]: I1202 23:33:01.051239 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51903cef337d4fe153892001af30925afcd7f33cc50645dd367c3b1577efd06e"
Dec 02 23:33:01 crc kubenswrapper[4903]: I1202 23:33:01.050750 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hbltm"
Dec 02 23:33:01 crc kubenswrapper[4903]: I1202 23:33:01.346605 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz"]
Dec 02 23:33:01 crc kubenswrapper[4903]: E1202 23:33:01.347065 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08be3078-8019-4472-8260-d24032d74b39" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Dec 02 23:33:01 crc kubenswrapper[4903]: I1202 23:33:01.347086 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="08be3078-8019-4472-8260-d24032d74b39" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Dec 02 23:33:01 crc kubenswrapper[4903]: I1202 23:33:01.347378 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="08be3078-8019-4472-8260-d24032d74b39" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Dec 02 23:33:01 crc kubenswrapper[4903]: I1202 23:33:01.348131 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz"
Dec 02 23:33:01 crc kubenswrapper[4903]: I1202 23:33:01.351868 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config"
Dec 02 23:33:01 crc kubenswrapper[4903]: I1202 23:33:01.352520 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config"
Dec 02 23:33:01 crc kubenswrapper[4903]: I1202 23:33:01.352746 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 02 23:33:01 crc kubenswrapper[4903]: I1202 23:33:01.357600 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 02 23:33:01 crc kubenswrapper[4903]: I1202 23:33:01.357807 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9z6rx"
Dec 02 23:33:01 crc kubenswrapper[4903]: I1202 23:33:01.357881 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 02 23:33:01 crc kubenswrapper[4903]: I1202 23:33:01.390998 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz"]
Dec 02 23:33:01 crc kubenswrapper[4903]: I1202 23:33:01.426222 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c0b03ee1-07d8-4d8e-b047-480a4dd369f0-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz\" (UID: \"c0b03ee1-07d8-4d8e-b047-480a4dd369f0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz"
Dec 02 23:33:01 crc kubenswrapper[4903]: I1202 23:33:01.426744 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0b03ee1-07d8-4d8e-b047-480a4dd369f0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz\" (UID: \"c0b03ee1-07d8-4d8e-b047-480a4dd369f0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz"
Dec 02 23:33:01 crc kubenswrapper[4903]: I1202 23:33:01.426801 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0b03ee1-07d8-4d8e-b047-480a4dd369f0-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz\" (UID: \"c0b03ee1-07d8-4d8e-b047-480a4dd369f0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz"
Dec 02 23:33:01 crc kubenswrapper[4903]: I1202 23:33:01.427165 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c0b03ee1-07d8-4d8e-b047-480a4dd369f0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz\" (UID: \"c0b03ee1-07d8-4d8e-b047-480a4dd369f0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz"
Dec 02 23:33:01 crc kubenswrapper[4903]: I1202 23:33:01.427625 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z78d\" (UniqueName: \"kubernetes.io/projected/c0b03ee1-07d8-4d8e-b047-480a4dd369f0-kube-api-access-6z78d\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz\" (UID: \"c0b03ee1-07d8-4d8e-b047-480a4dd369f0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz"
Dec 02 23:33:01 crc kubenswrapper[4903]: I1202 23:33:01.427785 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c0b03ee1-07d8-4d8e-b047-480a4dd369f0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz\" (UID: \"c0b03ee1-07d8-4d8e-b047-480a4dd369f0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz"
Dec 02 23:33:01 crc kubenswrapper[4903]: I1202 23:33:01.530051 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c0b03ee1-07d8-4d8e-b047-480a4dd369f0-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz\" (UID: \"c0b03ee1-07d8-4d8e-b047-480a4dd369f0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz"
Dec 02 23:33:01 crc kubenswrapper[4903]: I1202 23:33:01.530150 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0b03ee1-07d8-4d8e-b047-480a4dd369f0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz\" (UID: \"c0b03ee1-07d8-4d8e-b047-480a4dd369f0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz"
Dec 02 23:33:01 crc kubenswrapper[4903]: I1202 23:33:01.530200 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0b03ee1-07d8-4d8e-b047-480a4dd369f0-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz\" (UID: \"c0b03ee1-07d8-4d8e-b047-480a4dd369f0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz"
Dec 02 23:33:01 crc kubenswrapper[4903]: I1202 23:33:01.530320 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c0b03ee1-07d8-4d8e-b047-480a4dd369f0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz\" (UID: \"c0b03ee1-07d8-4d8e-b047-480a4dd369f0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz"
Dec 02 23:33:01 crc kubenswrapper[4903]: I1202 23:33:01.530474 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z78d\" (UniqueName: \"kubernetes.io/projected/c0b03ee1-07d8-4d8e-b047-480a4dd369f0-kube-api-access-6z78d\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz\" (UID: \"c0b03ee1-07d8-4d8e-b047-480a4dd369f0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz"
Dec 02 23:33:01 crc kubenswrapper[4903]: I1202 23:33:01.530519 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c0b03ee1-07d8-4d8e-b047-480a4dd369f0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz\" (UID: \"c0b03ee1-07d8-4d8e-b047-480a4dd369f0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz"
Dec 02 23:33:01 crc kubenswrapper[4903]: I1202 23:33:01.536098 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c0b03ee1-07d8-4d8e-b047-480a4dd369f0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz\" (UID: \"c0b03ee1-07d8-4d8e-b047-480a4dd369f0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz"
Dec 02 23:33:01 crc kubenswrapper[4903]: I1202 23:33:01.537127 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0b03ee1-07d8-4d8e-b047-480a4dd369f0-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz\" (UID: \"c0b03ee1-07d8-4d8e-b047-480a4dd369f0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz"
Dec 02 23:33:01 crc kubenswrapper[4903]: I1202 23:33:01.537216 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c0b03ee1-07d8-4d8e-b047-480a4dd369f0-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz\" (UID: \"c0b03ee1-07d8-4d8e-b047-480a4dd369f0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz"
Dec 02 23:33:01 crc kubenswrapper[4903]: I1202 23:33:01.537855 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0b03ee1-07d8-4d8e-b047-480a4dd369f0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz\" (UID: \"c0b03ee1-07d8-4d8e-b047-480a4dd369f0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz"
Dec 02 23:33:01 crc kubenswrapper[4903]: I1202 23:33:01.538965 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c0b03ee1-07d8-4d8e-b047-480a4dd369f0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz\" (UID: \"c0b03ee1-07d8-4d8e-b047-480a4dd369f0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz"
Dec 02 23:33:01 crc kubenswrapper[4903]: I1202 23:33:01.559912 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z78d\"
(UniqueName: \"kubernetes.io/projected/c0b03ee1-07d8-4d8e-b047-480a4dd369f0-kube-api-access-6z78d\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz\" (UID: \"c0b03ee1-07d8-4d8e-b047-480a4dd369f0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz"
Dec 02 23:33:01 crc kubenswrapper[4903]: I1202 23:33:01.685929 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz"
Dec 02 23:33:02 crc kubenswrapper[4903]: I1202 23:33:02.230571 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz"]
Dec 02 23:33:03 crc kubenswrapper[4903]: I1202 23:33:03.070544 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz" event={"ID":"c0b03ee1-07d8-4d8e-b047-480a4dd369f0","Type":"ContainerStarted","Data":"84edd3dd2baa8a9f392dd5999bcbf97f405bf87181725ff011cec3dc327a78ca"}
Dec 02 23:33:03 crc kubenswrapper[4903]: I1202 23:33:03.071468 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz" event={"ID":"c0b03ee1-07d8-4d8e-b047-480a4dd369f0","Type":"ContainerStarted","Data":"44dac1e8be5e8da21268042a660fadd29cd9d9277222dfe4420a67b4ea62e38b"}
Dec 02 23:33:03 crc kubenswrapper[4903]: I1202 23:33:03.098715 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz" podStartSLOduration=1.654307632 podStartE2EDuration="2.098692504s" podCreationTimestamp="2025-12-02 23:33:01 +0000 UTC" firstStartedPulling="2025-12-02 23:33:02.240342844 +0000 UTC m=+2120.948897127" lastFinishedPulling="2025-12-02 23:33:02.684727706 +0000 UTC m=+2121.393281999" observedRunningTime="2025-12-02 23:33:03.090536067 +0000 UTC m=+2121.799090340" watchObservedRunningTime="2025-12-02 23:33:03.098692504 +0000 UTC m=+2121.807246787"
Dec 02 23:33:42 crc kubenswrapper[4903]: I1202 23:33:42.459193 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qgtnp"]
Dec 02 23:33:42 crc kubenswrapper[4903]: I1202 23:33:42.464383 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qgtnp"
Dec 02 23:33:42 crc kubenswrapper[4903]: I1202 23:33:42.477694 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qgtnp"]
Dec 02 23:33:42 crc kubenswrapper[4903]: I1202 23:33:42.615594 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c76c30b8-242b-41a2-8c02-4c8810759b19-utilities\") pod \"redhat-marketplace-qgtnp\" (UID: \"c76c30b8-242b-41a2-8c02-4c8810759b19\") " pod="openshift-marketplace/redhat-marketplace-qgtnp"
Dec 02 23:33:42 crc kubenswrapper[4903]: I1202 23:33:42.615758 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b5xt\" (UniqueName: \"kubernetes.io/projected/c76c30b8-242b-41a2-8c02-4c8810759b19-kube-api-access-5b5xt\") pod \"redhat-marketplace-qgtnp\" (UID: \"c76c30b8-242b-41a2-8c02-4c8810759b19\") " pod="openshift-marketplace/redhat-marketplace-qgtnp"
Dec 02 23:33:42 crc kubenswrapper[4903]: I1202 23:33:42.616000 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c76c30b8-242b-41a2-8c02-4c8810759b19-catalog-content\") pod \"redhat-marketplace-qgtnp\" (UID: \"c76c30b8-242b-41a2-8c02-4c8810759b19\") " pod="openshift-marketplace/redhat-marketplace-qgtnp"
Dec 02 23:33:42 crc kubenswrapper[4903]: I1202 23:33:42.717830 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b5xt\" (UniqueName: \"kubernetes.io/projected/c76c30b8-242b-41a2-8c02-4c8810759b19-kube-api-access-5b5xt\") pod \"redhat-marketplace-qgtnp\" (UID: \"c76c30b8-242b-41a2-8c02-4c8810759b19\") " pod="openshift-marketplace/redhat-marketplace-qgtnp"
Dec 02 23:33:42 crc kubenswrapper[4903]: I1202 23:33:42.717968 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c76c30b8-242b-41a2-8c02-4c8810759b19-catalog-content\") pod \"redhat-marketplace-qgtnp\" (UID: \"c76c30b8-242b-41a2-8c02-4c8810759b19\") " pod="openshift-marketplace/redhat-marketplace-qgtnp"
Dec 02 23:33:42 crc kubenswrapper[4903]: I1202 23:33:42.718094 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c76c30b8-242b-41a2-8c02-4c8810759b19-utilities\") pod \"redhat-marketplace-qgtnp\" (UID: \"c76c30b8-242b-41a2-8c02-4c8810759b19\") " pod="openshift-marketplace/redhat-marketplace-qgtnp"
Dec 02 23:33:42 crc kubenswrapper[4903]: I1202 23:33:42.718516 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c76c30b8-242b-41a2-8c02-4c8810759b19-catalog-content\") pod \"redhat-marketplace-qgtnp\" (UID: \"c76c30b8-242b-41a2-8c02-4c8810759b19\") " pod="openshift-marketplace/redhat-marketplace-qgtnp"
Dec 02 23:33:42 crc kubenswrapper[4903]: I1202 23:33:42.718625 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c76c30b8-242b-41a2-8c02-4c8810759b19-utilities\") pod \"redhat-marketplace-qgtnp\" (UID: \"c76c30b8-242b-41a2-8c02-4c8810759b19\") " pod="openshift-marketplace/redhat-marketplace-qgtnp"
Dec 02 23:33:42 crc kubenswrapper[4903]: I1202 23:33:42.744280 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b5xt\" (UniqueName: \"kubernetes.io/projected/c76c30b8-242b-41a2-8c02-4c8810759b19-kube-api-access-5b5xt\") pod \"redhat-marketplace-qgtnp\" (UID: \"c76c30b8-242b-41a2-8c02-4c8810759b19\") " pod="openshift-marketplace/redhat-marketplace-qgtnp"
Dec 02 23:33:42 crc kubenswrapper[4903]: I1202 23:33:42.791420 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qgtnp"
Dec 02 23:33:43 crc kubenswrapper[4903]: I1202 23:33:43.319532 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qgtnp"]
Dec 02 23:33:43 crc kubenswrapper[4903]: I1202 23:33:43.546556 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qgtnp" event={"ID":"c76c30b8-242b-41a2-8c02-4c8810759b19","Type":"ContainerStarted","Data":"c621ce138a40c7dd36f9b2946d42463ad89a167550d4207710c904f571f1f593"}
Dec 02 23:33:43 crc kubenswrapper[4903]: I1202 23:33:43.547018 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qgtnp" event={"ID":"c76c30b8-242b-41a2-8c02-4c8810759b19","Type":"ContainerStarted","Data":"2676857a5c8ef1f9b8a8e92e8660276a12d273ca0d0f1f4c92c5cf726197d0b2"}
Dec 02 23:33:44 crc kubenswrapper[4903]: I1202 23:33:44.561585 4903 generic.go:334] "Generic (PLEG): container finished" podID="c76c30b8-242b-41a2-8c02-4c8810759b19" containerID="c621ce138a40c7dd36f9b2946d42463ad89a167550d4207710c904f571f1f593" exitCode=0
Dec 02 23:33:44 crc kubenswrapper[4903]: I1202 23:33:44.561717 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qgtnp" event={"ID":"c76c30b8-242b-41a2-8c02-4c8810759b19","Type":"ContainerDied","Data":"c621ce138a40c7dd36f9b2946d42463ad89a167550d4207710c904f571f1f593"}
Dec 02 23:33:45 crc kubenswrapper[4903]: I1202 23:33:45.572686 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qgtnp" event={"ID":"c76c30b8-242b-41a2-8c02-4c8810759b19","Type":"ContainerStarted","Data":"efccfc68853940b26a10e60efd12b773d02537052144d743224d9848c7e9df70"}
Dec 02 23:33:46 crc kubenswrapper[4903]: I1202 23:33:46.586984 4903 generic.go:334] "Generic (PLEG): container finished" podID="c76c30b8-242b-41a2-8c02-4c8810759b19" containerID="efccfc68853940b26a10e60efd12b773d02537052144d743224d9848c7e9df70" exitCode=0
Dec 02 23:33:46 crc kubenswrapper[4903]: I1202 23:33:46.587033 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qgtnp" event={"ID":"c76c30b8-242b-41a2-8c02-4c8810759b19","Type":"ContainerDied","Data":"efccfc68853940b26a10e60efd12b773d02537052144d743224d9848c7e9df70"}
Dec 02 23:33:47 crc kubenswrapper[4903]: I1202 23:33:47.601037 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qgtnp" event={"ID":"c76c30b8-242b-41a2-8c02-4c8810759b19","Type":"ContainerStarted","Data":"c36dad823e2faa57db4684efc10c08b6be53f2ff75d7a3775215c2131fe49e42"}
Dec 02 23:33:47 crc kubenswrapper[4903]: I1202 23:33:47.640558 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qgtnp" podStartSLOduration=3.185373866 podStartE2EDuration="5.640532955s" podCreationTimestamp="2025-12-02 23:33:42 +0000 UTC" firstStartedPulling="2025-12-02 23:33:44.565085277 +0000 UTC m=+2163.273639600" lastFinishedPulling="2025-12-02 23:33:47.020244406 +0000 UTC m=+2165.728798689" observedRunningTime="2025-12-02 23:33:47.630592645 +0000 UTC m=+2166.339146928" watchObservedRunningTime="2025-12-02 23:33:47.640532955 +0000 UTC m=+2166.349087248"
Dec 02 23:33:52 crc kubenswrapper[4903]: I1202 23:33:52.792298 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qgtnp"
Dec 02 23:33:52 crc kubenswrapper[4903]: I1202 23:33:52.794334 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qgtnp"
Dec 02 23:33:52 crc kubenswrapper[4903]: I1202 23:33:52.846583 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qgtnp"
Dec 02 23:33:53 crc kubenswrapper[4903]: I1202 23:33:53.070428 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 23:33:53 crc kubenswrapper[4903]: I1202 23:33:53.070491 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 23:33:53 crc kubenswrapper[4903]: I1202 23:33:53.727592 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qgtnp"
Dec 02 23:33:53 crc kubenswrapper[4903]: I1202 23:33:53.791014 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qgtnp"]
Dec 02 23:33:55 crc kubenswrapper[4903]: I1202 23:33:55.687596 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qgtnp" podUID="c76c30b8-242b-41a2-8c02-4c8810759b19" containerName="registry-server" containerID="cri-o://c36dad823e2faa57db4684efc10c08b6be53f2ff75d7a3775215c2131fe49e42" gracePeriod=2
Dec 02 23:33:56 crc kubenswrapper[4903]: I1202 23:33:56.182815 4903 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qgtnp"
Dec 02 23:33:56 crc kubenswrapper[4903]: I1202 23:33:56.362838 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5b5xt\" (UniqueName: \"kubernetes.io/projected/c76c30b8-242b-41a2-8c02-4c8810759b19-kube-api-access-5b5xt\") pod \"c76c30b8-242b-41a2-8c02-4c8810759b19\" (UID: \"c76c30b8-242b-41a2-8c02-4c8810759b19\") "
Dec 02 23:33:56 crc kubenswrapper[4903]: I1202 23:33:56.363735 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c76c30b8-242b-41a2-8c02-4c8810759b19-utilities\") pod \"c76c30b8-242b-41a2-8c02-4c8810759b19\" (UID: \"c76c30b8-242b-41a2-8c02-4c8810759b19\") "
Dec 02 23:33:56 crc kubenswrapper[4903]: I1202 23:33:56.363794 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c76c30b8-242b-41a2-8c02-4c8810759b19-catalog-content\") pod \"c76c30b8-242b-41a2-8c02-4c8810759b19\" (UID: \"c76c30b8-242b-41a2-8c02-4c8810759b19\") "
Dec 02 23:33:56 crc kubenswrapper[4903]: I1202 23:33:56.365424 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c76c30b8-242b-41a2-8c02-4c8810759b19-utilities" (OuterVolumeSpecName: "utilities") pod "c76c30b8-242b-41a2-8c02-4c8810759b19" (UID: "c76c30b8-242b-41a2-8c02-4c8810759b19"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 23:33:56 crc kubenswrapper[4903]: I1202 23:33:56.369929 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c76c30b8-242b-41a2-8c02-4c8810759b19-kube-api-access-5b5xt" (OuterVolumeSpecName: "kube-api-access-5b5xt") pod "c76c30b8-242b-41a2-8c02-4c8810759b19" (UID: "c76c30b8-242b-41a2-8c02-4c8810759b19"). InnerVolumeSpecName "kube-api-access-5b5xt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 23:33:56 crc kubenswrapper[4903]: I1202 23:33:56.383972 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c76c30b8-242b-41a2-8c02-4c8810759b19-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c76c30b8-242b-41a2-8c02-4c8810759b19" (UID: "c76c30b8-242b-41a2-8c02-4c8810759b19"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 23:33:56 crc kubenswrapper[4903]: I1202 23:33:56.468023 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c76c30b8-242b-41a2-8c02-4c8810759b19-utilities\") on node \"crc\" DevicePath \"\""
Dec 02 23:33:56 crc kubenswrapper[4903]: I1202 23:33:56.468107 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c76c30b8-242b-41a2-8c02-4c8810759b19-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 02 23:33:56 crc kubenswrapper[4903]: I1202 23:33:56.468138 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5b5xt\" (UniqueName: \"kubernetes.io/projected/c76c30b8-242b-41a2-8c02-4c8810759b19-kube-api-access-5b5xt\") on node \"crc\" DevicePath \"\""
Dec 02 23:33:56 crc kubenswrapper[4903]: I1202 23:33:56.702766 4903 generic.go:334] "Generic (PLEG): container finished" podID="c76c30b8-242b-41a2-8c02-4c8810759b19" containerID="c36dad823e2faa57db4684efc10c08b6be53f2ff75d7a3775215c2131fe49e42" exitCode=0
Dec 02 23:33:56 crc kubenswrapper[4903]: I1202 23:33:56.702826 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qgtnp" event={"ID":"c76c30b8-242b-41a2-8c02-4c8810759b19","Type":"ContainerDied","Data":"c36dad823e2faa57db4684efc10c08b6be53f2ff75d7a3775215c2131fe49e42"}
Dec 02 23:33:56 crc kubenswrapper[4903]: I1202 23:33:56.702866 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qgtnp"
Dec 02 23:33:56 crc kubenswrapper[4903]: I1202 23:33:56.702893 4903 scope.go:117] "RemoveContainer" containerID="c36dad823e2faa57db4684efc10c08b6be53f2ff75d7a3775215c2131fe49e42"
Dec 02 23:33:56 crc kubenswrapper[4903]: I1202 23:33:56.702876 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qgtnp" event={"ID":"c76c30b8-242b-41a2-8c02-4c8810759b19","Type":"ContainerDied","Data":"2676857a5c8ef1f9b8a8e92e8660276a12d273ca0d0f1f4c92c5cf726197d0b2"}
Dec 02 23:33:56 crc kubenswrapper[4903]: I1202 23:33:56.734576 4903 scope.go:117] "RemoveContainer" containerID="efccfc68853940b26a10e60efd12b773d02537052144d743224d9848c7e9df70"
Dec 02 23:33:56 crc kubenswrapper[4903]: I1202 23:33:56.759555 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qgtnp"]
Dec 02 23:33:56 crc kubenswrapper[4903]: I1202 23:33:56.773095 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qgtnp"]
Dec 02 23:33:56 crc kubenswrapper[4903]: I1202 23:33:56.782426 4903 scope.go:117] "RemoveContainer" containerID="c621ce138a40c7dd36f9b2946d42463ad89a167550d4207710c904f571f1f593"
Dec 02 23:33:56 crc kubenswrapper[4903]: I1202 23:33:56.843621 4903 scope.go:117] "RemoveContainer" containerID="c36dad823e2faa57db4684efc10c08b6be53f2ff75d7a3775215c2131fe49e42"
Dec 02 23:33:56 crc kubenswrapper[4903]: E1202 23:33:56.845231 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c36dad823e2faa57db4684efc10c08b6be53f2ff75d7a3775215c2131fe49e42\": container with ID starting with c36dad823e2faa57db4684efc10c08b6be53f2ff75d7a3775215c2131fe49e42 not found: ID does not exist" containerID="c36dad823e2faa57db4684efc10c08b6be53f2ff75d7a3775215c2131fe49e42"
Dec 02 23:33:56 crc kubenswrapper[4903]: I1202 23:33:56.845283 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c36dad823e2faa57db4684efc10c08b6be53f2ff75d7a3775215c2131fe49e42"} err="failed to get container status \"c36dad823e2faa57db4684efc10c08b6be53f2ff75d7a3775215c2131fe49e42\": rpc error: code = NotFound desc = could not find container \"c36dad823e2faa57db4684efc10c08b6be53f2ff75d7a3775215c2131fe49e42\": container with ID starting with c36dad823e2faa57db4684efc10c08b6be53f2ff75d7a3775215c2131fe49e42 not found: ID does not exist"
Dec 02 23:33:56 crc kubenswrapper[4903]: I1202 23:33:56.845311 4903 scope.go:117] "RemoveContainer" containerID="efccfc68853940b26a10e60efd12b773d02537052144d743224d9848c7e9df70"
Dec 02 23:33:56 crc kubenswrapper[4903]: E1202 23:33:56.845985 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efccfc68853940b26a10e60efd12b773d02537052144d743224d9848c7e9df70\": container with ID starting with efccfc68853940b26a10e60efd12b773d02537052144d743224d9848c7e9df70 not found: ID does not exist" containerID="efccfc68853940b26a10e60efd12b773d02537052144d743224d9848c7e9df70"
Dec 02 23:33:56 crc kubenswrapper[4903]: I1202 23:33:56.846022 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efccfc68853940b26a10e60efd12b773d02537052144d743224d9848c7e9df70"} err="failed to get container status \"efccfc68853940b26a10e60efd12b773d02537052144d743224d9848c7e9df70\": rpc error: code = NotFound desc = could not find container \"efccfc68853940b26a10e60efd12b773d02537052144d743224d9848c7e9df70\": container with ID starting with efccfc68853940b26a10e60efd12b773d02537052144d743224d9848c7e9df70 not found: ID does not exist"
Dec 02 23:33:56 crc kubenswrapper[4903]: I1202 23:33:56.846054 4903 scope.go:117] "RemoveContainer" containerID="c621ce138a40c7dd36f9b2946d42463ad89a167550d4207710c904f571f1f593"
Dec 02 23:33:56 crc kubenswrapper[4903]: E1202 23:33:56.846552 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c621ce138a40c7dd36f9b2946d42463ad89a167550d4207710c904f571f1f593\": container with ID starting with c621ce138a40c7dd36f9b2946d42463ad89a167550d4207710c904f571f1f593 not found: ID does not exist" containerID="c621ce138a40c7dd36f9b2946d42463ad89a167550d4207710c904f571f1f593"
Dec 02 23:33:56 crc kubenswrapper[4903]: I1202 23:33:56.846585 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c621ce138a40c7dd36f9b2946d42463ad89a167550d4207710c904f571f1f593"} err="failed to get container status \"c621ce138a40c7dd36f9b2946d42463ad89a167550d4207710c904f571f1f593\": rpc error: code = NotFound desc = could not find container \"c621ce138a40c7dd36f9b2946d42463ad89a167550d4207710c904f571f1f593\": container with ID starting with c621ce138a40c7dd36f9b2946d42463ad89a167550d4207710c904f571f1f593 not found: ID does not exist"
Dec 02 23:33:57 crc kubenswrapper[4903]: I1202 23:33:57.629199 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c76c30b8-242b-41a2-8c02-4c8810759b19" path="/var/lib/kubelet/pods/c76c30b8-242b-41a2-8c02-4c8810759b19/volumes"
Dec 02 23:34:00 crc kubenswrapper[4903]: I1202 23:34:00.759574 4903 generic.go:334] "Generic (PLEG): container finished" podID="c0b03ee1-07d8-4d8e-b047-480a4dd369f0" containerID="84edd3dd2baa8a9f392dd5999bcbf97f405bf87181725ff011cec3dc327a78ca" exitCode=0
Dec 02 23:34:00 crc kubenswrapper[4903]: I1202 23:34:00.759647 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz" event={"ID":"c0b03ee1-07d8-4d8e-b047-480a4dd369f0","Type":"ContainerDied","Data":"84edd3dd2baa8a9f392dd5999bcbf97f405bf87181725ff011cec3dc327a78ca"}
Dec 02 23:34:02 crc kubenswrapper[4903]: I1202 23:34:02.294026 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz"
Dec 02 23:34:02 crc kubenswrapper[4903]: I1202 23:34:02.402616 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c0b03ee1-07d8-4d8e-b047-480a4dd369f0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"c0b03ee1-07d8-4d8e-b047-480a4dd369f0\" (UID: \"c0b03ee1-07d8-4d8e-b047-480a4dd369f0\") "
Dec 02 23:34:02 crc kubenswrapper[4903]: I1202 23:34:02.402675 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c0b03ee1-07d8-4d8e-b047-480a4dd369f0-ssh-key\") pod \"c0b03ee1-07d8-4d8e-b047-480a4dd369f0\" (UID: \"c0b03ee1-07d8-4d8e-b047-480a4dd369f0\") "
Dec 02 23:34:02 crc kubenswrapper[4903]: I1202 23:34:02.402713 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0b03ee1-07d8-4d8e-b047-480a4dd369f0-neutron-metadata-combined-ca-bundle\") pod \"c0b03ee1-07d8-4d8e-b047-480a4dd369f0\" (UID: \"c0b03ee1-07d8-4d8e-b047-480a4dd369f0\") "
Dec 02 23:34:02 crc kubenswrapper[4903]: I1202 23:34:02.402811 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0b03ee1-07d8-4d8e-b047-480a4dd369f0-inventory\") pod \"c0b03ee1-07d8-4d8e-b047-480a4dd369f0\" (UID: \"c0b03ee1-07d8-4d8e-b047-480a4dd369f0\") "
Dec 02 23:34:02 crc kubenswrapper[4903]: I1202 23:34:02.402918 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6z78d\" (UniqueName: \"kubernetes.io/projected/c0b03ee1-07d8-4d8e-b047-480a4dd369f0-kube-api-access-6z78d\") pod \"c0b03ee1-07d8-4d8e-b047-480a4dd369f0\" (UID: \"c0b03ee1-07d8-4d8e-b047-480a4dd369f0\") "
Dec 02 23:34:02 crc kubenswrapper[4903]: I1202 23:34:02.403034 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c0b03ee1-07d8-4d8e-b047-480a4dd369f0-nova-metadata-neutron-config-0\") pod \"c0b03ee1-07d8-4d8e-b047-480a4dd369f0\" (UID: \"c0b03ee1-07d8-4d8e-b047-480a4dd369f0\") "
Dec 02 23:34:02 crc kubenswrapper[4903]: I1202 23:34:02.410620 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0b03ee1-07d8-4d8e-b047-480a4dd369f0-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "c0b03ee1-07d8-4d8e-b047-480a4dd369f0" (UID: "c0b03ee1-07d8-4d8e-b047-480a4dd369f0"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:34:02 crc kubenswrapper[4903]: I1202 23:34:02.410641 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0b03ee1-07d8-4d8e-b047-480a4dd369f0-kube-api-access-6z78d" (OuterVolumeSpecName: "kube-api-access-6z78d") pod "c0b03ee1-07d8-4d8e-b047-480a4dd369f0" (UID: "c0b03ee1-07d8-4d8e-b047-480a4dd369f0"). InnerVolumeSpecName "kube-api-access-6z78d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 23:34:02 crc kubenswrapper[4903]: I1202 23:34:02.432438 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0b03ee1-07d8-4d8e-b047-480a4dd369f0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c0b03ee1-07d8-4d8e-b047-480a4dd369f0" (UID: "c0b03ee1-07d8-4d8e-b047-480a4dd369f0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:34:02 crc kubenswrapper[4903]: I1202 23:34:02.432828 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0b03ee1-07d8-4d8e-b047-480a4dd369f0-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "c0b03ee1-07d8-4d8e-b047-480a4dd369f0" (UID: "c0b03ee1-07d8-4d8e-b047-480a4dd369f0"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:34:02 crc kubenswrapper[4903]: I1202 23:34:02.438413 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0b03ee1-07d8-4d8e-b047-480a4dd369f0-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "c0b03ee1-07d8-4d8e-b047-480a4dd369f0" (UID: "c0b03ee1-07d8-4d8e-b047-480a4dd369f0"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:34:02 crc kubenswrapper[4903]: I1202 23:34:02.443567 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0b03ee1-07d8-4d8e-b047-480a4dd369f0-inventory" (OuterVolumeSpecName: "inventory") pod "c0b03ee1-07d8-4d8e-b047-480a4dd369f0" (UID: "c0b03ee1-07d8-4d8e-b047-480a4dd369f0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 23:34:02 crc kubenswrapper[4903]: I1202 23:34:02.505982 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6z78d\" (UniqueName: \"kubernetes.io/projected/c0b03ee1-07d8-4d8e-b047-480a4dd369f0-kube-api-access-6z78d\") on node \"crc\" DevicePath \"\""
Dec 02 23:34:02 crc kubenswrapper[4903]: I1202 23:34:02.506021 4903 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c0b03ee1-07d8-4d8e-b047-480a4dd369f0-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\""
Dec 02 23:34:02 crc kubenswrapper[4903]: I1202 23:34:02.506036 4903 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c0b03ee1-07d8-4d8e-b047-480a4dd369f0-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\""
Dec 02 23:34:02 crc kubenswrapper[4903]: I1202 23:34:02.506050 4903 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c0b03ee1-07d8-4d8e-b047-480a4dd369f0-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 02 23:34:02 crc kubenswrapper[4903]: I1202 23:34:02.506064 4903 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0b03ee1-07d8-4d8e-b047-480a4dd369f0-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 23:34:02 crc kubenswrapper[4903]: I1202 23:34:02.506076 4903 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0b03ee1-07d8-4d8e-b047-480a4dd369f0-inventory\") on node \"crc\" DevicePath \"\""
Dec 02 23:34:02 crc kubenswrapper[4903]: I1202 23:34:02.788244 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz" event={"ID":"c0b03ee1-07d8-4d8e-b047-480a4dd369f0","Type":"ContainerDied","Data":"44dac1e8be5e8da21268042a660fadd29cd9d9277222dfe4420a67b4ea62e38b"}
Dec 02 23:34:02 crc kubenswrapper[4903]: I1202 23:34:02.788301 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz"
Dec 02 23:34:02 crc kubenswrapper[4903]: I1202 23:34:02.788309 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44dac1e8be5e8da21268042a660fadd29cd9d9277222dfe4420a67b4ea62e38b"
Dec 02 23:34:02 crc kubenswrapper[4903]: I1202 23:34:02.917164 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6s9st"]
Dec 02 23:34:02 crc kubenswrapper[4903]: E1202 23:34:02.917625 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c76c30b8-242b-41a2-8c02-4c8810759b19" containerName="extract-utilities"
Dec 02 23:34:02 crc kubenswrapper[4903]: I1202 23:34:02.917647 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="c76c30b8-242b-41a2-8c02-4c8810759b19" containerName="extract-utilities"
Dec 02 23:34:02 crc kubenswrapper[4903]: E1202 23:34:02.917689 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c76c30b8-242b-41a2-8c02-4c8810759b19" containerName="registry-server"
Dec 02 23:34:02 crc kubenswrapper[4903]: I1202 23:34:02.917697 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="c76c30b8-242b-41a2-8c02-4c8810759b19" containerName="registry-server"
Dec 02 23:34:02 crc kubenswrapper[4903]: E1202 23:34:02.917710 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0b03ee1-07d8-4d8e-b047-480a4dd369f0" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Dec 02 23:34:02 crc kubenswrapper[4903]: I1202 23:34:02.917720 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0b03ee1-07d8-4d8e-b047-480a4dd369f0" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Dec 02 23:34:02 crc kubenswrapper[4903]: E1202 23:34:02.917729 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c76c30b8-242b-41a2-8c02-4c8810759b19" containerName="extract-content"
Dec 02 23:34:02 crc kubenswrapper[4903]: I1202 23:34:02.917737 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="c76c30b8-242b-41a2-8c02-4c8810759b19" containerName="extract-content"
Dec 02 23:34:02 crc kubenswrapper[4903]: I1202 23:34:02.917995 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0b03ee1-07d8-4d8e-b047-480a4dd369f0" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Dec 02 23:34:02 crc kubenswrapper[4903]: I1202 23:34:02.918024 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="c76c30b8-242b-41a2-8c02-4c8810759b19" containerName="registry-server"
Dec 02 23:34:02 crc kubenswrapper[4903]: I1202 23:34:02.918826 4903 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6s9st" Dec 02 23:34:02 crc kubenswrapper[4903]: I1202 23:34:02.927244 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 23:34:02 crc kubenswrapper[4903]: I1202 23:34:02.927295 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9z6rx" Dec 02 23:34:02 crc kubenswrapper[4903]: I1202 23:34:02.927836 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 02 23:34:02 crc kubenswrapper[4903]: I1202 23:34:02.927944 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 23:34:02 crc kubenswrapper[4903]: I1202 23:34:02.928063 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 23:34:02 crc kubenswrapper[4903]: I1202 23:34:02.937506 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6s9st"] Dec 02 23:34:03 crc kubenswrapper[4903]: I1202 23:34:03.025184 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c9427e93-561b-4f09-bcec-00c7001f2541-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6s9st\" (UID: \"c9427e93-561b-4f09-bcec-00c7001f2541\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6s9st" Dec 02 23:34:03 crc kubenswrapper[4903]: I1202 23:34:03.025752 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c9427e93-561b-4f09-bcec-00c7001f2541-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6s9st\" (UID: \"c9427e93-561b-4f09-bcec-00c7001f2541\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6s9st" Dec 02 23:34:03 crc kubenswrapper[4903]: I1202 23:34:03.025876 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9427e93-561b-4f09-bcec-00c7001f2541-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6s9st\" (UID: \"c9427e93-561b-4f09-bcec-00c7001f2541\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6s9st" Dec 02 23:34:03 crc kubenswrapper[4903]: I1202 23:34:03.025939 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tzr2\" (UniqueName: \"kubernetes.io/projected/c9427e93-561b-4f09-bcec-00c7001f2541-kube-api-access-4tzr2\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6s9st\" (UID: \"c9427e93-561b-4f09-bcec-00c7001f2541\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6s9st" Dec 02 23:34:03 crc kubenswrapper[4903]: I1202 23:34:03.026104 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c9427e93-561b-4f09-bcec-00c7001f2541-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6s9st\" (UID: \"c9427e93-561b-4f09-bcec-00c7001f2541\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6s9st" Dec 02 23:34:03 crc kubenswrapper[4903]: I1202 23:34:03.127951 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/c9427e93-561b-4f09-bcec-00c7001f2541-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6s9st\" (UID: \"c9427e93-561b-4f09-bcec-00c7001f2541\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6s9st" Dec 02 23:34:03 crc kubenswrapper[4903]: I1202 23:34:03.128121 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9427e93-561b-4f09-bcec-00c7001f2541-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6s9st\" (UID: \"c9427e93-561b-4f09-bcec-00c7001f2541\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6s9st" Dec 02 23:34:03 crc kubenswrapper[4903]: I1202 23:34:03.128163 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tzr2\" (UniqueName: \"kubernetes.io/projected/c9427e93-561b-4f09-bcec-00c7001f2541-kube-api-access-4tzr2\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6s9st\" (UID: \"c9427e93-561b-4f09-bcec-00c7001f2541\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6s9st" Dec 02 23:34:03 crc kubenswrapper[4903]: I1202 23:34:03.128265 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c9427e93-561b-4f09-bcec-00c7001f2541-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6s9st\" (UID: \"c9427e93-561b-4f09-bcec-00c7001f2541\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6s9st" Dec 02 23:34:03 crc kubenswrapper[4903]: I1202 23:34:03.128339 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c9427e93-561b-4f09-bcec-00c7001f2541-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6s9st\" (UID: \"c9427e93-561b-4f09-bcec-00c7001f2541\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6s9st" Dec 02 23:34:03 crc kubenswrapper[4903]: I1202 23:34:03.136679 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9427e93-561b-4f09-bcec-00c7001f2541-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6s9st\" (UID: \"c9427e93-561b-4f09-bcec-00c7001f2541\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6s9st" Dec 02 23:34:03 crc kubenswrapper[4903]: I1202 23:34:03.137921 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c9427e93-561b-4f09-bcec-00c7001f2541-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6s9st\" (UID: \"c9427e93-561b-4f09-bcec-00c7001f2541\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6s9st" Dec 02 23:34:03 crc kubenswrapper[4903]: I1202 23:34:03.138353 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c9427e93-561b-4f09-bcec-00c7001f2541-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6s9st\" (UID: \"c9427e93-561b-4f09-bcec-00c7001f2541\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6s9st" Dec 02 23:34:03 crc kubenswrapper[4903]: I1202 23:34:03.138374 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c9427e93-561b-4f09-bcec-00c7001f2541-ssh-key\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-6s9st\" (UID: \"c9427e93-561b-4f09-bcec-00c7001f2541\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6s9st" Dec 02 23:34:03 crc kubenswrapper[4903]: I1202 23:34:03.154544 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tzr2\" (UniqueName: \"kubernetes.io/projected/c9427e93-561b-4f09-bcec-00c7001f2541-kube-api-access-4tzr2\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6s9st\" (UID: \"c9427e93-561b-4f09-bcec-00c7001f2541\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6s9st" Dec 02 23:34:03 crc kubenswrapper[4903]: I1202 23:34:03.250684 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6s9st" Dec 02 23:34:03 crc kubenswrapper[4903]: I1202 23:34:03.868879 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6s9st"] Dec 02 23:34:04 crc kubenswrapper[4903]: I1202 23:34:04.811964 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6s9st" event={"ID":"c9427e93-561b-4f09-bcec-00c7001f2541","Type":"ContainerStarted","Data":"d4d7d2ff6e578e5ce57ac3a2a0e159de501fd4812fad7926b4dd2dc8026a55ea"} Dec 02 23:34:04 crc kubenswrapper[4903]: I1202 23:34:04.812264 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6s9st" event={"ID":"c9427e93-561b-4f09-bcec-00c7001f2541","Type":"ContainerStarted","Data":"77e9898d24bc6bb787f06b4a536c1dbb19b66e8d8cd0692b570936a60d17626d"} Dec 02 23:34:04 crc kubenswrapper[4903]: I1202 23:34:04.839962 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6s9st" podStartSLOduration=2.2986621769999998 podStartE2EDuration="2.839943595s" podCreationTimestamp="2025-12-02 23:34:02 +0000 UTC" firstStartedPulling="2025-12-02 23:34:03.869772945 +0000 UTC m=+2182.578327268" lastFinishedPulling="2025-12-02 23:34:04.411054373 +0000 UTC m=+2183.119608686" observedRunningTime="2025-12-02 23:34:04.83353846 +0000 UTC m=+2183.542092743" watchObservedRunningTime="2025-12-02 23:34:04.839943595 +0000 UTC m=+2183.548497878" Dec 02 23:34:13 crc kubenswrapper[4903]: I1202 23:34:13.909990 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sb79n"] Dec 02 23:34:13 crc kubenswrapper[4903]: I1202 23:34:13.912839 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sb79n" Dec 02 23:34:13 crc kubenswrapper[4903]: I1202 23:34:13.942320 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sb79n"] Dec 02 23:34:14 crc kubenswrapper[4903]: I1202 23:34:14.083306 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/927ef157-9d8d-444e-ad6f-d24d853beb7f-utilities\") pod \"redhat-operators-sb79n\" (UID: \"927ef157-9d8d-444e-ad6f-d24d853beb7f\") " pod="openshift-marketplace/redhat-operators-sb79n" Dec 02 23:34:14 crc kubenswrapper[4903]: I1202 23:34:14.083413 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppj8n\" (UniqueName: \"kubernetes.io/projected/927ef157-9d8d-444e-ad6f-d24d853beb7f-kube-api-access-ppj8n\") pod \"redhat-operators-sb79n\" (UID: \"927ef157-9d8d-444e-ad6f-d24d853beb7f\") " pod="openshift-marketplace/redhat-operators-sb79n" Dec 02 23:34:14 crc kubenswrapper[4903]: I1202 23:34:14.083480 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/927ef157-9d8d-444e-ad6f-d24d853beb7f-catalog-content\") pod \"redhat-operators-sb79n\" (UID: \"927ef157-9d8d-444e-ad6f-d24d853beb7f\") " pod="openshift-marketplace/redhat-operators-sb79n" Dec 02 23:34:14 crc kubenswrapper[4903]: I1202 23:34:14.185232 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppj8n\" (UniqueName: \"kubernetes.io/projected/927ef157-9d8d-444e-ad6f-d24d853beb7f-kube-api-access-ppj8n\") pod \"redhat-operators-sb79n\" (UID: \"927ef157-9d8d-444e-ad6f-d24d853beb7f\") " pod="openshift-marketplace/redhat-operators-sb79n" Dec 02 23:34:14 crc kubenswrapper[4903]: I1202 23:34:14.185342 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/927ef157-9d8d-444e-ad6f-d24d853beb7f-catalog-content\") pod \"redhat-operators-sb79n\" (UID: \"927ef157-9d8d-444e-ad6f-d24d853beb7f\") " pod="openshift-marketplace/redhat-operators-sb79n" Dec 02 23:34:14 crc kubenswrapper[4903]: I1202 23:34:14.185477 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/927ef157-9d8d-444e-ad6f-d24d853beb7f-utilities\") pod \"redhat-operators-sb79n\" (UID: \"927ef157-9d8d-444e-ad6f-d24d853beb7f\") " pod="openshift-marketplace/redhat-operators-sb79n" Dec 02 23:34:14 crc kubenswrapper[4903]: I1202 23:34:14.185987 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/927ef157-9d8d-444e-ad6f-d24d853beb7f-catalog-content\") pod \"redhat-operators-sb79n\" (UID: \"927ef157-9d8d-444e-ad6f-d24d853beb7f\") " pod="openshift-marketplace/redhat-operators-sb79n" Dec 02 23:34:14 crc kubenswrapper[4903]: I1202 23:34:14.186051 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/927ef157-9d8d-444e-ad6f-d24d853beb7f-utilities\") pod \"redhat-operators-sb79n\" (UID: \"927ef157-9d8d-444e-ad6f-d24d853beb7f\") " pod="openshift-marketplace/redhat-operators-sb79n" Dec 02 23:34:14 crc kubenswrapper[4903]: I1202 23:34:14.208554 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ppj8n\" (UniqueName: \"kubernetes.io/projected/927ef157-9d8d-444e-ad6f-d24d853beb7f-kube-api-access-ppj8n\") pod \"redhat-operators-sb79n\" (UID: \"927ef157-9d8d-444e-ad6f-d24d853beb7f\") " pod="openshift-marketplace/redhat-operators-sb79n" Dec 02 23:34:14 crc kubenswrapper[4903]: I1202 23:34:14.258603 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sb79n" Dec 02 23:34:14 crc kubenswrapper[4903]: I1202 23:34:14.741845 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sb79n"] Dec 02 23:34:14 crc kubenswrapper[4903]: I1202 23:34:14.931265 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sb79n" event={"ID":"927ef157-9d8d-444e-ad6f-d24d853beb7f","Type":"ContainerStarted","Data":"770a20124c4dd23d2355ee5a963e93b077f519e61a2783858d99ef025e751247"} Dec 02 23:34:15 crc kubenswrapper[4903]: I1202 23:34:15.946302 4903 generic.go:334] "Generic (PLEG): container finished" podID="927ef157-9d8d-444e-ad6f-d24d853beb7f" containerID="f5dbbbad7f822e5ffed68cd8d50076acf2ffe6f6e6d18023b89c5a8518e674f3" exitCode=0 Dec 02 23:34:15 crc kubenswrapper[4903]: I1202 23:34:15.946414 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sb79n" event={"ID":"927ef157-9d8d-444e-ad6f-d24d853beb7f","Type":"ContainerDied","Data":"f5dbbbad7f822e5ffed68cd8d50076acf2ffe6f6e6d18023b89c5a8518e674f3"} Dec 02 23:34:15 crc kubenswrapper[4903]: I1202 23:34:15.950431 4903 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 23:34:16 crc kubenswrapper[4903]: I1202 23:34:16.485881 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vs25k"] Dec 02 23:34:16 crc kubenswrapper[4903]: I1202 23:34:16.490117 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vs25k" Dec 02 23:34:16 crc kubenswrapper[4903]: I1202 23:34:16.501346 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vs25k"] Dec 02 23:34:16 crc kubenswrapper[4903]: I1202 23:34:16.642097 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgkrg\" (UniqueName: \"kubernetes.io/projected/be235e33-fdd1-46a4-aa9e-be70794d9e89-kube-api-access-dgkrg\") pod \"community-operators-vs25k\" (UID: \"be235e33-fdd1-46a4-aa9e-be70794d9e89\") " pod="openshift-marketplace/community-operators-vs25k" Dec 02 23:34:16 crc kubenswrapper[4903]: I1202 23:34:16.642263 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be235e33-fdd1-46a4-aa9e-be70794d9e89-catalog-content\") pod \"community-operators-vs25k\" (UID: \"be235e33-fdd1-46a4-aa9e-be70794d9e89\") " pod="openshift-marketplace/community-operators-vs25k" Dec 02 23:34:16 crc kubenswrapper[4903]: I1202 23:34:16.642433 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be235e33-fdd1-46a4-aa9e-be70794d9e89-utilities\") pod \"community-operators-vs25k\" (UID: \"be235e33-fdd1-46a4-aa9e-be70794d9e89\") " pod="openshift-marketplace/community-operators-vs25k" Dec 02 23:34:16 crc kubenswrapper[4903]: I1202 23:34:16.745064 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be235e33-fdd1-46a4-aa9e-be70794d9e89-catalog-content\") pod \"community-operators-vs25k\" (UID: \"be235e33-fdd1-46a4-aa9e-be70794d9e89\") " pod="openshift-marketplace/community-operators-vs25k" Dec 02 23:34:16 crc kubenswrapper[4903]: I1202 23:34:16.745239 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be235e33-fdd1-46a4-aa9e-be70794d9e89-utilities\") pod \"community-operators-vs25k\" (UID: \"be235e33-fdd1-46a4-aa9e-be70794d9e89\") " pod="openshift-marketplace/community-operators-vs25k" Dec 02 23:34:16 crc kubenswrapper[4903]: I1202 23:34:16.745343 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgkrg\" (UniqueName: \"kubernetes.io/projected/be235e33-fdd1-46a4-aa9e-be70794d9e89-kube-api-access-dgkrg\") pod \"community-operators-vs25k\" (UID: \"be235e33-fdd1-46a4-aa9e-be70794d9e89\") " pod="openshift-marketplace/community-operators-vs25k" Dec 02 23:34:16 crc kubenswrapper[4903]: I1202 23:34:16.747190 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be235e33-fdd1-46a4-aa9e-be70794d9e89-utilities\") pod \"community-operators-vs25k\" (UID: \"be235e33-fdd1-46a4-aa9e-be70794d9e89\") " pod="openshift-marketplace/community-operators-vs25k" Dec 02 23:34:16 crc kubenswrapper[4903]: I1202 23:34:16.747408 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be235e33-fdd1-46a4-aa9e-be70794d9e89-catalog-content\") pod \"community-operators-vs25k\" (UID: \"be235e33-fdd1-46a4-aa9e-be70794d9e89\") " pod="openshift-marketplace/community-operators-vs25k" Dec 02 23:34:16 crc kubenswrapper[4903]: I1202 23:34:16.782087 4903 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-dgkrg\" (UniqueName: \"kubernetes.io/projected/be235e33-fdd1-46a4-aa9e-be70794d9e89-kube-api-access-dgkrg\") pod \"community-operators-vs25k\" (UID: \"be235e33-fdd1-46a4-aa9e-be70794d9e89\") " pod="openshift-marketplace/community-operators-vs25k" Dec 02 23:34:16 crc kubenswrapper[4903]: I1202 23:34:16.819082 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vs25k" Dec 02 23:34:16 crc kubenswrapper[4903]: I1202 23:34:16.967409 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sb79n" event={"ID":"927ef157-9d8d-444e-ad6f-d24d853beb7f","Type":"ContainerStarted","Data":"f85cd8615751b3fd4100d099336a5ab720873141f21c9591ce95fd761e390e79"} Dec 02 23:34:17 crc kubenswrapper[4903]: I1202 23:34:17.370365 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vs25k"] Dec 02 23:34:17 crc kubenswrapper[4903]: W1202 23:34:17.376307 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe235e33_fdd1_46a4_aa9e_be70794d9e89.slice/crio-9ba5253b25167ab10d699feb5341ca972f093d71cf9d0abe01535cb64ac6beae WatchSource:0}: Error finding container 9ba5253b25167ab10d699feb5341ca972f093d71cf9d0abe01535cb64ac6beae: Status 404 returned error can't find the container with id 9ba5253b25167ab10d699feb5341ca972f093d71cf9d0abe01535cb64ac6beae Dec 02 23:34:17 crc kubenswrapper[4903]: I1202 23:34:17.979793 4903 generic.go:334] "Generic (PLEG): container finished" podID="be235e33-fdd1-46a4-aa9e-be70794d9e89" containerID="0e9e9f7aeab637377bf645edcf3bb858686a299086759d58bcf646d98a08947e" exitCode=0 Dec 02 23:34:17 crc kubenswrapper[4903]: I1202 23:34:17.979910 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vs25k" event={"ID":"be235e33-fdd1-46a4-aa9e-be70794d9e89","Type":"ContainerDied","Data":"0e9e9f7aeab637377bf645edcf3bb858686a299086759d58bcf646d98a08947e"} Dec 02 23:34:17 crc kubenswrapper[4903]: I1202 23:34:17.980235 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vs25k" event={"ID":"be235e33-fdd1-46a4-aa9e-be70794d9e89","Type":"ContainerStarted","Data":"9ba5253b25167ab10d699feb5341ca972f093d71cf9d0abe01535cb64ac6beae"} Dec 02 23:34:20 crc kubenswrapper[4903]: I1202 23:34:20.011263 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vs25k" event={"ID":"be235e33-fdd1-46a4-aa9e-be70794d9e89","Type":"ContainerStarted","Data":"a63dbaaf726309aa1d7c55bd18a7e28c04aa5d1004e2be46a3c3733987312db9"} Dec 02 23:34:20 crc kubenswrapper[4903]: E1202 23:34:20.426521 4903 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod927ef157_9d8d_444e_ad6f_d24d853beb7f.slice/crio-f85cd8615751b3fd4100d099336a5ab720873141f21c9591ce95fd761e390e79.scope\": RecentStats: unable to find data in memory cache]" Dec 02 23:34:21 crc kubenswrapper[4903]: I1202 23:34:21.026312 4903 generic.go:334] "Generic (PLEG): container finished" podID="be235e33-fdd1-46a4-aa9e-be70794d9e89" containerID="a63dbaaf726309aa1d7c55bd18a7e28c04aa5d1004e2be46a3c3733987312db9" exitCode=0 Dec 02 23:34:21 crc kubenswrapper[4903]: I1202 23:34:21.026377 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-vs25k" event={"ID":"be235e33-fdd1-46a4-aa9e-be70794d9e89","Type":"ContainerDied","Data":"a63dbaaf726309aa1d7c55bd18a7e28c04aa5d1004e2be46a3c3733987312db9"} Dec 02 23:34:21 crc kubenswrapper[4903]: I1202 23:34:21.029241 4903 generic.go:334] "Generic (PLEG): container finished" podID="927ef157-9d8d-444e-ad6f-d24d853beb7f" containerID="f85cd8615751b3fd4100d099336a5ab720873141f21c9591ce95fd761e390e79" exitCode=0 Dec 02 23:34:21 crc kubenswrapper[4903]: I1202 23:34:21.029308 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sb79n" event={"ID":"927ef157-9d8d-444e-ad6f-d24d853beb7f","Type":"ContainerDied","Data":"f85cd8615751b3fd4100d099336a5ab720873141f21c9591ce95fd761e390e79"} Dec 02 23:34:22 crc kubenswrapper[4903]: I1202 23:34:22.043582 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vs25k" event={"ID":"be235e33-fdd1-46a4-aa9e-be70794d9e89","Type":"ContainerStarted","Data":"3f205ff1d94374eaf8eae99927606bab076af5acc9940216dc2ef833c7dad51f"} Dec 02 23:34:22 crc kubenswrapper[4903]: I1202 23:34:22.048950 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sb79n" event={"ID":"927ef157-9d8d-444e-ad6f-d24d853beb7f","Type":"ContainerStarted","Data":"9f29dfc5c5a17e1d40ab498659726d1bf0adf6dbb2a347087e8b85ced16c6cae"} Dec 02 23:34:22 crc kubenswrapper[4903]: I1202 23:34:22.066799 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vs25k" podStartSLOduration=2.360879773 podStartE2EDuration="6.066782208s" podCreationTimestamp="2025-12-02 23:34:16 +0000 UTC" firstStartedPulling="2025-12-02 23:34:17.982023143 +0000 UTC m=+2196.690577426" lastFinishedPulling="2025-12-02 23:34:21.687925568 +0000 UTC m=+2200.396479861" observedRunningTime="2025-12-02 23:34:22.062051184 +0000 UTC m=+2200.770605467" watchObservedRunningTime="2025-12-02 23:34:22.066782208 +0000 UTC m=+2200.775336491" Dec 02 23:34:22 crc kubenswrapper[4903]: I1202 23:34:22.080170 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sb79n" podStartSLOduration=3.394483825 podStartE2EDuration="9.080150642s" podCreationTimestamp="2025-12-02 23:34:13 +0000 UTC" firstStartedPulling="2025-12-02 23:34:15.950097688 +0000 UTC m=+2194.658651981" lastFinishedPulling="2025-12-02 23:34:21.635764495 +0000 UTC m=+2200.344318798" observedRunningTime="2025-12-02 23:34:22.077380035 +0000 UTC m=+2200.785934318" watchObservedRunningTime="2025-12-02 23:34:22.080150642 +0000 UTC m=+2200.788704925" Dec 02 23:34:23 crc kubenswrapper[4903]: I1202 23:34:23.070135 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:34:23 crc kubenswrapper[4903]: I1202 23:34:23.070415 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:34:24 crc kubenswrapper[4903]: I1202 23:34:24.259867 4903 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sb79n" Dec 02 23:34:24 crc kubenswrapper[4903]: I1202 23:34:24.259952 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sb79n" Dec 02 23:34:25 crc kubenswrapper[4903]: I1202 23:34:25.327019 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sb79n" podUID="927ef157-9d8d-444e-ad6f-d24d853beb7f" containerName="registry-server" probeResult="failure" output=< Dec 02 23:34:25 crc kubenswrapper[4903]: timeout: failed to connect service ":50051" within 1s Dec 02 23:34:25 crc kubenswrapper[4903]: > Dec 02 23:34:26 crc kubenswrapper[4903]: I1202 23:34:26.820569 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vs25k" Dec 02 23:34:26 crc kubenswrapper[4903]: I1202 23:34:26.820789 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vs25k" Dec 02 23:34:26 crc kubenswrapper[4903]: I1202 23:34:26.887512 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vs25k" Dec 02 23:34:27 crc kubenswrapper[4903]: I1202 23:34:27.191997 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vs25k" Dec 02 23:34:27 crc kubenswrapper[4903]: I1202 23:34:27.463612 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vs25k"] Dec 02 23:34:29 crc kubenswrapper[4903]: I1202 23:34:29.149021 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vs25k" podUID="be235e33-fdd1-46a4-aa9e-be70794d9e89" containerName="registry-server" containerID="cri-o://3f205ff1d94374eaf8eae99927606bab076af5acc9940216dc2ef833c7dad51f" gracePeriod=2 Dec 02 23:34:29 crc kubenswrapper[4903]: I1202 23:34:29.605839 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vs25k" Dec 02 23:34:29 crc kubenswrapper[4903]: I1202 23:34:29.742090 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be235e33-fdd1-46a4-aa9e-be70794d9e89-catalog-content\") pod \"be235e33-fdd1-46a4-aa9e-be70794d9e89\" (UID: \"be235e33-fdd1-46a4-aa9e-be70794d9e89\") " Dec 02 23:34:29 crc kubenswrapper[4903]: I1202 23:34:29.742326 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgkrg\" (UniqueName: \"kubernetes.io/projected/be235e33-fdd1-46a4-aa9e-be70794d9e89-kube-api-access-dgkrg\") pod \"be235e33-fdd1-46a4-aa9e-be70794d9e89\" (UID: \"be235e33-fdd1-46a4-aa9e-be70794d9e89\") " Dec 02 23:34:29 crc kubenswrapper[4903]: I1202 23:34:29.742362 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be235e33-fdd1-46a4-aa9e-be70794d9e89-utilities\") pod \"be235e33-fdd1-46a4-aa9e-be70794d9e89\" (UID: \"be235e33-fdd1-46a4-aa9e-be70794d9e89\") " Dec 02 23:34:29 crc kubenswrapper[4903]: I1202 23:34:29.743260 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be235e33-fdd1-46a4-aa9e-be70794d9e89-utilities" (OuterVolumeSpecName: "utilities") pod "be235e33-fdd1-46a4-aa9e-be70794d9e89" (UID: "be235e33-fdd1-46a4-aa9e-be70794d9e89"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:34:29 crc kubenswrapper[4903]: I1202 23:34:29.745024 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be235e33-fdd1-46a4-aa9e-be70794d9e89-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:34:29 crc kubenswrapper[4903]: I1202 23:34:29.750929 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be235e33-fdd1-46a4-aa9e-be70794d9e89-kube-api-access-dgkrg" (OuterVolumeSpecName: "kube-api-access-dgkrg") pod "be235e33-fdd1-46a4-aa9e-be70794d9e89" (UID: "be235e33-fdd1-46a4-aa9e-be70794d9e89"). InnerVolumeSpecName "kube-api-access-dgkrg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:34:29 crc kubenswrapper[4903]: I1202 23:34:29.804298 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be235e33-fdd1-46a4-aa9e-be70794d9e89-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "be235e33-fdd1-46a4-aa9e-be70794d9e89" (UID: "be235e33-fdd1-46a4-aa9e-be70794d9e89"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:34:29 crc kubenswrapper[4903]: I1202 23:34:29.846357 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be235e33-fdd1-46a4-aa9e-be70794d9e89-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:34:29 crc kubenswrapper[4903]: I1202 23:34:29.846397 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgkrg\" (UniqueName: \"kubernetes.io/projected/be235e33-fdd1-46a4-aa9e-be70794d9e89-kube-api-access-dgkrg\") on node \"crc\" DevicePath \"\"" Dec 02 23:34:30 crc kubenswrapper[4903]: I1202 23:34:30.161644 4903 generic.go:334] "Generic (PLEG): container finished" podID="be235e33-fdd1-46a4-aa9e-be70794d9e89" containerID="3f205ff1d94374eaf8eae99927606bab076af5acc9940216dc2ef833c7dad51f" exitCode=0 Dec 02 23:34:30 crc kubenswrapper[4903]: I1202 23:34:30.161745 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vs25k" event={"ID":"be235e33-fdd1-46a4-aa9e-be70794d9e89","Type":"ContainerDied","Data":"3f205ff1d94374eaf8eae99927606bab076af5acc9940216dc2ef833c7dad51f"} Dec 02 23:34:30 crc kubenswrapper[4903]: I1202 23:34:30.161787 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vs25k" event={"ID":"be235e33-fdd1-46a4-aa9e-be70794d9e89","Type":"ContainerDied","Data":"9ba5253b25167ab10d699feb5341ca972f093d71cf9d0abe01535cb64ac6beae"} Dec 02 23:34:30 crc kubenswrapper[4903]: I1202 23:34:30.161794 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vs25k" Dec 02 23:34:30 crc kubenswrapper[4903]: I1202 23:34:30.161812 4903 scope.go:117] "RemoveContainer" containerID="3f205ff1d94374eaf8eae99927606bab076af5acc9940216dc2ef833c7dad51f" Dec 02 23:34:30 crc kubenswrapper[4903]: I1202 23:34:30.188709 4903 scope.go:117] "RemoveContainer" containerID="a63dbaaf726309aa1d7c55bd18a7e28c04aa5d1004e2be46a3c3733987312db9" Dec 02 23:34:30 crc kubenswrapper[4903]: I1202 23:34:30.214598 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vs25k"] Dec 02 23:34:30 crc kubenswrapper[4903]: I1202 23:34:30.225474 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vs25k"] Dec 02 23:34:30 crc kubenswrapper[4903]: I1202 23:34:30.233033 4903 scope.go:117] "RemoveContainer" containerID="0e9e9f7aeab637377bf645edcf3bb858686a299086759d58bcf646d98a08947e" Dec 02 23:34:30 crc kubenswrapper[4903]: I1202 23:34:30.276526 4903 scope.go:117] "RemoveContainer" containerID="3f205ff1d94374eaf8eae99927606bab076af5acc9940216dc2ef833c7dad51f" Dec 02 23:34:30 crc kubenswrapper[4903]: E1202 23:34:30.277089 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f205ff1d94374eaf8eae99927606bab076af5acc9940216dc2ef833c7dad51f\": container with ID starting with 3f205ff1d94374eaf8eae99927606bab076af5acc9940216dc2ef833c7dad51f not found: ID does not exist" containerID="3f205ff1d94374eaf8eae99927606bab076af5acc9940216dc2ef833c7dad51f" Dec 02 23:34:30 crc kubenswrapper[4903]: I1202 23:34:30.277130 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f205ff1d94374eaf8eae99927606bab076af5acc9940216dc2ef833c7dad51f"} err="failed to get container status 
\"3f205ff1d94374eaf8eae99927606bab076af5acc9940216dc2ef833c7dad51f\": rpc error: code = NotFound desc = could not find container \"3f205ff1d94374eaf8eae99927606bab076af5acc9940216dc2ef833c7dad51f\": container with ID starting with 3f205ff1d94374eaf8eae99927606bab076af5acc9940216dc2ef833c7dad51f not found: ID does not exist" Dec 02 23:34:30 crc kubenswrapper[4903]: I1202 23:34:30.277158 4903 scope.go:117] "RemoveContainer" containerID="a63dbaaf726309aa1d7c55bd18a7e28c04aa5d1004e2be46a3c3733987312db9" Dec 02 23:34:30 crc kubenswrapper[4903]: E1202 23:34:30.277434 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a63dbaaf726309aa1d7c55bd18a7e28c04aa5d1004e2be46a3c3733987312db9\": container with ID starting with a63dbaaf726309aa1d7c55bd18a7e28c04aa5d1004e2be46a3c3733987312db9 not found: ID does not exist" containerID="a63dbaaf726309aa1d7c55bd18a7e28c04aa5d1004e2be46a3c3733987312db9" Dec 02 23:34:30 crc kubenswrapper[4903]: I1202 23:34:30.277456 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a63dbaaf726309aa1d7c55bd18a7e28c04aa5d1004e2be46a3c3733987312db9"} err="failed to get container status \"a63dbaaf726309aa1d7c55bd18a7e28c04aa5d1004e2be46a3c3733987312db9\": rpc error: code = NotFound desc = could not find container \"a63dbaaf726309aa1d7c55bd18a7e28c04aa5d1004e2be46a3c3733987312db9\": container with ID starting with a63dbaaf726309aa1d7c55bd18a7e28c04aa5d1004e2be46a3c3733987312db9 not found: ID does not exist" Dec 02 23:34:30 crc kubenswrapper[4903]: I1202 23:34:30.277472 4903 scope.go:117] "RemoveContainer" containerID="0e9e9f7aeab637377bf645edcf3bb858686a299086759d58bcf646d98a08947e" Dec 02 23:34:30 crc kubenswrapper[4903]: E1202 23:34:30.277702 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e9e9f7aeab637377bf645edcf3bb858686a299086759d58bcf646d98a08947e\": container with ID starting with 0e9e9f7aeab637377bf645edcf3bb858686a299086759d58bcf646d98a08947e not found: ID does not exist" containerID="0e9e9f7aeab637377bf645edcf3bb858686a299086759d58bcf646d98a08947e" Dec 02 23:34:30 crc kubenswrapper[4903]: I1202 23:34:30.277723 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e9e9f7aeab637377bf645edcf3bb858686a299086759d58bcf646d98a08947e"} err="failed to get container status \"0e9e9f7aeab637377bf645edcf3bb858686a299086759d58bcf646d98a08947e\": rpc error: code = NotFound desc = could not find container \"0e9e9f7aeab637377bf645edcf3bb858686a299086759d58bcf646d98a08947e\": container with ID starting with 0e9e9f7aeab637377bf645edcf3bb858686a299086759d58bcf646d98a08947e not found: ID does not exist" Dec 02 23:34:31 crc kubenswrapper[4903]: I1202 23:34:31.626830 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be235e33-fdd1-46a4-aa9e-be70794d9e89" path="/var/lib/kubelet/pods/be235e33-fdd1-46a4-aa9e-be70794d9e89/volumes" Dec 02 23:34:34 crc kubenswrapper[4903]: I1202 23:34:34.329678 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sb79n" Dec 02 23:34:34 crc kubenswrapper[4903]: I1202 23:34:34.383797 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sb79n" Dec 02 23:34:34 crc kubenswrapper[4903]: I1202 23:34:34.577709 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-sb79n"] Dec 02 23:34:36 crc kubenswrapper[4903]: I1202 23:34:36.227169 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sb79n" podUID="927ef157-9d8d-444e-ad6f-d24d853beb7f" containerName="registry-server" containerID="cri-o://9f29dfc5c5a17e1d40ab498659726d1bf0adf6dbb2a347087e8b85ced16c6cae" gracePeriod=2 Dec 02 23:34:36 crc kubenswrapper[4903]: I1202 23:34:36.731461 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sb79n" Dec 02 23:34:36 crc kubenswrapper[4903]: I1202 23:34:36.797433 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/927ef157-9d8d-444e-ad6f-d24d853beb7f-utilities\") pod \"927ef157-9d8d-444e-ad6f-d24d853beb7f\" (UID: \"927ef157-9d8d-444e-ad6f-d24d853beb7f\") " Dec 02 23:34:36 crc kubenswrapper[4903]: I1202 23:34:36.797488 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/927ef157-9d8d-444e-ad6f-d24d853beb7f-catalog-content\") pod \"927ef157-9d8d-444e-ad6f-d24d853beb7f\" (UID: \"927ef157-9d8d-444e-ad6f-d24d853beb7f\") " Dec 02 23:34:36 crc kubenswrapper[4903]: I1202 23:34:36.797592 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppj8n\" (UniqueName: \"kubernetes.io/projected/927ef157-9d8d-444e-ad6f-d24d853beb7f-kube-api-access-ppj8n\") pod \"927ef157-9d8d-444e-ad6f-d24d853beb7f\" (UID: \"927ef157-9d8d-444e-ad6f-d24d853beb7f\") " Dec 02 23:34:36 crc kubenswrapper[4903]: I1202 23:34:36.798273 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/927ef157-9d8d-444e-ad6f-d24d853beb7f-utilities" (OuterVolumeSpecName: "utilities") pod "927ef157-9d8d-444e-ad6f-d24d853beb7f" (UID: "927ef157-9d8d-444e-ad6f-d24d853beb7f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:34:36 crc kubenswrapper[4903]: I1202 23:34:36.803940 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/927ef157-9d8d-444e-ad6f-d24d853beb7f-kube-api-access-ppj8n" (OuterVolumeSpecName: "kube-api-access-ppj8n") pod "927ef157-9d8d-444e-ad6f-d24d853beb7f" (UID: "927ef157-9d8d-444e-ad6f-d24d853beb7f"). InnerVolumeSpecName "kube-api-access-ppj8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:34:36 crc kubenswrapper[4903]: I1202 23:34:36.899695 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppj8n\" (UniqueName: \"kubernetes.io/projected/927ef157-9d8d-444e-ad6f-d24d853beb7f-kube-api-access-ppj8n\") on node \"crc\" DevicePath \"\"" Dec 02 23:34:36 crc kubenswrapper[4903]: I1202 23:34:36.900048 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/927ef157-9d8d-444e-ad6f-d24d853beb7f-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:34:36 crc kubenswrapper[4903]: I1202 23:34:36.915876 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/927ef157-9d8d-444e-ad6f-d24d853beb7f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "927ef157-9d8d-444e-ad6f-d24d853beb7f" (UID: "927ef157-9d8d-444e-ad6f-d24d853beb7f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:34:37 crc kubenswrapper[4903]: I1202 23:34:37.002085 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/927ef157-9d8d-444e-ad6f-d24d853beb7f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:34:37 crc kubenswrapper[4903]: I1202 23:34:37.244175 4903 generic.go:334] "Generic (PLEG): container finished" podID="927ef157-9d8d-444e-ad6f-d24d853beb7f" containerID="9f29dfc5c5a17e1d40ab498659726d1bf0adf6dbb2a347087e8b85ced16c6cae" exitCode=0 Dec 02 23:34:37 crc kubenswrapper[4903]: I1202 23:34:37.244252 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sb79n" Dec 02 23:34:37 crc kubenswrapper[4903]: I1202 23:34:37.244244 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sb79n" event={"ID":"927ef157-9d8d-444e-ad6f-d24d853beb7f","Type":"ContainerDied","Data":"9f29dfc5c5a17e1d40ab498659726d1bf0adf6dbb2a347087e8b85ced16c6cae"} Dec 02 23:34:37 crc kubenswrapper[4903]: I1202 23:34:37.246872 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sb79n" event={"ID":"927ef157-9d8d-444e-ad6f-d24d853beb7f","Type":"ContainerDied","Data":"770a20124c4dd23d2355ee5a963e93b077f519e61a2783858d99ef025e751247"} Dec 02 23:34:37 crc kubenswrapper[4903]: I1202 23:34:37.247052 4903 scope.go:117] "RemoveContainer" containerID="9f29dfc5c5a17e1d40ab498659726d1bf0adf6dbb2a347087e8b85ced16c6cae" Dec 02 23:34:37 crc kubenswrapper[4903]: I1202 23:34:37.316864 4903 scope.go:117] "RemoveContainer" containerID="f85cd8615751b3fd4100d099336a5ab720873141f21c9591ce95fd761e390e79" Dec 02 23:34:37 crc kubenswrapper[4903]: I1202 23:34:37.326015 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sb79n"] Dec 02 23:34:37 crc kubenswrapper[4903]: I1202 23:34:37.337524 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sb79n"] Dec 02 23:34:37 crc kubenswrapper[4903]: I1202 23:34:37.340539 4903 scope.go:117] "RemoveContainer" containerID="f5dbbbad7f822e5ffed68cd8d50076acf2ffe6f6e6d18023b89c5a8518e674f3" Dec 02 23:34:37 crc kubenswrapper[4903]: I1202 23:34:37.380144 4903 scope.go:117] "RemoveContainer" containerID="9f29dfc5c5a17e1d40ab498659726d1bf0adf6dbb2a347087e8b85ced16c6cae" Dec 02 23:34:37 crc kubenswrapper[4903]: E1202 23:34:37.380644 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f29dfc5c5a17e1d40ab498659726d1bf0adf6dbb2a347087e8b85ced16c6cae\": container with ID starting with 9f29dfc5c5a17e1d40ab498659726d1bf0adf6dbb2a347087e8b85ced16c6cae not found: ID does not exist" containerID="9f29dfc5c5a17e1d40ab498659726d1bf0adf6dbb2a347087e8b85ced16c6cae" Dec 02 23:34:37 crc kubenswrapper[4903]: I1202 23:34:37.380708 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f29dfc5c5a17e1d40ab498659726d1bf0adf6dbb2a347087e8b85ced16c6cae"} err="failed to get container status \"9f29dfc5c5a17e1d40ab498659726d1bf0adf6dbb2a347087e8b85ced16c6cae\": rpc error: code = NotFound desc = could not find container \"9f29dfc5c5a17e1d40ab498659726d1bf0adf6dbb2a347087e8b85ced16c6cae\": container with ID starting with 9f29dfc5c5a17e1d40ab498659726d1bf0adf6dbb2a347087e8b85ced16c6cae not found: ID does not exist" Dec 02 23:34:37 crc 
kubenswrapper[4903]: I1202 23:34:37.380736 4903 scope.go:117] "RemoveContainer" containerID="f85cd8615751b3fd4100d099336a5ab720873141f21c9591ce95fd761e390e79" Dec 02 23:34:37 crc kubenswrapper[4903]: E1202 23:34:37.381133 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f85cd8615751b3fd4100d099336a5ab720873141f21c9591ce95fd761e390e79\": container with ID starting with f85cd8615751b3fd4100d099336a5ab720873141f21c9591ce95fd761e390e79 not found: ID does not exist" containerID="f85cd8615751b3fd4100d099336a5ab720873141f21c9591ce95fd761e390e79" Dec 02 23:34:37 crc kubenswrapper[4903]: I1202 23:34:37.381169 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f85cd8615751b3fd4100d099336a5ab720873141f21c9591ce95fd761e390e79"} err="failed to get container status \"f85cd8615751b3fd4100d099336a5ab720873141f21c9591ce95fd761e390e79\": rpc error: code = NotFound desc = could not find container \"f85cd8615751b3fd4100d099336a5ab720873141f21c9591ce95fd761e390e79\": container with ID starting with f85cd8615751b3fd4100d099336a5ab720873141f21c9591ce95fd761e390e79 not found: ID does not exist" Dec 02 23:34:37 crc kubenswrapper[4903]: I1202 23:34:37.381195 4903 scope.go:117] "RemoveContainer" containerID="f5dbbbad7f822e5ffed68cd8d50076acf2ffe6f6e6d18023b89c5a8518e674f3" Dec 02 23:34:37 crc kubenswrapper[4903]: E1202 23:34:37.381623 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5dbbbad7f822e5ffed68cd8d50076acf2ffe6f6e6d18023b89c5a8518e674f3\": container with ID starting with f5dbbbad7f822e5ffed68cd8d50076acf2ffe6f6e6d18023b89c5a8518e674f3 not found: ID does not exist" containerID="f5dbbbad7f822e5ffed68cd8d50076acf2ffe6f6e6d18023b89c5a8518e674f3" Dec 02 23:34:37 crc kubenswrapper[4903]: I1202 23:34:37.381675 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5dbbbad7f822e5ffed68cd8d50076acf2ffe6f6e6d18023b89c5a8518e674f3"} err="failed to get container status \"f5dbbbad7f822e5ffed68cd8d50076acf2ffe6f6e6d18023b89c5a8518e674f3\": rpc error: code = NotFound desc = could not find container \"f5dbbbad7f822e5ffed68cd8d50076acf2ffe6f6e6d18023b89c5a8518e674f3\": container with ID starting with f5dbbbad7f822e5ffed68cd8d50076acf2ffe6f6e6d18023b89c5a8518e674f3 not found: ID does not exist" Dec 02 23:34:37 crc kubenswrapper[4903]: I1202 23:34:37.633366 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="927ef157-9d8d-444e-ad6f-d24d853beb7f" path="/var/lib/kubelet/pods/927ef157-9d8d-444e-ad6f-d24d853beb7f/volumes" Dec 02 23:34:53 crc kubenswrapper[4903]: I1202 23:34:53.069722 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:34:53 crc kubenswrapper[4903]: I1202 23:34:53.070252 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:34:53 crc kubenswrapper[4903]: I1202 23:34:53.070300 4903 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" Dec 02 23:34:53 crc kubenswrapper[4903]: I1202 23:34:53.071182 4903 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cb019dd8555c680c4295d44d0c399ca95f9121e180d4cfb927832eb9bf96512d"} pod="openshift-machine-config-operator/machine-config-daemon-snl4q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 23:34:53 crc kubenswrapper[4903]: I1202 23:34:53.071247 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" containerID="cri-o://cb019dd8555c680c4295d44d0c399ca95f9121e180d4cfb927832eb9bf96512d" gracePeriod=600 Dec 02 23:34:53 crc kubenswrapper[4903]: E1202 23:34:53.233163 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:34:53 crc kubenswrapper[4903]: I1202 23:34:53.464208 4903 generic.go:334] "Generic (PLEG): container finished" podID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerID="cb019dd8555c680c4295d44d0c399ca95f9121e180d4cfb927832eb9bf96512d" exitCode=0 Dec 02 23:34:53 crc kubenswrapper[4903]: I1202 23:34:53.464258 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerDied","Data":"cb019dd8555c680c4295d44d0c399ca95f9121e180d4cfb927832eb9bf96512d"} Dec 02 23:34:53 crc kubenswrapper[4903]: I1202 23:34:53.464297 4903 scope.go:117] "RemoveContainer" containerID="2cb2324e92342142e01e709d222456907de7cfe5f6624d7d43f5bea9f5edfe1a" Dec 02 23:34:53 crc kubenswrapper[4903]: I1202 23:34:53.465284 4903 scope.go:117] "RemoveContainer" containerID="cb019dd8555c680c4295d44d0c399ca95f9121e180d4cfb927832eb9bf96512d" Dec 02 23:34:53 crc kubenswrapper[4903]: E1202 23:34:53.465819 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:35:05 crc kubenswrapper[4903]: I1202 23:35:05.612907 4903 scope.go:117] "RemoveContainer" containerID="cb019dd8555c680c4295d44d0c399ca95f9121e180d4cfb927832eb9bf96512d" Dec 02 23:35:05 crc kubenswrapper[4903]: E1202 23:35:05.613496 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:35:17 
crc kubenswrapper[4903]: I1202 23:35:17.613538 4903 scope.go:117] "RemoveContainer" containerID="cb019dd8555c680c4295d44d0c399ca95f9121e180d4cfb927832eb9bf96512d" Dec 02 23:35:17 crc kubenswrapper[4903]: E1202 23:35:17.614635 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:35:29 crc kubenswrapper[4903]: I1202 23:35:29.612507 4903 scope.go:117] "RemoveContainer" containerID="cb019dd8555c680c4295d44d0c399ca95f9121e180d4cfb927832eb9bf96512d" Dec 02 23:35:29 crc kubenswrapper[4903]: E1202 23:35:29.613405 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:35:40 crc kubenswrapper[4903]: I1202 23:35:40.612364 4903 scope.go:117] "RemoveContainer" containerID="cb019dd8555c680c4295d44d0c399ca95f9121e180d4cfb927832eb9bf96512d" Dec 02 23:35:40 crc kubenswrapper[4903]: E1202 23:35:40.613256 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:35:52 crc kubenswrapper[4903]: I1202 23:35:52.612796 4903 scope.go:117] "RemoveContainer" containerID="cb019dd8555c680c4295d44d0c399ca95f9121e180d4cfb927832eb9bf96512d" Dec 02 23:35:52 crc kubenswrapper[4903]: E1202 23:35:52.613920 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:36:04 crc kubenswrapper[4903]: I1202 23:36:04.613225 4903 scope.go:117] "RemoveContainer" containerID="cb019dd8555c680c4295d44d0c399ca95f9121e180d4cfb927832eb9bf96512d" Dec 02 23:36:04 crc kubenswrapper[4903]: E1202 23:36:04.614162 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:36:15 crc kubenswrapper[4903]: I1202 23:36:15.612787 4903 scope.go:117] "RemoveContainer" containerID="cb019dd8555c680c4295d44d0c399ca95f9121e180d4cfb927832eb9bf96512d" Dec 02 23:36:15 crc 
Dec 02 23:36:15 crc kubenswrapper[4903]: E1202 23:36:15.613580 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f"
Dec 02 23:36:29 crc kubenswrapper[4903]: I1202 23:36:29.614208 4903 scope.go:117] "RemoveContainer" containerID="cb019dd8555c680c4295d44d0c399ca95f9121e180d4cfb927832eb9bf96512d"
Dec 02 23:36:29 crc kubenswrapper[4903]: E1202 23:36:29.615676 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f"
Dec 02 23:36:43 crc kubenswrapper[4903]: I1202 23:36:43.612290 4903 scope.go:117] "RemoveContainer" containerID="cb019dd8555c680c4295d44d0c399ca95f9121e180d4cfb927832eb9bf96512d"
Dec 02 23:36:43 crc kubenswrapper[4903]: E1202 23:36:43.613302 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f"
Dec 02 23:36:56 crc kubenswrapper[4903]: I1202 23:36:56.612574 4903 scope.go:117] "RemoveContainer" containerID="cb019dd8555c680c4295d44d0c399ca95f9121e180d4cfb927832eb9bf96512d"
Dec 02 23:36:56 crc kubenswrapper[4903]: E1202 23:36:56.613419 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f"
Dec 02 23:37:11 crc kubenswrapper[4903]: I1202 23:37:11.620197 4903 scope.go:117] "RemoveContainer" containerID="cb019dd8555c680c4295d44d0c399ca95f9121e180d4cfb927832eb9bf96512d"
Dec 02 23:37:11 crc kubenswrapper[4903]: E1202 23:37:11.621187 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f"
Dec 02 23:37:22 crc kubenswrapper[4903]: I1202 23:37:22.614146 4903 scope.go:117] "RemoveContainer" containerID="cb019dd8555c680c4295d44d0c399ca95f9121e180d4cfb927832eb9bf96512d"
Dec 02 23:37:22 crc kubenswrapper[4903]: E1202 23:37:22.615364 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:37:36 crc kubenswrapper[4903]: I1202 23:37:36.613121 4903 scope.go:117] "RemoveContainer" containerID="cb019dd8555c680c4295d44d0c399ca95f9121e180d4cfb927832eb9bf96512d" Dec 02 23:37:36 crc kubenswrapper[4903]: E1202 23:37:36.614036 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:37:49 crc kubenswrapper[4903]: I1202 23:37:49.612814 4903 scope.go:117] "RemoveContainer" containerID="cb019dd8555c680c4295d44d0c399ca95f9121e180d4cfb927832eb9bf96512d" Dec 02 23:37:49 crc kubenswrapper[4903]: E1202 23:37:49.613766 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:38:02 crc kubenswrapper[4903]: I1202 23:38:02.612393 4903 scope.go:117] "RemoveContainer" containerID="cb019dd8555c680c4295d44d0c399ca95f9121e180d4cfb927832eb9bf96512d" Dec 02 23:38:02 crc kubenswrapper[4903]: E1202 23:38:02.613566 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:38:17 crc kubenswrapper[4903]: I1202 23:38:17.613275 4903 scope.go:117] "RemoveContainer" containerID="cb019dd8555c680c4295d44d0c399ca95f9121e180d4cfb927832eb9bf96512d" Dec 02 23:38:17 crc kubenswrapper[4903]: E1202 23:38:17.614194 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:38:29 crc kubenswrapper[4903]: I1202 23:38:29.613609 4903 scope.go:117] "RemoveContainer" containerID="cb019dd8555c680c4295d44d0c399ca95f9121e180d4cfb927832eb9bf96512d" Dec 02 23:38:29 crc kubenswrapper[4903]: E1202 23:38:29.614940 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:38:40 crc kubenswrapper[4903]: I1202 23:38:40.612916 4903 scope.go:117] "RemoveContainer" containerID="cb019dd8555c680c4295d44d0c399ca95f9121e180d4cfb927832eb9bf96512d" Dec 02 23:38:40 crc kubenswrapper[4903]: E1202 23:38:40.613886 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:38:46 crc kubenswrapper[4903]: I1202 23:38:46.740569 4903 generic.go:334] "Generic (PLEG): container finished" podID="c9427e93-561b-4f09-bcec-00c7001f2541" containerID="d4d7d2ff6e578e5ce57ac3a2a0e159de501fd4812fad7926b4dd2dc8026a55ea" exitCode=0 Dec 02 23:38:46 crc kubenswrapper[4903]: I1202 23:38:46.740709 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6s9st" event={"ID":"c9427e93-561b-4f09-bcec-00c7001f2541","Type":"ContainerDied","Data":"d4d7d2ff6e578e5ce57ac3a2a0e159de501fd4812fad7926b4dd2dc8026a55ea"} Dec 02 23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.266071 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6s9st" Dec 02 23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.391210 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9427e93-561b-4f09-bcec-00c7001f2541-libvirt-combined-ca-bundle\") pod \"c9427e93-561b-4f09-bcec-00c7001f2541\" (UID: \"c9427e93-561b-4f09-bcec-00c7001f2541\") " Dec 02 23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.391399 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c9427e93-561b-4f09-bcec-00c7001f2541-libvirt-secret-0\") pod \"c9427e93-561b-4f09-bcec-00c7001f2541\" (UID: \"c9427e93-561b-4f09-bcec-00c7001f2541\") " Dec 02 23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.391484 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tzr2\" (UniqueName: \"kubernetes.io/projected/c9427e93-561b-4f09-bcec-00c7001f2541-kube-api-access-4tzr2\") pod \"c9427e93-561b-4f09-bcec-00c7001f2541\" (UID: \"c9427e93-561b-4f09-bcec-00c7001f2541\") " Dec 02 23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.391571 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c9427e93-561b-4f09-bcec-00c7001f2541-inventory\") pod \"c9427e93-561b-4f09-bcec-00c7001f2541\" (UID: \"c9427e93-561b-4f09-bcec-00c7001f2541\") " Dec 02 23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.391613 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c9427e93-561b-4f09-bcec-00c7001f2541-ssh-key\") pod \"c9427e93-561b-4f09-bcec-00c7001f2541\" (UID: \"c9427e93-561b-4f09-bcec-00c7001f2541\") " Dec 02 23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.402670 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/c9427e93-561b-4f09-bcec-00c7001f2541-kube-api-access-4tzr2" (OuterVolumeSpecName: "kube-api-access-4tzr2") pod "c9427e93-561b-4f09-bcec-00c7001f2541" (UID: "c9427e93-561b-4f09-bcec-00c7001f2541"). InnerVolumeSpecName "kube-api-access-4tzr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.410928 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9427e93-561b-4f09-bcec-00c7001f2541-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "c9427e93-561b-4f09-bcec-00c7001f2541" (UID: "c9427e93-561b-4f09-bcec-00c7001f2541"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.445445 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9427e93-561b-4f09-bcec-00c7001f2541-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "c9427e93-561b-4f09-bcec-00c7001f2541" (UID: "c9427e93-561b-4f09-bcec-00c7001f2541"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.445994 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9427e93-561b-4f09-bcec-00c7001f2541-inventory" (OuterVolumeSpecName: "inventory") pod "c9427e93-561b-4f09-bcec-00c7001f2541" (UID: "c9427e93-561b-4f09-bcec-00c7001f2541"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.452316 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9427e93-561b-4f09-bcec-00c7001f2541-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c9427e93-561b-4f09-bcec-00c7001f2541" (UID: "c9427e93-561b-4f09-bcec-00c7001f2541"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.494389 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tzr2\" (UniqueName: \"kubernetes.io/projected/c9427e93-561b-4f09-bcec-00c7001f2541-kube-api-access-4tzr2\") on node \"crc\" DevicePath \"\"" Dec 02 23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.494531 4903 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c9427e93-561b-4f09-bcec-00c7001f2541-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.494551 4903 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c9427e93-561b-4f09-bcec-00c7001f2541-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.494574 4903 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9427e93-561b-4f09-bcec-00c7001f2541-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.494593 4903 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c9427e93-561b-4f09-bcec-00c7001f2541-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.767171 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6s9st" event={"ID":"c9427e93-561b-4f09-bcec-00c7001f2541","Type":"ContainerDied","Data":"77e9898d24bc6bb787f06b4a536c1dbb19b66e8d8cd0692b570936a60d17626d"} Dec 02 23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.767233 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77e9898d24bc6bb787f06b4a536c1dbb19b66e8d8cd0692b570936a60d17626d" Dec 02 23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.767273 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6s9st" Dec 02 23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.951334 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln"] Dec 02 23:38:48 crc kubenswrapper[4903]: E1202 23:38:48.952145 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be235e33-fdd1-46a4-aa9e-be70794d9e89" containerName="registry-server" Dec 02 23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.952179 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="be235e33-fdd1-46a4-aa9e-be70794d9e89" containerName="registry-server" Dec 02 23:38:48 crc kubenswrapper[4903]: E1202 23:38:48.952211 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9427e93-561b-4f09-bcec-00c7001f2541" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 02 23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.952225 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9427e93-561b-4f09-bcec-00c7001f2541" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 02 23:38:48 crc kubenswrapper[4903]: E1202 23:38:48.952260 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="927ef157-9d8d-444e-ad6f-d24d853beb7f" containerName="extract-content" Dec 02 23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.952273 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="927ef157-9d8d-444e-ad6f-d24d853beb7f" containerName="extract-content" Dec 02 23:38:48 crc kubenswrapper[4903]: E1202 23:38:48.952298 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="927ef157-9d8d-444e-ad6f-d24d853beb7f" containerName="registry-server" Dec 02 23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.952310 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="927ef157-9d8d-444e-ad6f-d24d853beb7f" containerName="registry-server" Dec 02 23:38:48 crc kubenswrapper[4903]: E1202 23:38:48.952327 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be235e33-fdd1-46a4-aa9e-be70794d9e89" containerName="extract-utilities" Dec 02 23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.952339 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="be235e33-fdd1-46a4-aa9e-be70794d9e89" containerName="extract-utilities" Dec 02 23:38:48 crc kubenswrapper[4903]: E1202 23:38:48.952363 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="927ef157-9d8d-444e-ad6f-d24d853beb7f" containerName="extract-utilities" Dec 02 23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.952377 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="927ef157-9d8d-444e-ad6f-d24d853beb7f" containerName="extract-utilities" Dec 02 23:38:48 crc kubenswrapper[4903]: E1202 23:38:48.952405 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be235e33-fdd1-46a4-aa9e-be70794d9e89" containerName="extract-content" Dec 02 23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.952417 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="be235e33-fdd1-46a4-aa9e-be70794d9e89" containerName="extract-content" Dec 02 23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.952862 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9427e93-561b-4f09-bcec-00c7001f2541" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 02 23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.952916 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="be235e33-fdd1-46a4-aa9e-be70794d9e89" containerName="registry-server" Dec 02 
23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.952934 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="927ef157-9d8d-444e-ad6f-d24d853beb7f" containerName="registry-server" Dec 02 23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.954400 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln" Dec 02 23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.958855 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 02 23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.958880 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.959088 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.960009 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 02 23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.960071 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9z6rx" Dec 02 23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.960080 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.966248 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln"] Dec 02 23:38:48 crc kubenswrapper[4903]: I1202 23:38:48.966539 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 02 23:38:49 crc kubenswrapper[4903]: I1202 23:38:49.108528 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m6rln\" (UID: \"013ce0d7-062b-47a7-8831-912380a94a37\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln" Dec 02 23:38:49 crc kubenswrapper[4903]: I1202 23:38:49.108600 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/013ce0d7-062b-47a7-8831-912380a94a37-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m6rln\" (UID: \"013ce0d7-062b-47a7-8831-912380a94a37\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln" Dec 02 23:38:49 crc kubenswrapper[4903]: I1202 23:38:49.108893 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m6rln\" (UID: \"013ce0d7-062b-47a7-8831-912380a94a37\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln" Dec 02 23:38:49 crc kubenswrapper[4903]: I1202 23:38:49.109010 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m6rln\" (UID: \"013ce0d7-062b-47a7-8831-912380a94a37\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln" Dec 02 23:38:49 crc kubenswrapper[4903]: I1202 23:38:49.109088 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m6rln\" (UID: \"013ce0d7-062b-47a7-8831-912380a94a37\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln" Dec 02 23:38:49 crc kubenswrapper[4903]: I1202 23:38:49.109134 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m6rln\" (UID: \"013ce0d7-062b-47a7-8831-912380a94a37\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln" Dec 02 23:38:49 crc kubenswrapper[4903]: I1202 23:38:49.109173 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m6rln\" (UID: \"013ce0d7-062b-47a7-8831-912380a94a37\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln" Dec 02 23:38:49 crc kubenswrapper[4903]: I1202 23:38:49.109251 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m6rln\" (UID: \"013ce0d7-062b-47a7-8831-912380a94a37\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln" Dec 02 23:38:49 crc kubenswrapper[4903]: I1202 23:38:49.109336 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wq94b\" (UniqueName: \"kubernetes.io/projected/013ce0d7-062b-47a7-8831-912380a94a37-kube-api-access-wq94b\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m6rln\" (UID: \"013ce0d7-062b-47a7-8831-912380a94a37\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln" Dec 02 23:38:49 crc kubenswrapper[4903]: I1202 23:38:49.211282 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m6rln\" (UID: \"013ce0d7-062b-47a7-8831-912380a94a37\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln" Dec 02 23:38:49 crc kubenswrapper[4903]: I1202 23:38:49.211382 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/013ce0d7-062b-47a7-8831-912380a94a37-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m6rln\" (UID: \"013ce0d7-062b-47a7-8831-912380a94a37\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln" Dec 02 23:38:49 crc kubenswrapper[4903]: I1202 23:38:49.211519 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m6rln\" (UID: 
\"013ce0d7-062b-47a7-8831-912380a94a37\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln" Dec 02 23:38:49 crc kubenswrapper[4903]: I1202 23:38:49.211559 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m6rln\" (UID: \"013ce0d7-062b-47a7-8831-912380a94a37\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln" Dec 02 23:38:49 crc kubenswrapper[4903]: I1202 23:38:49.211611 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m6rln\" (UID: \"013ce0d7-062b-47a7-8831-912380a94a37\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln" Dec 02 23:38:49 crc kubenswrapper[4903]: I1202 23:38:49.211685 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m6rln\" (UID: \"013ce0d7-062b-47a7-8831-912380a94a37\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln" Dec 02 23:38:49 crc kubenswrapper[4903]: I1202 23:38:49.211745 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m6rln\" (UID: \"013ce0d7-062b-47a7-8831-912380a94a37\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln" Dec 02 23:38:49 crc kubenswrapper[4903]: I1202 23:38:49.211841 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m6rln\" (UID: \"013ce0d7-062b-47a7-8831-912380a94a37\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln" Dec 02 23:38:49 crc kubenswrapper[4903]: I1202 23:38:49.211963 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wq94b\" (UniqueName: \"kubernetes.io/projected/013ce0d7-062b-47a7-8831-912380a94a37-kube-api-access-wq94b\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m6rln\" (UID: \"013ce0d7-062b-47a7-8831-912380a94a37\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln" Dec 02 23:38:49 crc kubenswrapper[4903]: I1202 23:38:49.213818 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/013ce0d7-062b-47a7-8831-912380a94a37-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m6rln\" (UID: \"013ce0d7-062b-47a7-8831-912380a94a37\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln" Dec 02 23:38:49 crc kubenswrapper[4903]: I1202 23:38:49.218072 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m6rln\" (UID: \"013ce0d7-062b-47a7-8831-912380a94a37\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln"
Dec 02 23:38:49 crc kubenswrapper[4903]: I1202 23:38:49.218245 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m6rln\" (UID: \"013ce0d7-062b-47a7-8831-912380a94a37\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln"
Dec 02 23:38:49 crc kubenswrapper[4903]: I1202 23:38:49.222084 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m6rln\" (UID: \"013ce0d7-062b-47a7-8831-912380a94a37\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln"
Dec 02 23:38:49 crc kubenswrapper[4903]: I1202 23:38:49.222095 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m6rln\" (UID: \"013ce0d7-062b-47a7-8831-912380a94a37\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln"
Dec 02 23:38:49 crc kubenswrapper[4903]: I1202 23:38:49.222548 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m6rln\" (UID: \"013ce0d7-062b-47a7-8831-912380a94a37\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln"
Dec 02 23:38:49 crc kubenswrapper[4903]: I1202 23:38:49.229291 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m6rln\" (UID: \"013ce0d7-062b-47a7-8831-912380a94a37\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln"
Dec 02 23:38:49 crc kubenswrapper[4903]: I1202 23:38:49.231374 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m6rln\" (UID: \"013ce0d7-062b-47a7-8831-912380a94a37\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln"
Dec 02 23:38:49 crc kubenswrapper[4903]: I1202 23:38:49.236914 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wq94b\" (UniqueName: \"kubernetes.io/projected/013ce0d7-062b-47a7-8831-912380a94a37-kube-api-access-wq94b\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m6rln\" (UID: \"013ce0d7-062b-47a7-8831-912380a94a37\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln"
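
At this point every volume for nova-edpm-deployment-openstack-edpm-ipam-m6rln has gone through the same three-step pattern seen above: VerifyControllerAttachedVolume, then MountVolume started, then MountVolume.SetUp succeeded; only once the mounts are in place does the kubelet go on to create a sandbox for the pod (next entry below). These operationExecutor/reconciler_common lines come from a reconciler that repeatedly diffs a desired state of world (volumes the pod spec requires) against an actual state of world (volumes currently mounted). A toy Go sketch of that reconcile pattern, hypothetical and heavily simplified relative to the real volume manager:

package main

import "fmt"

func main() {
	// Desired state of world: volumes the pod spec requires (names taken
	// from the log above, purely as sample data).
	desired := map[string]bool{
		"inventory":             true,
		"ssh-key":               true,
		"kube-api-access-wq94b": true,
	}
	// Actual state of world: volumes currently mounted on the node.
	actual := map[string]bool{
		"ssh-key":    true,
		"old-volume": true, // stale leftover from a deleted pod
	}

	// One reconcile pass: mount what is missing, unmount what is stale.
	for v := range desired {
		if !actual[v] {
			fmt.Printf("operationExecutor.MountVolume started for volume %q\n", v)
			actual[v] = true
		}
	}
	for v := range actual {
		if !desired[v] {
			fmt.Printf("operationExecutor.UnmountVolume started for volume %q\n", v)
			delete(actual, v)
		}
	}
}

The UnmountVolume/TearDown/"Volume detached" runs elsewhere in this log are the same loop working in the opposite direction after a pod is deleted and its volumes drop out of the desired state.
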
Dec 02 23:38:49 crc kubenswrapper[4903]: I1202 23:38:49.291454 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln"
Dec 02 23:38:49 crc kubenswrapper[4903]: I1202 23:38:49.707643 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln"]
Dec 02 23:38:49 crc kubenswrapper[4903]: W1202 23:38:49.711611 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod013ce0d7_062b_47a7_8831_912380a94a37.slice/crio-387c260b4748e7edba3cd112639469d403ec8be10ac893671ea08c02d51f1cd4 WatchSource:0}: Error finding container 387c260b4748e7edba3cd112639469d403ec8be10ac893671ea08c02d51f1cd4: Status 404 returned error can't find the container with id 387c260b4748e7edba3cd112639469d403ec8be10ac893671ea08c02d51f1cd4
Dec 02 23:38:49 crc kubenswrapper[4903]: I1202 23:38:49.777882 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln" event={"ID":"013ce0d7-062b-47a7-8831-912380a94a37","Type":"ContainerStarted","Data":"387c260b4748e7edba3cd112639469d403ec8be10ac893671ea08c02d51f1cd4"}
Dec 02 23:38:50 crc kubenswrapper[4903]: I1202 23:38:50.800056 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln" event={"ID":"013ce0d7-062b-47a7-8831-912380a94a37","Type":"ContainerStarted","Data":"95cf1d58b9d1e83c385b5ae4f85af4452a5179ccdeffff6eb13c154864130a48"}
Dec 02 23:38:50 crc kubenswrapper[4903]: I1202 23:38:50.826897 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln" podStartSLOduration=2.269943344 podStartE2EDuration="2.826879703s" podCreationTimestamp="2025-12-02 23:38:48 +0000 UTC" firstStartedPulling="2025-12-02 23:38:49.714526992 +0000 UTC m=+2468.423081275" lastFinishedPulling="2025-12-02 23:38:50.271463351 +0000 UTC m=+2468.980017634" observedRunningTime="2025-12-02 23:38:50.818207994 +0000 UTC m=+2469.526762277" watchObservedRunningTime="2025-12-02 23:38:50.826879703 +0000 UTC m=+2469.535433986"
Dec 02 23:38:53 crc kubenswrapper[4903]: I1202 23:38:53.613645 4903 scope.go:117] "RemoveContainer" containerID="cb019dd8555c680c4295d44d0c399ca95f9121e180d4cfb927832eb9bf96512d"
Dec 02 23:38:53 crc kubenswrapper[4903]: E1202 23:38:53.614713 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f"
Dec 02 23:39:05 crc kubenswrapper[4903]: I1202 23:39:05.613376 4903 scope.go:117] "RemoveContainer" containerID="cb019dd8555c680c4295d44d0c399ca95f9121e180d4cfb927832eb9bf96512d"
Dec 02 23:39:05 crc kubenswrapper[4903]: E1202 23:39:05.614782 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f"
Dec 02 23:39:19 crc kubenswrapper[4903]: I1202 23:39:19.612553 4903 scope.go:117] 
"RemoveContainer" containerID="cb019dd8555c680c4295d44d0c399ca95f9121e180d4cfb927832eb9bf96512d" Dec 02 23:39:19 crc kubenswrapper[4903]: E1202 23:39:19.613625 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:39:32 crc kubenswrapper[4903]: I1202 23:39:32.612508 4903 scope.go:117] "RemoveContainer" containerID="cb019dd8555c680c4295d44d0c399ca95f9121e180d4cfb927832eb9bf96512d" Dec 02 23:39:32 crc kubenswrapper[4903]: E1202 23:39:32.613092 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:39:47 crc kubenswrapper[4903]: I1202 23:39:47.613235 4903 scope.go:117] "RemoveContainer" containerID="cb019dd8555c680c4295d44d0c399ca95f9121e180d4cfb927832eb9bf96512d" Dec 02 23:39:47 crc kubenswrapper[4903]: E1202 23:39:47.615302 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:40:02 crc kubenswrapper[4903]: I1202 23:40:02.611898 4903 scope.go:117] "RemoveContainer" containerID="cb019dd8555c680c4295d44d0c399ca95f9121e180d4cfb927832eb9bf96512d" Dec 02 23:40:03 crc kubenswrapper[4903]: I1202 23:40:03.661328 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerStarted","Data":"334e7f1c386c5773dacb43719b040835a12d8d082105a64c0872016257ba6a7f"} Dec 02 23:40:47 crc kubenswrapper[4903]: I1202 23:40:47.958318 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6ghqq"] Dec 02 23:40:47 crc kubenswrapper[4903]: I1202 23:40:47.962003 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6ghqq" Dec 02 23:40:47 crc kubenswrapper[4903]: I1202 23:40:47.982483 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6ghqq"] Dec 02 23:40:48 crc kubenswrapper[4903]: I1202 23:40:48.044678 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccbml\" (UniqueName: \"kubernetes.io/projected/999c2f0d-579f-47df-a9fb-78897bbb91da-kube-api-access-ccbml\") pod \"certified-operators-6ghqq\" (UID: \"999c2f0d-579f-47df-a9fb-78897bbb91da\") " pod="openshift-marketplace/certified-operators-6ghqq" Dec 02 23:40:48 crc kubenswrapper[4903]: I1202 23:40:48.044726 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/999c2f0d-579f-47df-a9fb-78897bbb91da-utilities\") pod \"certified-operators-6ghqq\" (UID: \"999c2f0d-579f-47df-a9fb-78897bbb91da\") " pod="openshift-marketplace/certified-operators-6ghqq" Dec 02 23:40:48 crc kubenswrapper[4903]: I1202 23:40:48.044753 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/999c2f0d-579f-47df-a9fb-78897bbb91da-catalog-content\") pod \"certified-operators-6ghqq\" (UID: \"999c2f0d-579f-47df-a9fb-78897bbb91da\") " pod="openshift-marketplace/certified-operators-6ghqq" Dec 02 23:40:48 crc kubenswrapper[4903]: I1202 23:40:48.147081 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccbml\" (UniqueName: \"kubernetes.io/projected/999c2f0d-579f-47df-a9fb-78897bbb91da-kube-api-access-ccbml\") pod \"certified-operators-6ghqq\" (UID: \"999c2f0d-579f-47df-a9fb-78897bbb91da\") " pod="openshift-marketplace/certified-operators-6ghqq" Dec 02 23:40:48 crc kubenswrapper[4903]: I1202 23:40:48.147139 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/999c2f0d-579f-47df-a9fb-78897bbb91da-utilities\") pod \"certified-operators-6ghqq\" (UID: \"999c2f0d-579f-47df-a9fb-78897bbb91da\") " pod="openshift-marketplace/certified-operators-6ghqq" Dec 02 23:40:48 crc kubenswrapper[4903]: I1202 23:40:48.147163 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/999c2f0d-579f-47df-a9fb-78897bbb91da-catalog-content\") pod \"certified-operators-6ghqq\" (UID: \"999c2f0d-579f-47df-a9fb-78897bbb91da\") " pod="openshift-marketplace/certified-operators-6ghqq" Dec 02 23:40:48 crc kubenswrapper[4903]: I1202 23:40:48.147841 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/999c2f0d-579f-47df-a9fb-78897bbb91da-catalog-content\") pod \"certified-operators-6ghqq\" (UID: \"999c2f0d-579f-47df-a9fb-78897bbb91da\") " pod="openshift-marketplace/certified-operators-6ghqq" Dec 02 23:40:48 crc kubenswrapper[4903]: I1202 23:40:48.147903 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/999c2f0d-579f-47df-a9fb-78897bbb91da-utilities\") pod \"certified-operators-6ghqq\" (UID: \"999c2f0d-579f-47df-a9fb-78897bbb91da\") " pod="openshift-marketplace/certified-operators-6ghqq" Dec 02 23:40:48 crc kubenswrapper[4903]: I1202 23:40:48.171913 4903 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ccbml\" (UniqueName: \"kubernetes.io/projected/999c2f0d-579f-47df-a9fb-78897bbb91da-kube-api-access-ccbml\") pod \"certified-operators-6ghqq\" (UID: \"999c2f0d-579f-47df-a9fb-78897bbb91da\") " pod="openshift-marketplace/certified-operators-6ghqq" Dec 02 23:40:48 crc kubenswrapper[4903]: I1202 23:40:48.282697 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6ghqq" Dec 02 23:40:48 crc kubenswrapper[4903]: I1202 23:40:48.810051 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6ghqq"] Dec 02 23:40:49 crc kubenswrapper[4903]: I1202 23:40:49.205848 4903 generic.go:334] "Generic (PLEG): container finished" podID="999c2f0d-579f-47df-a9fb-78897bbb91da" containerID="87861ffe202615731a729fc3b4dbc7c9e2ec474dd5bf78746ef0b29249dd7818" exitCode=0 Dec 02 23:40:49 crc kubenswrapper[4903]: I1202 23:40:49.205933 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6ghqq" event={"ID":"999c2f0d-579f-47df-a9fb-78897bbb91da","Type":"ContainerDied","Data":"87861ffe202615731a729fc3b4dbc7c9e2ec474dd5bf78746ef0b29249dd7818"} Dec 02 23:40:49 crc kubenswrapper[4903]: I1202 23:40:49.206868 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6ghqq" event={"ID":"999c2f0d-579f-47df-a9fb-78897bbb91da","Type":"ContainerStarted","Data":"b773a2e8561465a6100ec0d8d554e234cdb1d8c5ae4bcaae57de125984b0f5f1"} Dec 02 23:40:49 crc kubenswrapper[4903]: I1202 23:40:49.211018 4903 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 23:40:51 crc kubenswrapper[4903]: I1202 23:40:51.233074 4903 generic.go:334] "Generic (PLEG): container finished" podID="999c2f0d-579f-47df-a9fb-78897bbb91da" containerID="4ff7dde1150d6159ceb36141d1c92f96f904524c580f41d9f56b1faaba6cc203" exitCode=0 Dec 02 23:40:51 crc kubenswrapper[4903]: I1202 23:40:51.233153 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6ghqq" event={"ID":"999c2f0d-579f-47df-a9fb-78897bbb91da","Type":"ContainerDied","Data":"4ff7dde1150d6159ceb36141d1c92f96f904524c580f41d9f56b1faaba6cc203"} Dec 02 23:40:52 crc kubenswrapper[4903]: I1202 23:40:52.251357 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6ghqq" event={"ID":"999c2f0d-579f-47df-a9fb-78897bbb91da","Type":"ContainerStarted","Data":"efc8d0608c09414640101400072c092342bfc0a1bff6ebf4fdda7dd8c0177b94"} Dec 02 23:40:52 crc kubenswrapper[4903]: I1202 23:40:52.284638 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6ghqq" podStartSLOduration=2.83914655 podStartE2EDuration="5.284619808s" podCreationTimestamp="2025-12-02 23:40:47 +0000 UTC" firstStartedPulling="2025-12-02 23:40:49.210692363 +0000 UTC m=+2587.919246666" lastFinishedPulling="2025-12-02 23:40:51.656165631 +0000 UTC m=+2590.364719924" observedRunningTime="2025-12-02 23:40:52.281538094 +0000 UTC m=+2590.990092387" watchObservedRunningTime="2025-12-02 23:40:52.284619808 +0000 UTC m=+2590.993174101" Dec 02 23:40:58 crc kubenswrapper[4903]: I1202 23:40:58.282917 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6ghqq" Dec 02 23:40:58 crc kubenswrapper[4903]: I1202 23:40:58.283747 4903 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6ghqq" Dec 02 23:40:58 crc kubenswrapper[4903]: I1202 23:40:58.360845 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6ghqq" Dec 02 23:40:59 crc kubenswrapper[4903]: I1202 23:40:59.456144 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6ghqq" Dec 02 23:40:59 crc kubenswrapper[4903]: I1202 23:40:59.533347 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6ghqq"] Dec 02 23:41:01 crc kubenswrapper[4903]: I1202 23:41:01.389332 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6ghqq" podUID="999c2f0d-579f-47df-a9fb-78897bbb91da" containerName="registry-server" containerID="cri-o://efc8d0608c09414640101400072c092342bfc0a1bff6ebf4fdda7dd8c0177b94" gracePeriod=2 Dec 02 23:41:01 crc kubenswrapper[4903]: I1202 23:41:01.926304 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6ghqq" Dec 02 23:41:02 crc kubenswrapper[4903]: I1202 23:41:02.077085 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccbml\" (UniqueName: \"kubernetes.io/projected/999c2f0d-579f-47df-a9fb-78897bbb91da-kube-api-access-ccbml\") pod \"999c2f0d-579f-47df-a9fb-78897bbb91da\" (UID: \"999c2f0d-579f-47df-a9fb-78897bbb91da\") " Dec 02 23:41:02 crc kubenswrapper[4903]: I1202 23:41:02.077199 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/999c2f0d-579f-47df-a9fb-78897bbb91da-utilities\") pod \"999c2f0d-579f-47df-a9fb-78897bbb91da\" (UID: \"999c2f0d-579f-47df-a9fb-78897bbb91da\") " Dec 02 23:41:02 crc kubenswrapper[4903]: I1202 23:41:02.077323 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/999c2f0d-579f-47df-a9fb-78897bbb91da-catalog-content\") pod \"999c2f0d-579f-47df-a9fb-78897bbb91da\" (UID: \"999c2f0d-579f-47df-a9fb-78897bbb91da\") " Dec 02 23:41:02 crc kubenswrapper[4903]: I1202 23:41:02.078219 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/999c2f0d-579f-47df-a9fb-78897bbb91da-utilities" (OuterVolumeSpecName: "utilities") pod "999c2f0d-579f-47df-a9fb-78897bbb91da" (UID: "999c2f0d-579f-47df-a9fb-78897bbb91da"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:41:02 crc kubenswrapper[4903]: I1202 23:41:02.084456 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/999c2f0d-579f-47df-a9fb-78897bbb91da-kube-api-access-ccbml" (OuterVolumeSpecName: "kube-api-access-ccbml") pod "999c2f0d-579f-47df-a9fb-78897bbb91da" (UID: "999c2f0d-579f-47df-a9fb-78897bbb91da"). InnerVolumeSpecName "kube-api-access-ccbml". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:41:02 crc kubenswrapper[4903]: I1202 23:41:02.129543 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/999c2f0d-579f-47df-a9fb-78897bbb91da-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "999c2f0d-579f-47df-a9fb-78897bbb91da" (UID: "999c2f0d-579f-47df-a9fb-78897bbb91da"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:41:02 crc kubenswrapper[4903]: I1202 23:41:02.180959 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccbml\" (UniqueName: \"kubernetes.io/projected/999c2f0d-579f-47df-a9fb-78897bbb91da-kube-api-access-ccbml\") on node \"crc\" DevicePath \"\"" Dec 02 23:41:02 crc kubenswrapper[4903]: I1202 23:41:02.180994 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/999c2f0d-579f-47df-a9fb-78897bbb91da-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:41:02 crc kubenswrapper[4903]: I1202 23:41:02.181003 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/999c2f0d-579f-47df-a9fb-78897bbb91da-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:41:02 crc kubenswrapper[4903]: I1202 23:41:02.400137 4903 generic.go:334] "Generic (PLEG): container finished" podID="999c2f0d-579f-47df-a9fb-78897bbb91da" containerID="efc8d0608c09414640101400072c092342bfc0a1bff6ebf4fdda7dd8c0177b94" exitCode=0 Dec 02 23:41:02 crc kubenswrapper[4903]: I1202 23:41:02.400180 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6ghqq" event={"ID":"999c2f0d-579f-47df-a9fb-78897bbb91da","Type":"ContainerDied","Data":"efc8d0608c09414640101400072c092342bfc0a1bff6ebf4fdda7dd8c0177b94"} Dec 02 23:41:02 crc kubenswrapper[4903]: I1202 23:41:02.400202 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6ghqq" Dec 02 23:41:02 crc kubenswrapper[4903]: I1202 23:41:02.400207 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6ghqq" event={"ID":"999c2f0d-579f-47df-a9fb-78897bbb91da","Type":"ContainerDied","Data":"b773a2e8561465a6100ec0d8d554e234cdb1d8c5ae4bcaae57de125984b0f5f1"} Dec 02 23:41:02 crc kubenswrapper[4903]: I1202 23:41:02.400239 4903 scope.go:117] "RemoveContainer" containerID="efc8d0608c09414640101400072c092342bfc0a1bff6ebf4fdda7dd8c0177b94" Dec 02 23:41:02 crc kubenswrapper[4903]: I1202 23:41:02.420959 4903 scope.go:117] "RemoveContainer" containerID="4ff7dde1150d6159ceb36141d1c92f96f904524c580f41d9f56b1faaba6cc203" Dec 02 23:41:02 crc kubenswrapper[4903]: I1202 23:41:02.442739 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6ghqq"] Dec 02 23:41:02 crc kubenswrapper[4903]: I1202 23:41:02.454615 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6ghqq"] Dec 02 23:41:02 crc kubenswrapper[4903]: I1202 23:41:02.477908 4903 scope.go:117] "RemoveContainer" containerID="87861ffe202615731a729fc3b4dbc7c9e2ec474dd5bf78746ef0b29249dd7818" Dec 02 23:41:02 crc kubenswrapper[4903]: I1202 23:41:02.513447 4903 scope.go:117] "RemoveContainer" containerID="efc8d0608c09414640101400072c092342bfc0a1bff6ebf4fdda7dd8c0177b94" Dec 02 23:41:02 crc kubenswrapper[4903]: E1202 23:41:02.514141 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efc8d0608c09414640101400072c092342bfc0a1bff6ebf4fdda7dd8c0177b94\": container with ID starting with efc8d0608c09414640101400072c092342bfc0a1bff6ebf4fdda7dd8c0177b94 not found: ID does not exist" containerID="efc8d0608c09414640101400072c092342bfc0a1bff6ebf4fdda7dd8c0177b94" Dec 02 23:41:02 crc 
kubenswrapper[4903]: I1202 23:41:02.514191 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efc8d0608c09414640101400072c092342bfc0a1bff6ebf4fdda7dd8c0177b94"} err="failed to get container status \"efc8d0608c09414640101400072c092342bfc0a1bff6ebf4fdda7dd8c0177b94\": rpc error: code = NotFound desc = could not find container \"efc8d0608c09414640101400072c092342bfc0a1bff6ebf4fdda7dd8c0177b94\": container with ID starting with efc8d0608c09414640101400072c092342bfc0a1bff6ebf4fdda7dd8c0177b94 not found: ID does not exist" Dec 02 23:41:02 crc kubenswrapper[4903]: I1202 23:41:02.514221 4903 scope.go:117] "RemoveContainer" containerID="4ff7dde1150d6159ceb36141d1c92f96f904524c580f41d9f56b1faaba6cc203" Dec 02 23:41:02 crc kubenswrapper[4903]: E1202 23:41:02.514704 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ff7dde1150d6159ceb36141d1c92f96f904524c580f41d9f56b1faaba6cc203\": container with ID starting with 4ff7dde1150d6159ceb36141d1c92f96f904524c580f41d9f56b1faaba6cc203 not found: ID does not exist" containerID="4ff7dde1150d6159ceb36141d1c92f96f904524c580f41d9f56b1faaba6cc203" Dec 02 23:41:02 crc kubenswrapper[4903]: I1202 23:41:02.514740 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ff7dde1150d6159ceb36141d1c92f96f904524c580f41d9f56b1faaba6cc203"} err="failed to get container status \"4ff7dde1150d6159ceb36141d1c92f96f904524c580f41d9f56b1faaba6cc203\": rpc error: code = NotFound desc = could not find container \"4ff7dde1150d6159ceb36141d1c92f96f904524c580f41d9f56b1faaba6cc203\": container with ID starting with 4ff7dde1150d6159ceb36141d1c92f96f904524c580f41d9f56b1faaba6cc203 not found: ID does not exist" Dec 02 23:41:02 crc kubenswrapper[4903]: I1202 23:41:02.514762 4903 scope.go:117] "RemoveContainer" containerID="87861ffe202615731a729fc3b4dbc7c9e2ec474dd5bf78746ef0b29249dd7818" Dec 02 23:41:02 crc kubenswrapper[4903]: E1202 23:41:02.515071 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87861ffe202615731a729fc3b4dbc7c9e2ec474dd5bf78746ef0b29249dd7818\": container with ID starting with 87861ffe202615731a729fc3b4dbc7c9e2ec474dd5bf78746ef0b29249dd7818 not found: ID does not exist" containerID="87861ffe202615731a729fc3b4dbc7c9e2ec474dd5bf78746ef0b29249dd7818" Dec 02 23:41:02 crc kubenswrapper[4903]: I1202 23:41:02.515097 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87861ffe202615731a729fc3b4dbc7c9e2ec474dd5bf78746ef0b29249dd7818"} err="failed to get container status \"87861ffe202615731a729fc3b4dbc7c9e2ec474dd5bf78746ef0b29249dd7818\": rpc error: code = NotFound desc = could not find container \"87861ffe202615731a729fc3b4dbc7c9e2ec474dd5bf78746ef0b29249dd7818\": container with ID starting with 87861ffe202615731a729fc3b4dbc7c9e2ec474dd5bf78746ef0b29249dd7818 not found: ID does not exist" Dec 02 23:41:03 crc kubenswrapper[4903]: I1202 23:41:03.632713 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="999c2f0d-579f-47df-a9fb-78897bbb91da" path="/var/lib/kubelet/pods/999c2f0d-579f-47df-a9fb-78897bbb91da/volumes" Dec 02 23:42:18 crc kubenswrapper[4903]: I1202 23:42:18.357613 4903 generic.go:334] "Generic (PLEG): container finished" podID="013ce0d7-062b-47a7-8831-912380a94a37" containerID="95cf1d58b9d1e83c385b5ae4f85af4452a5179ccdeffff6eb13c154864130a48" exitCode=0 
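
The two "Observed pod startup duration" entries earlier in this log are internally consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration subtracts the image-pull window (firstStartedPulling to lastFinishedPulling), so slow pulls do not count against the startup SLO. For nova-edpm-deployment-openstack-edpm-ipam-m6rln that is 2.826879703s minus 0.556936359s = 2.269943344s, exactly the logged value; the certified-operators-6ghqq entry checks out the same way (5.284619808s minus a ~2.445s pull window, about 2.839s). A small Go check of the first case, a hypothetical helper with the monotonic-clock suffixes (m=+...) dropped from the logged timestamps:

package main

import (
	"fmt"
	"time"
)

// mustParse parses timestamps in the format the kubelet logs them.
func mustParse(s string) time.Time {
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-12-02 23:38:48 +0000 UTC")
	firstPull := mustParse("2025-12-02 23:38:49.714526992 +0000 UTC")
	lastPull := mustParse("2025-12-02 23:38:50.271463351 +0000 UTC")
	running := mustParse("2025-12-02 23:38:50.826879703 +0000 UTC")

	e2e := running.Sub(created)          // podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // podStartSLOduration: pull window excluded
	fmt.Println(e2e, slo)                // prints: 2.826879703s 2.269943344s
}

The same subtraction explains why the two durations in each entry differ even though both describe a single pod start.
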
Dec 02 23:42:18 crc kubenswrapper[4903]: I1202 23:42:18.357703 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln" event={"ID":"013ce0d7-062b-47a7-8831-912380a94a37","Type":"ContainerDied","Data":"95cf1d58b9d1e83c385b5ae4f85af4452a5179ccdeffff6eb13c154864130a48"} Dec 02 23:42:19 crc kubenswrapper[4903]: I1202 23:42:19.947367 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.112915 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-nova-cell1-compute-config-0\") pod \"013ce0d7-062b-47a7-8831-912380a94a37\" (UID: \"013ce0d7-062b-47a7-8831-912380a94a37\") " Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.113007 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/013ce0d7-062b-47a7-8831-912380a94a37-nova-extra-config-0\") pod \"013ce0d7-062b-47a7-8831-912380a94a37\" (UID: \"013ce0d7-062b-47a7-8831-912380a94a37\") " Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.113092 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wq94b\" (UniqueName: \"kubernetes.io/projected/013ce0d7-062b-47a7-8831-912380a94a37-kube-api-access-wq94b\") pod \"013ce0d7-062b-47a7-8831-912380a94a37\" (UID: \"013ce0d7-062b-47a7-8831-912380a94a37\") " Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.113237 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-nova-combined-ca-bundle\") pod \"013ce0d7-062b-47a7-8831-912380a94a37\" (UID: \"013ce0d7-062b-47a7-8831-912380a94a37\") " Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.113277 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-nova-migration-ssh-key-1\") pod \"013ce0d7-062b-47a7-8831-912380a94a37\" (UID: \"013ce0d7-062b-47a7-8831-912380a94a37\") " Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.113358 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-ssh-key\") pod \"013ce0d7-062b-47a7-8831-912380a94a37\" (UID: \"013ce0d7-062b-47a7-8831-912380a94a37\") " Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.113401 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-nova-migration-ssh-key-0\") pod \"013ce0d7-062b-47a7-8831-912380a94a37\" (UID: \"013ce0d7-062b-47a7-8831-912380a94a37\") " Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.113483 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-inventory\") pod \"013ce0d7-062b-47a7-8831-912380a94a37\" (UID: \"013ce0d7-062b-47a7-8831-912380a94a37\") " Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.113600 4903 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-nova-cell1-compute-config-1\") pod \"013ce0d7-062b-47a7-8831-912380a94a37\" (UID: \"013ce0d7-062b-47a7-8831-912380a94a37\") " Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.120585 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/013ce0d7-062b-47a7-8831-912380a94a37-kube-api-access-wq94b" (OuterVolumeSpecName: "kube-api-access-wq94b") pod "013ce0d7-062b-47a7-8831-912380a94a37" (UID: "013ce0d7-062b-47a7-8831-912380a94a37"). InnerVolumeSpecName "kube-api-access-wq94b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.121789 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "013ce0d7-062b-47a7-8831-912380a94a37" (UID: "013ce0d7-062b-47a7-8831-912380a94a37"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.143107 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "013ce0d7-062b-47a7-8831-912380a94a37" (UID: "013ce0d7-062b-47a7-8831-912380a94a37"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.150274 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "013ce0d7-062b-47a7-8831-912380a94a37" (UID: "013ce0d7-062b-47a7-8831-912380a94a37"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.169152 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "013ce0d7-062b-47a7-8831-912380a94a37" (UID: "013ce0d7-062b-47a7-8831-912380a94a37"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.170460 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/013ce0d7-062b-47a7-8831-912380a94a37-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "013ce0d7-062b-47a7-8831-912380a94a37" (UID: "013ce0d7-062b-47a7-8831-912380a94a37"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.174916 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "013ce0d7-062b-47a7-8831-912380a94a37" (UID: "013ce0d7-062b-47a7-8831-912380a94a37"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.177034 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "013ce0d7-062b-47a7-8831-912380a94a37" (UID: "013ce0d7-062b-47a7-8831-912380a94a37"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.186466 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-inventory" (OuterVolumeSpecName: "inventory") pod "013ce0d7-062b-47a7-8831-912380a94a37" (UID: "013ce0d7-062b-47a7-8831-912380a94a37"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.215927 4903 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.215964 4903 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.215980 4903 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.215992 4903 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/013ce0d7-062b-47a7-8831-912380a94a37-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.216005 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wq94b\" (UniqueName: \"kubernetes.io/projected/013ce0d7-062b-47a7-8831-912380a94a37-kube-api-access-wq94b\") on node \"crc\" DevicePath \"\"" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.216020 4903 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.216032 4903 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.216042 4903 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.216053 4903 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/013ce0d7-062b-47a7-8831-912380a94a37-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.387226 4903 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln" event={"ID":"013ce0d7-062b-47a7-8831-912380a94a37","Type":"ContainerDied","Data":"387c260b4748e7edba3cd112639469d403ec8be10ac893671ea08c02d51f1cd4"} Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.387283 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="387c260b4748e7edba3cd112639469d403ec8be10ac893671ea08c02d51f1cd4" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.387384 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m6rln" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.489980 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fkks7"] Dec 02 23:42:20 crc kubenswrapper[4903]: E1202 23:42:20.490349 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="999c2f0d-579f-47df-a9fb-78897bbb91da" containerName="extract-utilities" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.490365 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="999c2f0d-579f-47df-a9fb-78897bbb91da" containerName="extract-utilities" Dec 02 23:42:20 crc kubenswrapper[4903]: E1202 23:42:20.490376 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="999c2f0d-579f-47df-a9fb-78897bbb91da" containerName="registry-server" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.490383 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="999c2f0d-579f-47df-a9fb-78897bbb91da" containerName="registry-server" Dec 02 23:42:20 crc kubenswrapper[4903]: E1202 23:42:20.490398 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="999c2f0d-579f-47df-a9fb-78897bbb91da" containerName="extract-content" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.490403 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="999c2f0d-579f-47df-a9fb-78897bbb91da" containerName="extract-content" Dec 02 23:42:20 crc kubenswrapper[4903]: E1202 23:42:20.490416 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="013ce0d7-062b-47a7-8831-912380a94a37" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.490423 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="013ce0d7-062b-47a7-8831-912380a94a37" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.490598 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="999c2f0d-579f-47df-a9fb-78897bbb91da" containerName="registry-server" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.490607 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="013ce0d7-062b-47a7-8831-912380a94a37" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.491239 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fkks7" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.494883 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.495150 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.495255 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9z6rx" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.495750 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.519148 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fkks7"] Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.550144 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.550542 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fkks7\" (UID: \"2a2b87ac-e673-475f-9ebc-d3387b0e26f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fkks7" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.550620 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fkks7\" (UID: \"2a2b87ac-e673-475f-9ebc-d3387b0e26f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fkks7" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.550680 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fkks7\" (UID: \"2a2b87ac-e673-475f-9ebc-d3387b0e26f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fkks7" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.550797 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fkks7\" (UID: \"2a2b87ac-e673-475f-9ebc-d3387b0e26f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fkks7" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.550870 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4vhn\" (UniqueName: \"kubernetes.io/projected/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-kube-api-access-h4vhn\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fkks7\" (UID: \"2a2b87ac-e673-475f-9ebc-d3387b0e26f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fkks7" Dec 02 23:42:20 
crc kubenswrapper[4903]: I1202 23:42:20.550928 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fkks7\" (UID: \"2a2b87ac-e673-475f-9ebc-d3387b0e26f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fkks7" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.551140 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fkks7\" (UID: \"2a2b87ac-e673-475f-9ebc-d3387b0e26f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fkks7" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.653429 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fkks7\" (UID: \"2a2b87ac-e673-475f-9ebc-d3387b0e26f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fkks7" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.653547 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fkks7\" (UID: \"2a2b87ac-e673-475f-9ebc-d3387b0e26f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fkks7" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.653675 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fkks7\" (UID: \"2a2b87ac-e673-475f-9ebc-d3387b0e26f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fkks7" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.653929 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fkks7\" (UID: \"2a2b87ac-e673-475f-9ebc-d3387b0e26f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fkks7" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.654045 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4vhn\" (UniqueName: \"kubernetes.io/projected/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-kube-api-access-h4vhn\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fkks7\" (UID: \"2a2b87ac-e673-475f-9ebc-d3387b0e26f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fkks7" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.654094 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fkks7\" (UID: \"2a2b87ac-e673-475f-9ebc-d3387b0e26f2\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fkks7" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.654453 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fkks7\" (UID: \"2a2b87ac-e673-475f-9ebc-d3387b0e26f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fkks7" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.657603 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fkks7\" (UID: \"2a2b87ac-e673-475f-9ebc-d3387b0e26f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fkks7" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.658103 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fkks7\" (UID: \"2a2b87ac-e673-475f-9ebc-d3387b0e26f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fkks7" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.658591 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fkks7\" (UID: \"2a2b87ac-e673-475f-9ebc-d3387b0e26f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fkks7" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.658672 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fkks7\" (UID: \"2a2b87ac-e673-475f-9ebc-d3387b0e26f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fkks7" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.659231 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fkks7\" (UID: \"2a2b87ac-e673-475f-9ebc-d3387b0e26f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fkks7" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.660618 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fkks7\" (UID: \"2a2b87ac-e673-475f-9ebc-d3387b0e26f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fkks7" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.677875 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4vhn\" (UniqueName: \"kubernetes.io/projected/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-kube-api-access-h4vhn\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fkks7\" (UID: 
\"2a2b87ac-e673-475f-9ebc-d3387b0e26f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fkks7" Dec 02 23:42:20 crc kubenswrapper[4903]: I1202 23:42:20.857396 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fkks7" Dec 02 23:42:21 crc kubenswrapper[4903]: W1202 23:42:21.446575 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a2b87ac_e673_475f_9ebc_d3387b0e26f2.slice/crio-cca8c44e5ebbd0017c1317b2f7b161c03b177970b3682804dfd87b6c02a52b38 WatchSource:0}: Error finding container cca8c44e5ebbd0017c1317b2f7b161c03b177970b3682804dfd87b6c02a52b38: Status 404 returned error can't find the container with id cca8c44e5ebbd0017c1317b2f7b161c03b177970b3682804dfd87b6c02a52b38 Dec 02 23:42:21 crc kubenswrapper[4903]: I1202 23:42:21.450024 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fkks7"] Dec 02 23:42:22 crc kubenswrapper[4903]: I1202 23:42:22.429896 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fkks7" event={"ID":"2a2b87ac-e673-475f-9ebc-d3387b0e26f2","Type":"ContainerStarted","Data":"5d0ded2315cf494d6f44c96b558292d2cf1781bee9a2db9b6a6769336ce675dc"} Dec 02 23:42:22 crc kubenswrapper[4903]: I1202 23:42:22.430267 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fkks7" event={"ID":"2a2b87ac-e673-475f-9ebc-d3387b0e26f2","Type":"ContainerStarted","Data":"cca8c44e5ebbd0017c1317b2f7b161c03b177970b3682804dfd87b6c02a52b38"} Dec 02 23:42:22 crc kubenswrapper[4903]: I1202 23:42:22.458168 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fkks7" podStartSLOduration=1.941887539 podStartE2EDuration="2.458150416s" podCreationTimestamp="2025-12-02 23:42:20 +0000 UTC" firstStartedPulling="2025-12-02 23:42:21.454140302 +0000 UTC m=+2680.162694615" lastFinishedPulling="2025-12-02 23:42:21.970403199 +0000 UTC m=+2680.678957492" observedRunningTime="2025-12-02 23:42:22.456322152 +0000 UTC m=+2681.164876475" watchObservedRunningTime="2025-12-02 23:42:22.458150416 +0000 UTC m=+2681.166704689" Dec 02 23:42:23 crc kubenswrapper[4903]: I1202 23:42:23.069924 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:42:23 crc kubenswrapper[4903]: I1202 23:42:23.070310 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:42:53 crc kubenswrapper[4903]: I1202 23:42:53.070065 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:42:53 crc kubenswrapper[4903]: I1202 23:42:53.070971 4903 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:43:23 crc kubenswrapper[4903]: I1202 23:43:23.070142 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:43:23 crc kubenswrapper[4903]: I1202 23:43:23.071135 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:43:23 crc kubenswrapper[4903]: I1202 23:43:23.071212 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" Dec 02 23:43:23 crc kubenswrapper[4903]: I1202 23:43:23.072532 4903 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"334e7f1c386c5773dacb43719b040835a12d8d082105a64c0872016257ba6a7f"} pod="openshift-machine-config-operator/machine-config-daemon-snl4q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 23:43:23 crc kubenswrapper[4903]: I1202 23:43:23.072647 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" containerID="cri-o://334e7f1c386c5773dacb43719b040835a12d8d082105a64c0872016257ba6a7f" gracePeriod=600 Dec 02 23:43:24 crc kubenswrapper[4903]: I1202 23:43:24.186355 4903 generic.go:334] "Generic (PLEG): container finished" podID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerID="334e7f1c386c5773dacb43719b040835a12d8d082105a64c0872016257ba6a7f" exitCode=0 Dec 02 23:43:24 crc kubenswrapper[4903]: I1202 23:43:24.186457 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerDied","Data":"334e7f1c386c5773dacb43719b040835a12d8d082105a64c0872016257ba6a7f"} Dec 02 23:43:24 crc kubenswrapper[4903]: I1202 23:43:24.187243 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerStarted","Data":"9f37abbef27e9d2f37fb101505e46a6c8bdcf9247409bdb0ed2ae35a6f381181"} Dec 02 23:43:24 crc kubenswrapper[4903]: I1202 23:43:24.187270 4903 scope.go:117] "RemoveContainer" containerID="cb019dd8555c680c4295d44d0c399ca95f9121e180d4cfb927832eb9bf96512d" Dec 02 23:44:22 crc kubenswrapper[4903]: I1202 23:44:22.478296 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wxk97"] Dec 02 23:44:22 crc kubenswrapper[4903]: I1202 23:44:22.481156 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wxk97" Dec 02 23:44:22 crc kubenswrapper[4903]: I1202 23:44:22.496153 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc980d15-c176-4530-a0e8-701277b0160b-utilities\") pod \"redhat-marketplace-wxk97\" (UID: \"dc980d15-c176-4530-a0e8-701277b0160b\") " pod="openshift-marketplace/redhat-marketplace-wxk97" Dec 02 23:44:22 crc kubenswrapper[4903]: I1202 23:44:22.496223 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc980d15-c176-4530-a0e8-701277b0160b-catalog-content\") pod \"redhat-marketplace-wxk97\" (UID: \"dc980d15-c176-4530-a0e8-701277b0160b\") " pod="openshift-marketplace/redhat-marketplace-wxk97" Dec 02 23:44:22 crc kubenswrapper[4903]: I1202 23:44:22.496394 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g28vv\" (UniqueName: \"kubernetes.io/projected/dc980d15-c176-4530-a0e8-701277b0160b-kube-api-access-g28vv\") pod \"redhat-marketplace-wxk97\" (UID: \"dc980d15-c176-4530-a0e8-701277b0160b\") " pod="openshift-marketplace/redhat-marketplace-wxk97" Dec 02 23:44:22 crc kubenswrapper[4903]: I1202 23:44:22.499672 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wxk97"] Dec 02 23:44:22 crc kubenswrapper[4903]: I1202 23:44:22.597960 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g28vv\" (UniqueName: \"kubernetes.io/projected/dc980d15-c176-4530-a0e8-701277b0160b-kube-api-access-g28vv\") pod \"redhat-marketplace-wxk97\" (UID: \"dc980d15-c176-4530-a0e8-701277b0160b\") " pod="openshift-marketplace/redhat-marketplace-wxk97" Dec 02 23:44:22 crc kubenswrapper[4903]: I1202 23:44:22.598097 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc980d15-c176-4530-a0e8-701277b0160b-utilities\") pod \"redhat-marketplace-wxk97\" (UID: \"dc980d15-c176-4530-a0e8-701277b0160b\") " pod="openshift-marketplace/redhat-marketplace-wxk97" Dec 02 23:44:22 crc kubenswrapper[4903]: I1202 23:44:22.598153 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc980d15-c176-4530-a0e8-701277b0160b-catalog-content\") pod \"redhat-marketplace-wxk97\" (UID: \"dc980d15-c176-4530-a0e8-701277b0160b\") " pod="openshift-marketplace/redhat-marketplace-wxk97" Dec 02 23:44:22 crc kubenswrapper[4903]: I1202 23:44:22.598598 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc980d15-c176-4530-a0e8-701277b0160b-utilities\") pod \"redhat-marketplace-wxk97\" (UID: \"dc980d15-c176-4530-a0e8-701277b0160b\") " pod="openshift-marketplace/redhat-marketplace-wxk97" Dec 02 23:44:22 crc kubenswrapper[4903]: I1202 23:44:22.598638 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc980d15-c176-4530-a0e8-701277b0160b-catalog-content\") pod \"redhat-marketplace-wxk97\" (UID: \"dc980d15-c176-4530-a0e8-701277b0160b\") " pod="openshift-marketplace/redhat-marketplace-wxk97" Dec 02 23:44:22 crc kubenswrapper[4903]: I1202 23:44:22.617122 4903 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-g28vv\" (UniqueName: \"kubernetes.io/projected/dc980d15-c176-4530-a0e8-701277b0160b-kube-api-access-g28vv\") pod \"redhat-marketplace-wxk97\" (UID: \"dc980d15-c176-4530-a0e8-701277b0160b\") " pod="openshift-marketplace/redhat-marketplace-wxk97" Dec 02 23:44:22 crc kubenswrapper[4903]: I1202 23:44:22.809600 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wxk97" Dec 02 23:44:23 crc kubenswrapper[4903]: I1202 23:44:23.318019 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wxk97"] Dec 02 23:44:23 crc kubenswrapper[4903]: I1202 23:44:23.931143 4903 generic.go:334] "Generic (PLEG): container finished" podID="dc980d15-c176-4530-a0e8-701277b0160b" containerID="41ee2499f193028c5926eaee34b1d61d0039e5a05614f96be18f966cd74797c6" exitCode=0 Dec 02 23:44:23 crc kubenswrapper[4903]: I1202 23:44:23.931489 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wxk97" event={"ID":"dc980d15-c176-4530-a0e8-701277b0160b","Type":"ContainerDied","Data":"41ee2499f193028c5926eaee34b1d61d0039e5a05614f96be18f966cd74797c6"} Dec 02 23:44:23 crc kubenswrapper[4903]: I1202 23:44:23.931530 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wxk97" event={"ID":"dc980d15-c176-4530-a0e8-701277b0160b","Type":"ContainerStarted","Data":"e2c7a1da0225459aaaa7170864a1c9db36acf3dda4aab9aadb3a492101c2110b"} Dec 02 23:44:24 crc kubenswrapper[4903]: I1202 23:44:24.943193 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wxk97" event={"ID":"dc980d15-c176-4530-a0e8-701277b0160b","Type":"ContainerStarted","Data":"ec3a1d052f72a1f578f0e6411270b8199d706c041d7ff5d18bb198c9793f2d1c"} Dec 02 23:44:25 crc kubenswrapper[4903]: I1202 23:44:25.960134 4903 generic.go:334] "Generic (PLEG): container finished" podID="dc980d15-c176-4530-a0e8-701277b0160b" containerID="ec3a1d052f72a1f578f0e6411270b8199d706c041d7ff5d18bb198c9793f2d1c" exitCode=0 Dec 02 23:44:25 crc kubenswrapper[4903]: I1202 23:44:25.960456 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wxk97" event={"ID":"dc980d15-c176-4530-a0e8-701277b0160b","Type":"ContainerDied","Data":"ec3a1d052f72a1f578f0e6411270b8199d706c041d7ff5d18bb198c9793f2d1c"} Dec 02 23:44:26 crc kubenswrapper[4903]: I1202 23:44:26.974618 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wxk97" event={"ID":"dc980d15-c176-4530-a0e8-701277b0160b","Type":"ContainerStarted","Data":"61eb94dfcac6512245af9f90e6b29de97da78d7946acb6f399db6d821c14d1d3"} Dec 02 23:44:27 crc kubenswrapper[4903]: I1202 23:44:27.008699 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wxk97" podStartSLOduration=2.459812901 podStartE2EDuration="5.008678096s" podCreationTimestamp="2025-12-02 23:44:22 +0000 UTC" firstStartedPulling="2025-12-02 23:44:23.933357789 +0000 UTC m=+2802.641912102" lastFinishedPulling="2025-12-02 23:44:26.482222984 +0000 UTC m=+2805.190777297" observedRunningTime="2025-12-02 23:44:26.999424183 +0000 UTC m=+2805.707978516" watchObservedRunningTime="2025-12-02 23:44:27.008678096 +0000 UTC m=+2805.717232389" Dec 02 23:44:32 crc kubenswrapper[4903]: I1202 23:44:32.811071 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-wxk97" Dec 02 23:44:32 crc kubenswrapper[4903]: I1202 23:44:32.811785 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wxk97" Dec 02 23:44:32 crc kubenswrapper[4903]: I1202 23:44:32.888246 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wxk97" Dec 02 23:44:33 crc kubenswrapper[4903]: I1202 23:44:33.118870 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wxk97" Dec 02 23:44:33 crc kubenswrapper[4903]: I1202 23:44:33.173681 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wxk97"] Dec 02 23:44:35 crc kubenswrapper[4903]: I1202 23:44:35.066478 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wxk97" podUID="dc980d15-c176-4530-a0e8-701277b0160b" containerName="registry-server" containerID="cri-o://61eb94dfcac6512245af9f90e6b29de97da78d7946acb6f399db6d821c14d1d3" gracePeriod=2 Dec 02 23:44:35 crc kubenswrapper[4903]: I1202 23:44:35.594789 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wxk97" Dec 02 23:44:35 crc kubenswrapper[4903]: I1202 23:44:35.740893 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g28vv\" (UniqueName: \"kubernetes.io/projected/dc980d15-c176-4530-a0e8-701277b0160b-kube-api-access-g28vv\") pod \"dc980d15-c176-4530-a0e8-701277b0160b\" (UID: \"dc980d15-c176-4530-a0e8-701277b0160b\") " Dec 02 23:44:35 crc kubenswrapper[4903]: I1202 23:44:35.740996 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc980d15-c176-4530-a0e8-701277b0160b-utilities\") pod \"dc980d15-c176-4530-a0e8-701277b0160b\" (UID: \"dc980d15-c176-4530-a0e8-701277b0160b\") " Dec 02 23:44:35 crc kubenswrapper[4903]: I1202 23:44:35.741020 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc980d15-c176-4530-a0e8-701277b0160b-catalog-content\") pod \"dc980d15-c176-4530-a0e8-701277b0160b\" (UID: \"dc980d15-c176-4530-a0e8-701277b0160b\") " Dec 02 23:44:35 crc kubenswrapper[4903]: I1202 23:44:35.744009 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc980d15-c176-4530-a0e8-701277b0160b-utilities" (OuterVolumeSpecName: "utilities") pod "dc980d15-c176-4530-a0e8-701277b0160b" (UID: "dc980d15-c176-4530-a0e8-701277b0160b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:44:35 crc kubenswrapper[4903]: I1202 23:44:35.751819 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc980d15-c176-4530-a0e8-701277b0160b-kube-api-access-g28vv" (OuterVolumeSpecName: "kube-api-access-g28vv") pod "dc980d15-c176-4530-a0e8-701277b0160b" (UID: "dc980d15-c176-4530-a0e8-701277b0160b"). InnerVolumeSpecName "kube-api-access-g28vv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:44:35 crc kubenswrapper[4903]: I1202 23:44:35.798969 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc980d15-c176-4530-a0e8-701277b0160b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dc980d15-c176-4530-a0e8-701277b0160b" (UID: "dc980d15-c176-4530-a0e8-701277b0160b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:44:35 crc kubenswrapper[4903]: I1202 23:44:35.847913 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc980d15-c176-4530-a0e8-701277b0160b-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:44:35 crc kubenswrapper[4903]: I1202 23:44:35.847950 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc980d15-c176-4530-a0e8-701277b0160b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:44:35 crc kubenswrapper[4903]: I1202 23:44:35.847960 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g28vv\" (UniqueName: \"kubernetes.io/projected/dc980d15-c176-4530-a0e8-701277b0160b-kube-api-access-g28vv\") on node \"crc\" DevicePath \"\"" Dec 02 23:44:36 crc kubenswrapper[4903]: I1202 23:44:36.080637 4903 generic.go:334] "Generic (PLEG): container finished" podID="dc980d15-c176-4530-a0e8-701277b0160b" containerID="61eb94dfcac6512245af9f90e6b29de97da78d7946acb6f399db6d821c14d1d3" exitCode=0 Dec 02 23:44:36 crc kubenswrapper[4903]: I1202 23:44:36.080686 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wxk97" event={"ID":"dc980d15-c176-4530-a0e8-701277b0160b","Type":"ContainerDied","Data":"61eb94dfcac6512245af9f90e6b29de97da78d7946acb6f399db6d821c14d1d3"} Dec 02 23:44:36 crc kubenswrapper[4903]: I1202 23:44:36.080744 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wxk97" event={"ID":"dc980d15-c176-4530-a0e8-701277b0160b","Type":"ContainerDied","Data":"e2c7a1da0225459aaaa7170864a1c9db36acf3dda4aab9aadb3a492101c2110b"} Dec 02 23:44:36 crc kubenswrapper[4903]: I1202 23:44:36.080767 4903 scope.go:117] "RemoveContainer" containerID="61eb94dfcac6512245af9f90e6b29de97da78d7946acb6f399db6d821c14d1d3" Dec 02 23:44:36 crc kubenswrapper[4903]: I1202 23:44:36.080767 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wxk97" Dec 02 23:44:36 crc kubenswrapper[4903]: I1202 23:44:36.117076 4903 scope.go:117] "RemoveContainer" containerID="ec3a1d052f72a1f578f0e6411270b8199d706c041d7ff5d18bb198c9793f2d1c" Dec 02 23:44:36 crc kubenswrapper[4903]: I1202 23:44:36.143536 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wxk97"] Dec 02 23:44:36 crc kubenswrapper[4903]: I1202 23:44:36.154486 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wxk97"] Dec 02 23:44:36 crc kubenswrapper[4903]: I1202 23:44:36.169981 4903 scope.go:117] "RemoveContainer" containerID="41ee2499f193028c5926eaee34b1d61d0039e5a05614f96be18f966cd74797c6" Dec 02 23:44:36 crc kubenswrapper[4903]: I1202 23:44:36.243123 4903 scope.go:117] "RemoveContainer" containerID="61eb94dfcac6512245af9f90e6b29de97da78d7946acb6f399db6d821c14d1d3" Dec 02 23:44:36 crc kubenswrapper[4903]: E1202 23:44:36.243529 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61eb94dfcac6512245af9f90e6b29de97da78d7946acb6f399db6d821c14d1d3\": container with ID starting with 61eb94dfcac6512245af9f90e6b29de97da78d7946acb6f399db6d821c14d1d3 not found: ID does not exist" containerID="61eb94dfcac6512245af9f90e6b29de97da78d7946acb6f399db6d821c14d1d3" Dec 02 23:44:36 crc kubenswrapper[4903]: I1202 23:44:36.243557 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61eb94dfcac6512245af9f90e6b29de97da78d7946acb6f399db6d821c14d1d3"} err="failed to get container status \"61eb94dfcac6512245af9f90e6b29de97da78d7946acb6f399db6d821c14d1d3\": rpc error: code = NotFound desc = could not find container \"61eb94dfcac6512245af9f90e6b29de97da78d7946acb6f399db6d821c14d1d3\": container with ID starting with 61eb94dfcac6512245af9f90e6b29de97da78d7946acb6f399db6d821c14d1d3 not found: ID does not exist" Dec 02 23:44:36 crc kubenswrapper[4903]: I1202 23:44:36.243578 4903 scope.go:117] "RemoveContainer" containerID="ec3a1d052f72a1f578f0e6411270b8199d706c041d7ff5d18bb198c9793f2d1c" Dec 02 23:44:36 crc kubenswrapper[4903]: E1202 23:44:36.243853 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec3a1d052f72a1f578f0e6411270b8199d706c041d7ff5d18bb198c9793f2d1c\": container with ID starting with ec3a1d052f72a1f578f0e6411270b8199d706c041d7ff5d18bb198c9793f2d1c not found: ID does not exist" containerID="ec3a1d052f72a1f578f0e6411270b8199d706c041d7ff5d18bb198c9793f2d1c" Dec 02 23:44:36 crc kubenswrapper[4903]: I1202 23:44:36.243883 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec3a1d052f72a1f578f0e6411270b8199d706c041d7ff5d18bb198c9793f2d1c"} err="failed to get container status \"ec3a1d052f72a1f578f0e6411270b8199d706c041d7ff5d18bb198c9793f2d1c\": rpc error: code = NotFound desc = could not find container \"ec3a1d052f72a1f578f0e6411270b8199d706c041d7ff5d18bb198c9793f2d1c\": container with ID starting with ec3a1d052f72a1f578f0e6411270b8199d706c041d7ff5d18bb198c9793f2d1c not found: ID does not exist" Dec 02 23:44:36 crc kubenswrapper[4903]: I1202 23:44:36.243904 4903 scope.go:117] "RemoveContainer" containerID="41ee2499f193028c5926eaee34b1d61d0039e5a05614f96be18f966cd74797c6" Dec 02 23:44:36 crc kubenswrapper[4903]: E1202 23:44:36.244263 4903 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"41ee2499f193028c5926eaee34b1d61d0039e5a05614f96be18f966cd74797c6\": container with ID starting with 41ee2499f193028c5926eaee34b1d61d0039e5a05614f96be18f966cd74797c6 not found: ID does not exist" containerID="41ee2499f193028c5926eaee34b1d61d0039e5a05614f96be18f966cd74797c6" Dec 02 23:44:36 crc kubenswrapper[4903]: I1202 23:44:36.244304 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41ee2499f193028c5926eaee34b1d61d0039e5a05614f96be18f966cd74797c6"} err="failed to get container status \"41ee2499f193028c5926eaee34b1d61d0039e5a05614f96be18f966cd74797c6\": rpc error: code = NotFound desc = could not find container \"41ee2499f193028c5926eaee34b1d61d0039e5a05614f96be18f966cd74797c6\": container with ID starting with 41ee2499f193028c5926eaee34b1d61d0039e5a05614f96be18f966cd74797c6 not found: ID does not exist" Dec 02 23:44:37 crc kubenswrapper[4903]: I1202 23:44:37.633519 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc980d15-c176-4530-a0e8-701277b0160b" path="/var/lib/kubelet/pods/dc980d15-c176-4530-a0e8-701277b0160b/volumes" Dec 02 23:44:38 crc kubenswrapper[4903]: I1202 23:44:38.553133 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ksl4b"] Dec 02 23:44:38 crc kubenswrapper[4903]: E1202 23:44:38.554396 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc980d15-c176-4530-a0e8-701277b0160b" containerName="registry-server" Dec 02 23:44:38 crc kubenswrapper[4903]: I1202 23:44:38.554422 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc980d15-c176-4530-a0e8-701277b0160b" containerName="registry-server" Dec 02 23:44:38 crc kubenswrapper[4903]: E1202 23:44:38.554453 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc980d15-c176-4530-a0e8-701277b0160b" containerName="extract-utilities" Dec 02 23:44:38 crc kubenswrapper[4903]: I1202 23:44:38.554464 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc980d15-c176-4530-a0e8-701277b0160b" containerName="extract-utilities" Dec 02 23:44:38 crc kubenswrapper[4903]: E1202 23:44:38.554495 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc980d15-c176-4530-a0e8-701277b0160b" containerName="extract-content" Dec 02 23:44:38 crc kubenswrapper[4903]: I1202 23:44:38.554505 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc980d15-c176-4530-a0e8-701277b0160b" containerName="extract-content" Dec 02 23:44:38 crc kubenswrapper[4903]: I1202 23:44:38.554830 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc980d15-c176-4530-a0e8-701277b0160b" containerName="registry-server" Dec 02 23:44:38 crc kubenswrapper[4903]: I1202 23:44:38.557889 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ksl4b" Dec 02 23:44:38 crc kubenswrapper[4903]: I1202 23:44:38.575261 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ksl4b"] Dec 02 23:44:38 crc kubenswrapper[4903]: I1202 23:44:38.707775 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa1d0c57-f336-41c0-b65d-c4d8928b734c-catalog-content\") pod \"redhat-operators-ksl4b\" (UID: \"fa1d0c57-f336-41c0-b65d-c4d8928b734c\") " pod="openshift-marketplace/redhat-operators-ksl4b" Dec 02 23:44:38 crc kubenswrapper[4903]: I1202 23:44:38.707875 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnfbk\" (UniqueName: \"kubernetes.io/projected/fa1d0c57-f336-41c0-b65d-c4d8928b734c-kube-api-access-cnfbk\") pod \"redhat-operators-ksl4b\" (UID: \"fa1d0c57-f336-41c0-b65d-c4d8928b734c\") " pod="openshift-marketplace/redhat-operators-ksl4b" Dec 02 23:44:38 crc kubenswrapper[4903]: I1202 23:44:38.707966 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa1d0c57-f336-41c0-b65d-c4d8928b734c-utilities\") pod \"redhat-operators-ksl4b\" (UID: \"fa1d0c57-f336-41c0-b65d-c4d8928b734c\") " pod="openshift-marketplace/redhat-operators-ksl4b" Dec 02 23:44:38 crc kubenswrapper[4903]: I1202 23:44:38.810150 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa1d0c57-f336-41c0-b65d-c4d8928b734c-utilities\") pod \"redhat-operators-ksl4b\" (UID: \"fa1d0c57-f336-41c0-b65d-c4d8928b734c\") " pod="openshift-marketplace/redhat-operators-ksl4b" Dec 02 23:44:38 crc kubenswrapper[4903]: I1202 23:44:38.810426 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa1d0c57-f336-41c0-b65d-c4d8928b734c-catalog-content\") pod \"redhat-operators-ksl4b\" (UID: \"fa1d0c57-f336-41c0-b65d-c4d8928b734c\") " pod="openshift-marketplace/redhat-operators-ksl4b" Dec 02 23:44:38 crc kubenswrapper[4903]: I1202 23:44:38.810754 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnfbk\" (UniqueName: \"kubernetes.io/projected/fa1d0c57-f336-41c0-b65d-c4d8928b734c-kube-api-access-cnfbk\") pod \"redhat-operators-ksl4b\" (UID: \"fa1d0c57-f336-41c0-b65d-c4d8928b734c\") " pod="openshift-marketplace/redhat-operators-ksl4b" Dec 02 23:44:38 crc kubenswrapper[4903]: I1202 23:44:38.811480 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa1d0c57-f336-41c0-b65d-c4d8928b734c-catalog-content\") pod \"redhat-operators-ksl4b\" (UID: \"fa1d0c57-f336-41c0-b65d-c4d8928b734c\") " pod="openshift-marketplace/redhat-operators-ksl4b" Dec 02 23:44:38 crc kubenswrapper[4903]: I1202 23:44:38.812232 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa1d0c57-f336-41c0-b65d-c4d8928b734c-utilities\") pod \"redhat-operators-ksl4b\" (UID: \"fa1d0c57-f336-41c0-b65d-c4d8928b734c\") " pod="openshift-marketplace/redhat-operators-ksl4b" Dec 02 23:44:38 crc kubenswrapper[4903]: I1202 23:44:38.841955 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-cnfbk\" (UniqueName: \"kubernetes.io/projected/fa1d0c57-f336-41c0-b65d-c4d8928b734c-kube-api-access-cnfbk\") pod \"redhat-operators-ksl4b\" (UID: \"fa1d0c57-f336-41c0-b65d-c4d8928b734c\") " pod="openshift-marketplace/redhat-operators-ksl4b" Dec 02 23:44:38 crc kubenswrapper[4903]: I1202 23:44:38.916366 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ksl4b" Dec 02 23:44:39 crc kubenswrapper[4903]: I1202 23:44:39.379598 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ksl4b"] Dec 02 23:44:40 crc kubenswrapper[4903]: I1202 23:44:40.138814 4903 generic.go:334] "Generic (PLEG): container finished" podID="fa1d0c57-f336-41c0-b65d-c4d8928b734c" containerID="3110685857d4a64d5951a8f92bd25bb8b7ce3ff4d87c557a2e48c52cee1e0b53" exitCode=0 Dec 02 23:44:40 crc kubenswrapper[4903]: I1202 23:44:40.138946 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ksl4b" event={"ID":"fa1d0c57-f336-41c0-b65d-c4d8928b734c","Type":"ContainerDied","Data":"3110685857d4a64d5951a8f92bd25bb8b7ce3ff4d87c557a2e48c52cee1e0b53"} Dec 02 23:44:40 crc kubenswrapper[4903]: I1202 23:44:40.139249 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ksl4b" event={"ID":"fa1d0c57-f336-41c0-b65d-c4d8928b734c","Type":"ContainerStarted","Data":"60ff4c3925de75c9504420b76c9e07b01f586ccb680dafac58fa7405d5e86288"} Dec 02 23:44:42 crc kubenswrapper[4903]: I1202 23:44:42.168518 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ksl4b" event={"ID":"fa1d0c57-f336-41c0-b65d-c4d8928b734c","Type":"ContainerStarted","Data":"7c2667a8c5b5348ae0303d12f4a1d777bd11843321e9175470363f8016943b1e"} Dec 02 23:44:44 crc kubenswrapper[4903]: I1202 23:44:44.195558 4903 generic.go:334] "Generic (PLEG): container finished" podID="fa1d0c57-f336-41c0-b65d-c4d8928b734c" containerID="7c2667a8c5b5348ae0303d12f4a1d777bd11843321e9175470363f8016943b1e" exitCode=0 Dec 02 23:44:44 crc kubenswrapper[4903]: I1202 23:44:44.196031 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ksl4b" event={"ID":"fa1d0c57-f336-41c0-b65d-c4d8928b734c","Type":"ContainerDied","Data":"7c2667a8c5b5348ae0303d12f4a1d777bd11843321e9175470363f8016943b1e"} Dec 02 23:44:45 crc kubenswrapper[4903]: I1202 23:44:45.217627 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ksl4b" event={"ID":"fa1d0c57-f336-41c0-b65d-c4d8928b734c","Type":"ContainerStarted","Data":"b4e4201c8db9fbfd9224522afa85b3a4cd804d4edabd9168a859fbba5a8eb502"} Dec 02 23:44:45 crc kubenswrapper[4903]: I1202 23:44:45.249071 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ksl4b" podStartSLOduration=2.788361231 podStartE2EDuration="7.249053529s" podCreationTimestamp="2025-12-02 23:44:38 +0000 UTC" firstStartedPulling="2025-12-02 23:44:40.143075486 +0000 UTC m=+2818.851629799" lastFinishedPulling="2025-12-02 23:44:44.603767784 +0000 UTC m=+2823.312322097" observedRunningTime="2025-12-02 23:44:45.244907698 +0000 UTC m=+2823.953462011" watchObservedRunningTime="2025-12-02 23:44:45.249053529 +0000 UTC m=+2823.957607812" Dec 02 23:44:48 crc kubenswrapper[4903]: I1202 23:44:48.917401 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ksl4b" 
Dec 02 23:44:48 crc kubenswrapper[4903]: I1202 23:44:48.918283 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ksl4b" Dec 02 23:44:50 crc kubenswrapper[4903]: I1202 23:44:50.015408 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ksl4b" podUID="fa1d0c57-f336-41c0-b65d-c4d8928b734c" containerName="registry-server" probeResult="failure" output=< Dec 02 23:44:50 crc kubenswrapper[4903]: timeout: failed to connect service ":50051" within 1s Dec 02 23:44:50 crc kubenswrapper[4903]: > Dec 02 23:44:54 crc kubenswrapper[4903]: I1202 23:44:54.344109 4903 generic.go:334] "Generic (PLEG): container finished" podID="2a2b87ac-e673-475f-9ebc-d3387b0e26f2" containerID="5d0ded2315cf494d6f44c96b558292d2cf1781bee9a2db9b6a6769336ce675dc" exitCode=0 Dec 02 23:44:54 crc kubenswrapper[4903]: I1202 23:44:54.344569 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fkks7" event={"ID":"2a2b87ac-e673-475f-9ebc-d3387b0e26f2","Type":"ContainerDied","Data":"5d0ded2315cf494d6f44c96b558292d2cf1781bee9a2db9b6a6769336ce675dc"} Dec 02 23:44:55 crc kubenswrapper[4903]: I1202 23:44:55.899992 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fkks7" Dec 02 23:44:56 crc kubenswrapper[4903]: I1202 23:44:56.051647 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-ceilometer-compute-config-data-2\") pod \"2a2b87ac-e673-475f-9ebc-d3387b0e26f2\" (UID: \"2a2b87ac-e673-475f-9ebc-d3387b0e26f2\") " Dec 02 23:44:56 crc kubenswrapper[4903]: I1202 23:44:56.051820 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-ceilometer-compute-config-data-1\") pod \"2a2b87ac-e673-475f-9ebc-d3387b0e26f2\" (UID: \"2a2b87ac-e673-475f-9ebc-d3387b0e26f2\") " Dec 02 23:44:56 crc kubenswrapper[4903]: I1202 23:44:56.052076 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-telemetry-combined-ca-bundle\") pod \"2a2b87ac-e673-475f-9ebc-d3387b0e26f2\" (UID: \"2a2b87ac-e673-475f-9ebc-d3387b0e26f2\") " Dec 02 23:44:56 crc kubenswrapper[4903]: I1202 23:44:56.052116 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-ssh-key\") pod \"2a2b87ac-e673-475f-9ebc-d3387b0e26f2\" (UID: \"2a2b87ac-e673-475f-9ebc-d3387b0e26f2\") " Dec 02 23:44:56 crc kubenswrapper[4903]: I1202 23:44:56.052213 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4vhn\" (UniqueName: \"kubernetes.io/projected/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-kube-api-access-h4vhn\") pod \"2a2b87ac-e673-475f-9ebc-d3387b0e26f2\" (UID: \"2a2b87ac-e673-475f-9ebc-d3387b0e26f2\") " Dec 02 23:44:56 crc kubenswrapper[4903]: I1202 23:44:56.052252 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-inventory\") pod \"2a2b87ac-e673-475f-9ebc-d3387b0e26f2\" 
(UID: \"2a2b87ac-e673-475f-9ebc-d3387b0e26f2\") " Dec 02 23:44:56 crc kubenswrapper[4903]: I1202 23:44:56.052294 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-ceilometer-compute-config-data-0\") pod \"2a2b87ac-e673-475f-9ebc-d3387b0e26f2\" (UID: \"2a2b87ac-e673-475f-9ebc-d3387b0e26f2\") " Dec 02 23:44:56 crc kubenswrapper[4903]: I1202 23:44:56.058633 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-kube-api-access-h4vhn" (OuterVolumeSpecName: "kube-api-access-h4vhn") pod "2a2b87ac-e673-475f-9ebc-d3387b0e26f2" (UID: "2a2b87ac-e673-475f-9ebc-d3387b0e26f2"). InnerVolumeSpecName "kube-api-access-h4vhn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:44:56 crc kubenswrapper[4903]: I1202 23:44:56.065916 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "2a2b87ac-e673-475f-9ebc-d3387b0e26f2" (UID: "2a2b87ac-e673-475f-9ebc-d3387b0e26f2"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:44:56 crc kubenswrapper[4903]: I1202 23:44:56.094720 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-inventory" (OuterVolumeSpecName: "inventory") pod "2a2b87ac-e673-475f-9ebc-d3387b0e26f2" (UID: "2a2b87ac-e673-475f-9ebc-d3387b0e26f2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:44:56 crc kubenswrapper[4903]: I1202 23:44:56.095588 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "2a2b87ac-e673-475f-9ebc-d3387b0e26f2" (UID: "2a2b87ac-e673-475f-9ebc-d3387b0e26f2"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:44:56 crc kubenswrapper[4903]: I1202 23:44:56.120149 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2a2b87ac-e673-475f-9ebc-d3387b0e26f2" (UID: "2a2b87ac-e673-475f-9ebc-d3387b0e26f2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:44:56 crc kubenswrapper[4903]: I1202 23:44:56.124224 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "2a2b87ac-e673-475f-9ebc-d3387b0e26f2" (UID: "2a2b87ac-e673-475f-9ebc-d3387b0e26f2"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:44:56 crc kubenswrapper[4903]: I1202 23:44:56.141007 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "2a2b87ac-e673-475f-9ebc-d3387b0e26f2" (UID: "2a2b87ac-e673-475f-9ebc-d3387b0e26f2"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:44:56 crc kubenswrapper[4903]: I1202 23:44:56.156146 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4vhn\" (UniqueName: \"kubernetes.io/projected/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-kube-api-access-h4vhn\") on node \"crc\" DevicePath \"\"" Dec 02 23:44:56 crc kubenswrapper[4903]: I1202 23:44:56.156191 4903 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 23:44:56 crc kubenswrapper[4903]: I1202 23:44:56.156205 4903 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:44:56 crc kubenswrapper[4903]: I1202 23:44:56.156221 4903 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 02 23:44:56 crc kubenswrapper[4903]: I1202 23:44:56.156234 4903 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 02 23:44:56 crc kubenswrapper[4903]: I1202 23:44:56.156247 4903 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:44:56 crc kubenswrapper[4903]: I1202 23:44:56.156257 4903 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2a2b87ac-e673-475f-9ebc-d3387b0e26f2-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 23:44:56 crc kubenswrapper[4903]: I1202 23:44:56.370041 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fkks7" event={"ID":"2a2b87ac-e673-475f-9ebc-d3387b0e26f2","Type":"ContainerDied","Data":"cca8c44e5ebbd0017c1317b2f7b161c03b177970b3682804dfd87b6c02a52b38"} Dec 02 23:44:56 crc kubenswrapper[4903]: I1202 23:44:56.370102 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cca8c44e5ebbd0017c1317b2f7b161c03b177970b3682804dfd87b6c02a52b38" Dec 02 23:44:56 crc kubenswrapper[4903]: I1202 23:44:56.370175 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fkks7" Dec 02 23:44:59 crc kubenswrapper[4903]: I1202 23:44:59.001441 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ksl4b" Dec 02 23:44:59 crc kubenswrapper[4903]: I1202 23:44:59.054863 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ksl4b" Dec 02 23:44:59 crc kubenswrapper[4903]: I1202 23:44:59.255617 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ksl4b"] Dec 02 23:45:00 crc kubenswrapper[4903]: I1202 23:45:00.173904 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411985-xfnl4"] Dec 02 23:45:00 crc kubenswrapper[4903]: E1202 23:45:00.174299 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a2b87ac-e673-475f-9ebc-d3387b0e26f2" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 02 23:45:00 crc kubenswrapper[4903]: I1202 23:45:00.174325 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a2b87ac-e673-475f-9ebc-d3387b0e26f2" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 02 23:45:00 crc kubenswrapper[4903]: I1202 23:45:00.174520 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a2b87ac-e673-475f-9ebc-d3387b0e26f2" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 02 23:45:00 crc kubenswrapper[4903]: I1202 23:45:00.175169 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411985-xfnl4" Dec 02 23:45:00 crc kubenswrapper[4903]: I1202 23:45:00.177409 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 23:45:00 crc kubenswrapper[4903]: I1202 23:45:00.177743 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 23:45:00 crc kubenswrapper[4903]: I1202 23:45:00.187126 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411985-xfnl4"] Dec 02 23:45:00 crc kubenswrapper[4903]: I1202 23:45:00.350169 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/761dc476-edb0-4778-a1a5-6e81140737bc-secret-volume\") pod \"collect-profiles-29411985-xfnl4\" (UID: \"761dc476-edb0-4778-a1a5-6e81140737bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411985-xfnl4" Dec 02 23:45:00 crc kubenswrapper[4903]: I1202 23:45:00.351023 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/761dc476-edb0-4778-a1a5-6e81140737bc-config-volume\") pod \"collect-profiles-29411985-xfnl4\" (UID: \"761dc476-edb0-4778-a1a5-6e81140737bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411985-xfnl4" Dec 02 23:45:00 crc kubenswrapper[4903]: I1202 23:45:00.351996 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt6kh\" (UniqueName: \"kubernetes.io/projected/761dc476-edb0-4778-a1a5-6e81140737bc-kube-api-access-nt6kh\") pod \"collect-profiles-29411985-xfnl4\" (UID: 
\"761dc476-edb0-4778-a1a5-6e81140737bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411985-xfnl4" Dec 02 23:45:00 crc kubenswrapper[4903]: I1202 23:45:00.422588 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ksl4b" podUID="fa1d0c57-f336-41c0-b65d-c4d8928b734c" containerName="registry-server" containerID="cri-o://b4e4201c8db9fbfd9224522afa85b3a4cd804d4edabd9168a859fbba5a8eb502" gracePeriod=2 Dec 02 23:45:00 crc kubenswrapper[4903]: I1202 23:45:00.453812 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/761dc476-edb0-4778-a1a5-6e81140737bc-config-volume\") pod \"collect-profiles-29411985-xfnl4\" (UID: \"761dc476-edb0-4778-a1a5-6e81140737bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411985-xfnl4" Dec 02 23:45:00 crc kubenswrapper[4903]: I1202 23:45:00.453882 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt6kh\" (UniqueName: \"kubernetes.io/projected/761dc476-edb0-4778-a1a5-6e81140737bc-kube-api-access-nt6kh\") pod \"collect-profiles-29411985-xfnl4\" (UID: \"761dc476-edb0-4778-a1a5-6e81140737bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411985-xfnl4" Dec 02 23:45:00 crc kubenswrapper[4903]: I1202 23:45:00.453937 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/761dc476-edb0-4778-a1a5-6e81140737bc-secret-volume\") pod \"collect-profiles-29411985-xfnl4\" (UID: \"761dc476-edb0-4778-a1a5-6e81140737bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411985-xfnl4" Dec 02 23:45:00 crc kubenswrapper[4903]: I1202 23:45:00.454925 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/761dc476-edb0-4778-a1a5-6e81140737bc-config-volume\") pod \"collect-profiles-29411985-xfnl4\" (UID: \"761dc476-edb0-4778-a1a5-6e81140737bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411985-xfnl4" Dec 02 23:45:00 crc kubenswrapper[4903]: I1202 23:45:00.464210 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/761dc476-edb0-4778-a1a5-6e81140737bc-secret-volume\") pod \"collect-profiles-29411985-xfnl4\" (UID: \"761dc476-edb0-4778-a1a5-6e81140737bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411985-xfnl4" Dec 02 23:45:00 crc kubenswrapper[4903]: I1202 23:45:00.476964 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt6kh\" (UniqueName: \"kubernetes.io/projected/761dc476-edb0-4778-a1a5-6e81140737bc-kube-api-access-nt6kh\") pod \"collect-profiles-29411985-xfnl4\" (UID: \"761dc476-edb0-4778-a1a5-6e81140737bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411985-xfnl4" Dec 02 23:45:00 crc kubenswrapper[4903]: I1202 23:45:00.491879 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411985-xfnl4" Dec 02 23:45:00 crc kubenswrapper[4903]: I1202 23:45:00.927370 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ksl4b" Dec 02 23:45:01 crc kubenswrapper[4903]: I1202 23:45:01.011280 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411985-xfnl4"] Dec 02 23:45:01 crc kubenswrapper[4903]: I1202 23:45:01.066178 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa1d0c57-f336-41c0-b65d-c4d8928b734c-catalog-content\") pod \"fa1d0c57-f336-41c0-b65d-c4d8928b734c\" (UID: \"fa1d0c57-f336-41c0-b65d-c4d8928b734c\") " Dec 02 23:45:01 crc kubenswrapper[4903]: I1202 23:45:01.066322 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa1d0c57-f336-41c0-b65d-c4d8928b734c-utilities\") pod \"fa1d0c57-f336-41c0-b65d-c4d8928b734c\" (UID: \"fa1d0c57-f336-41c0-b65d-c4d8928b734c\") " Dec 02 23:45:01 crc kubenswrapper[4903]: I1202 23:45:01.066376 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnfbk\" (UniqueName: \"kubernetes.io/projected/fa1d0c57-f336-41c0-b65d-c4d8928b734c-kube-api-access-cnfbk\") pod \"fa1d0c57-f336-41c0-b65d-c4d8928b734c\" (UID: \"fa1d0c57-f336-41c0-b65d-c4d8928b734c\") " Dec 02 23:45:01 crc kubenswrapper[4903]: I1202 23:45:01.067315 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa1d0c57-f336-41c0-b65d-c4d8928b734c-utilities" (OuterVolumeSpecName: "utilities") pod "fa1d0c57-f336-41c0-b65d-c4d8928b734c" (UID: "fa1d0c57-f336-41c0-b65d-c4d8928b734c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:45:01 crc kubenswrapper[4903]: I1202 23:45:01.072352 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa1d0c57-f336-41c0-b65d-c4d8928b734c-kube-api-access-cnfbk" (OuterVolumeSpecName: "kube-api-access-cnfbk") pod "fa1d0c57-f336-41c0-b65d-c4d8928b734c" (UID: "fa1d0c57-f336-41c0-b65d-c4d8928b734c"). InnerVolumeSpecName "kube-api-access-cnfbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:45:01 crc kubenswrapper[4903]: I1202 23:45:01.169039 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa1d0c57-f336-41c0-b65d-c4d8928b734c-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:45:01 crc kubenswrapper[4903]: I1202 23:45:01.169259 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnfbk\" (UniqueName: \"kubernetes.io/projected/fa1d0c57-f336-41c0-b65d-c4d8928b734c-kube-api-access-cnfbk\") on node \"crc\" DevicePath \"\"" Dec 02 23:45:01 crc kubenswrapper[4903]: I1202 23:45:01.182075 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa1d0c57-f336-41c0-b65d-c4d8928b734c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa1d0c57-f336-41c0-b65d-c4d8928b734c" (UID: "fa1d0c57-f336-41c0-b65d-c4d8928b734c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:45:01 crc kubenswrapper[4903]: I1202 23:45:01.270640 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa1d0c57-f336-41c0-b65d-c4d8928b734c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:45:01 crc kubenswrapper[4903]: I1202 23:45:01.437280 4903 generic.go:334] "Generic (PLEG): container finished" podID="fa1d0c57-f336-41c0-b65d-c4d8928b734c" containerID="b4e4201c8db9fbfd9224522afa85b3a4cd804d4edabd9168a859fbba5a8eb502" exitCode=0 Dec 02 23:45:01 crc kubenswrapper[4903]: I1202 23:45:01.437338 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ksl4b" Dec 02 23:45:01 crc kubenswrapper[4903]: I1202 23:45:01.437354 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ksl4b" event={"ID":"fa1d0c57-f336-41c0-b65d-c4d8928b734c","Type":"ContainerDied","Data":"b4e4201c8db9fbfd9224522afa85b3a4cd804d4edabd9168a859fbba5a8eb502"} Dec 02 23:45:01 crc kubenswrapper[4903]: I1202 23:45:01.437385 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ksl4b" event={"ID":"fa1d0c57-f336-41c0-b65d-c4d8928b734c","Type":"ContainerDied","Data":"60ff4c3925de75c9504420b76c9e07b01f586ccb680dafac58fa7405d5e86288"} Dec 02 23:45:01 crc kubenswrapper[4903]: I1202 23:45:01.437408 4903 scope.go:117] "RemoveContainer" containerID="b4e4201c8db9fbfd9224522afa85b3a4cd804d4edabd9168a859fbba5a8eb502" Dec 02 23:45:01 crc kubenswrapper[4903]: I1202 23:45:01.441843 4903 generic.go:334] "Generic (PLEG): container finished" podID="761dc476-edb0-4778-a1a5-6e81140737bc" containerID="98b24543b82d47c74fd99124a4f9d62944172ee58335174566657f2357454b48" exitCode=0 Dec 02 23:45:01 crc kubenswrapper[4903]: I1202 23:45:01.441923 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411985-xfnl4" event={"ID":"761dc476-edb0-4778-a1a5-6e81140737bc","Type":"ContainerDied","Data":"98b24543b82d47c74fd99124a4f9d62944172ee58335174566657f2357454b48"} Dec 02 23:45:01 crc kubenswrapper[4903]: I1202 23:45:01.441967 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411985-xfnl4" event={"ID":"761dc476-edb0-4778-a1a5-6e81140737bc","Type":"ContainerStarted","Data":"6963f0675e5071dbfbf31ee3a28860b33053e150cbdc79bf15c402ce67a64253"} Dec 02 23:45:01 crc kubenswrapper[4903]: I1202 23:45:01.537301 4903 scope.go:117] "RemoveContainer" containerID="7c2667a8c5b5348ae0303d12f4a1d777bd11843321e9175470363f8016943b1e" Dec 02 23:45:01 crc kubenswrapper[4903]: I1202 23:45:01.549630 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ksl4b"] Dec 02 23:45:01 crc kubenswrapper[4903]: I1202 23:45:01.564596 4903 scope.go:117] "RemoveContainer" containerID="3110685857d4a64d5951a8f92bd25bb8b7ce3ff4d87c557a2e48c52cee1e0b53" Dec 02 23:45:01 crc kubenswrapper[4903]: I1202 23:45:01.566892 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ksl4b"] Dec 02 23:45:01 crc kubenswrapper[4903]: I1202 23:45:01.609883 4903 scope.go:117] "RemoveContainer" containerID="b4e4201c8db9fbfd9224522afa85b3a4cd804d4edabd9168a859fbba5a8eb502" Dec 02 23:45:01 crc kubenswrapper[4903]: E1202 23:45:01.611086 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"b4e4201c8db9fbfd9224522afa85b3a4cd804d4edabd9168a859fbba5a8eb502\": container with ID starting with b4e4201c8db9fbfd9224522afa85b3a4cd804d4edabd9168a859fbba5a8eb502 not found: ID does not exist" containerID="b4e4201c8db9fbfd9224522afa85b3a4cd804d4edabd9168a859fbba5a8eb502" Dec 02 23:45:01 crc kubenswrapper[4903]: I1202 23:45:01.611127 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4e4201c8db9fbfd9224522afa85b3a4cd804d4edabd9168a859fbba5a8eb502"} err="failed to get container status \"b4e4201c8db9fbfd9224522afa85b3a4cd804d4edabd9168a859fbba5a8eb502\": rpc error: code = NotFound desc = could not find container \"b4e4201c8db9fbfd9224522afa85b3a4cd804d4edabd9168a859fbba5a8eb502\": container with ID starting with b4e4201c8db9fbfd9224522afa85b3a4cd804d4edabd9168a859fbba5a8eb502 not found: ID does not exist" Dec 02 23:45:01 crc kubenswrapper[4903]: I1202 23:45:01.611165 4903 scope.go:117] "RemoveContainer" containerID="7c2667a8c5b5348ae0303d12f4a1d777bd11843321e9175470363f8016943b1e" Dec 02 23:45:01 crc kubenswrapper[4903]: E1202 23:45:01.611496 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c2667a8c5b5348ae0303d12f4a1d777bd11843321e9175470363f8016943b1e\": container with ID starting with 7c2667a8c5b5348ae0303d12f4a1d777bd11843321e9175470363f8016943b1e not found: ID does not exist" containerID="7c2667a8c5b5348ae0303d12f4a1d777bd11843321e9175470363f8016943b1e" Dec 02 23:45:01 crc kubenswrapper[4903]: I1202 23:45:01.611527 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c2667a8c5b5348ae0303d12f4a1d777bd11843321e9175470363f8016943b1e"} err="failed to get container status \"7c2667a8c5b5348ae0303d12f4a1d777bd11843321e9175470363f8016943b1e\": rpc error: code = NotFound desc = could not find container \"7c2667a8c5b5348ae0303d12f4a1d777bd11843321e9175470363f8016943b1e\": container with ID starting with 7c2667a8c5b5348ae0303d12f4a1d777bd11843321e9175470363f8016943b1e not found: ID does not exist" Dec 02 23:45:01 crc kubenswrapper[4903]: I1202 23:45:01.611545 4903 scope.go:117] "RemoveContainer" containerID="3110685857d4a64d5951a8f92bd25bb8b7ce3ff4d87c557a2e48c52cee1e0b53" Dec 02 23:45:01 crc kubenswrapper[4903]: E1202 23:45:01.611908 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3110685857d4a64d5951a8f92bd25bb8b7ce3ff4d87c557a2e48c52cee1e0b53\": container with ID starting with 3110685857d4a64d5951a8f92bd25bb8b7ce3ff4d87c557a2e48c52cee1e0b53 not found: ID does not exist" containerID="3110685857d4a64d5951a8f92bd25bb8b7ce3ff4d87c557a2e48c52cee1e0b53" Dec 02 23:45:01 crc kubenswrapper[4903]: I1202 23:45:01.611969 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3110685857d4a64d5951a8f92bd25bb8b7ce3ff4d87c557a2e48c52cee1e0b53"} err="failed to get container status \"3110685857d4a64d5951a8f92bd25bb8b7ce3ff4d87c557a2e48c52cee1e0b53\": rpc error: code = NotFound desc = could not find container \"3110685857d4a64d5951a8f92bd25bb8b7ce3ff4d87c557a2e48c52cee1e0b53\": container with ID starting with 3110685857d4a64d5951a8f92bd25bb8b7ce3ff4d87c557a2e48c52cee1e0b53 not found: ID does not exist" Dec 02 23:45:01 crc kubenswrapper[4903]: I1202 23:45:01.642038 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="fa1d0c57-f336-41c0-b65d-c4d8928b734c" path="/var/lib/kubelet/pods/fa1d0c57-f336-41c0-b65d-c4d8928b734c/volumes" Dec 02 23:45:02 crc kubenswrapper[4903]: I1202 23:45:02.785583 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411985-xfnl4" Dec 02 23:45:02 crc kubenswrapper[4903]: I1202 23:45:02.825823 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nt6kh\" (UniqueName: \"kubernetes.io/projected/761dc476-edb0-4778-a1a5-6e81140737bc-kube-api-access-nt6kh\") pod \"761dc476-edb0-4778-a1a5-6e81140737bc\" (UID: \"761dc476-edb0-4778-a1a5-6e81140737bc\") " Dec 02 23:45:02 crc kubenswrapper[4903]: I1202 23:45:02.825914 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/761dc476-edb0-4778-a1a5-6e81140737bc-secret-volume\") pod \"761dc476-edb0-4778-a1a5-6e81140737bc\" (UID: \"761dc476-edb0-4778-a1a5-6e81140737bc\") " Dec 02 23:45:02 crc kubenswrapper[4903]: I1202 23:45:02.825960 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/761dc476-edb0-4778-a1a5-6e81140737bc-config-volume\") pod \"761dc476-edb0-4778-a1a5-6e81140737bc\" (UID: \"761dc476-edb0-4778-a1a5-6e81140737bc\") " Dec 02 23:45:02 crc kubenswrapper[4903]: I1202 23:45:02.826683 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/761dc476-edb0-4778-a1a5-6e81140737bc-config-volume" (OuterVolumeSpecName: "config-volume") pod "761dc476-edb0-4778-a1a5-6e81140737bc" (UID: "761dc476-edb0-4778-a1a5-6e81140737bc"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:45:02 crc kubenswrapper[4903]: I1202 23:45:02.835905 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/761dc476-edb0-4778-a1a5-6e81140737bc-kube-api-access-nt6kh" (OuterVolumeSpecName: "kube-api-access-nt6kh") pod "761dc476-edb0-4778-a1a5-6e81140737bc" (UID: "761dc476-edb0-4778-a1a5-6e81140737bc"). InnerVolumeSpecName "kube-api-access-nt6kh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:45:02 crc kubenswrapper[4903]: I1202 23:45:02.836161 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/761dc476-edb0-4778-a1a5-6e81140737bc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "761dc476-edb0-4778-a1a5-6e81140737bc" (UID: "761dc476-edb0-4778-a1a5-6e81140737bc"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:45:02 crc kubenswrapper[4903]: I1202 23:45:02.928481 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nt6kh\" (UniqueName: \"kubernetes.io/projected/761dc476-edb0-4778-a1a5-6e81140737bc-kube-api-access-nt6kh\") on node \"crc\" DevicePath \"\"" Dec 02 23:45:02 crc kubenswrapper[4903]: I1202 23:45:02.928531 4903 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/761dc476-edb0-4778-a1a5-6e81140737bc-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 23:45:02 crc kubenswrapper[4903]: I1202 23:45:02.928548 4903 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/761dc476-edb0-4778-a1a5-6e81140737bc-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 23:45:03 crc kubenswrapper[4903]: I1202 23:45:03.475140 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411985-xfnl4" event={"ID":"761dc476-edb0-4778-a1a5-6e81140737bc","Type":"ContainerDied","Data":"6963f0675e5071dbfbf31ee3a28860b33053e150cbdc79bf15c402ce67a64253"} Dec 02 23:45:03 crc kubenswrapper[4903]: I1202 23:45:03.475648 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6963f0675e5071dbfbf31ee3a28860b33053e150cbdc79bf15c402ce67a64253" Dec 02 23:45:03 crc kubenswrapper[4903]: I1202 23:45:03.475233 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411985-xfnl4" Dec 02 23:45:03 crc kubenswrapper[4903]: I1202 23:45:03.906198 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411940-vlzpv"] Dec 02 23:45:03 crc kubenswrapper[4903]: I1202 23:45:03.922853 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411940-vlzpv"] Dec 02 23:45:05 crc kubenswrapper[4903]: I1202 23:45:05.627364 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd1b463e-15a8-425d-872e-d1f9683747c2" path="/var/lib/kubelet/pods/dd1b463e-15a8-425d-872e-d1f9683747c2/volumes" Dec 02 23:45:23 crc kubenswrapper[4903]: I1202 23:45:23.069953 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:45:23 crc kubenswrapper[4903]: I1202 23:45:23.070611 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:45:31 crc kubenswrapper[4903]: I1202 23:45:31.958696 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Dec 02 23:45:31 crc kubenswrapper[4903]: E1202 23:45:31.959669 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="761dc476-edb0-4778-a1a5-6e81140737bc" containerName="collect-profiles" Dec 02 23:45:31 crc kubenswrapper[4903]: I1202 23:45:31.959685 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="761dc476-edb0-4778-a1a5-6e81140737bc" 
containerName="collect-profiles" Dec 02 23:45:31 crc kubenswrapper[4903]: E1202 23:45:31.959709 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa1d0c57-f336-41c0-b65d-c4d8928b734c" containerName="extract-utilities" Dec 02 23:45:31 crc kubenswrapper[4903]: I1202 23:45:31.959718 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa1d0c57-f336-41c0-b65d-c4d8928b734c" containerName="extract-utilities" Dec 02 23:45:31 crc kubenswrapper[4903]: E1202 23:45:31.959737 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa1d0c57-f336-41c0-b65d-c4d8928b734c" containerName="extract-content" Dec 02 23:45:31 crc kubenswrapper[4903]: I1202 23:45:31.959745 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa1d0c57-f336-41c0-b65d-c4d8928b734c" containerName="extract-content" Dec 02 23:45:31 crc kubenswrapper[4903]: E1202 23:45:31.959760 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa1d0c57-f336-41c0-b65d-c4d8928b734c" containerName="registry-server" Dec 02 23:45:31 crc kubenswrapper[4903]: I1202 23:45:31.959768 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa1d0c57-f336-41c0-b65d-c4d8928b734c" containerName="registry-server" Dec 02 23:45:31 crc kubenswrapper[4903]: I1202 23:45:31.959997 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa1d0c57-f336-41c0-b65d-c4d8928b734c" containerName="registry-server" Dec 02 23:45:31 crc kubenswrapper[4903]: I1202 23:45:31.960039 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="761dc476-edb0-4778-a1a5-6e81140737bc" containerName="collect-profiles" Dec 02 23:45:31 crc kubenswrapper[4903]: I1202 23:45:31.961296 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Dec 02 23:45:31 crc kubenswrapper[4903]: I1202 23:45:31.963926 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Dec 02 23:45:31 crc kubenswrapper[4903]: I1202 23:45:31.981897 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.044978 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-nfs-0"] Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.048053 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.059601 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-0"] Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.061237 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-nfs-config-data" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.086344 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/864e9292-f08c-493e-8110-5ec88083fde2-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.086489 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/864e9292-f08c-493e-8110-5ec88083fde2-config-data-custom\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.086520 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/864e9292-f08c-493e-8110-5ec88083fde2-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.086611 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/864e9292-f08c-493e-8110-5ec88083fde2-run\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.086673 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/864e9292-f08c-493e-8110-5ec88083fde2-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.086702 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/864e9292-f08c-493e-8110-5ec88083fde2-dev\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.086726 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/864e9292-f08c-493e-8110-5ec88083fde2-sys\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.086748 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/864e9292-f08c-493e-8110-5ec88083fde2-etc-nvme\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.086775 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" 
(UniqueName: \"kubernetes.io/host-path/864e9292-f08c-493e-8110-5ec88083fde2-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.086809 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/864e9292-f08c-493e-8110-5ec88083fde2-lib-modules\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.086856 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/864e9292-f08c-493e-8110-5ec88083fde2-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.086881 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/864e9292-f08c-493e-8110-5ec88083fde2-config-data\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.086903 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/864e9292-f08c-493e-8110-5ec88083fde2-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.086933 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2bqd\" (UniqueName: \"kubernetes.io/projected/864e9292-f08c-493e-8110-5ec88083fde2-kube-api-access-k2bqd\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.086957 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/864e9292-f08c-493e-8110-5ec88083fde2-scripts\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.128113 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.130408 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.136328 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-nfs-2-config-data" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.153096 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.196525 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8jn8\" (UniqueName: \"kubernetes.io/projected/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-kube-api-access-v8jn8\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.196618 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/864e9292-f08c-493e-8110-5ec88083fde2-run\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.196790 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.196827 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/864e9292-f08c-493e-8110-5ec88083fde2-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.196889 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/864e9292-f08c-493e-8110-5ec88083fde2-dev\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.196915 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-run\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.196959 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/864e9292-f08c-493e-8110-5ec88083fde2-sys\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.196983 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/864e9292-f08c-493e-8110-5ec88083fde2-etc-nvme\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.197028 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/864e9292-f08c-493e-8110-5ec88083fde2-etc-iscsi\") pod 
\"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.197066 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.197124 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.197153 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.197191 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/864e9292-f08c-493e-8110-5ec88083fde2-lib-modules\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.197281 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.197363 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/864e9292-f08c-493e-8110-5ec88083fde2-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.197385 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/864e9292-f08c-493e-8110-5ec88083fde2-config-data\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.197419 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.197445 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/864e9292-f08c-493e-8110-5ec88083fde2-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.197496 4903 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2bqd\" (UniqueName: \"kubernetes.io/projected/864e9292-f08c-493e-8110-5ec88083fde2-kube-api-access-k2bqd\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.197531 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/864e9292-f08c-493e-8110-5ec88083fde2-scripts\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.197725 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.197994 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/864e9292-f08c-493e-8110-5ec88083fde2-dev\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.198050 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/864e9292-f08c-493e-8110-5ec88083fde2-sys\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.198109 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/864e9292-f08c-493e-8110-5ec88083fde2-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.198167 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/864e9292-f08c-493e-8110-5ec88083fde2-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.198193 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/864e9292-f08c-493e-8110-5ec88083fde2-lib-modules\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.198213 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/864e9292-f08c-493e-8110-5ec88083fde2-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.198313 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/864e9292-f08c-493e-8110-5ec88083fde2-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.198389 4903 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/864e9292-f08c-493e-8110-5ec88083fde2-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.198410 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/864e9292-f08c-493e-8110-5ec88083fde2-etc-nvme\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.198452 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/864e9292-f08c-493e-8110-5ec88083fde2-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.198708 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.198755 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-dev\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.198792 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/864e9292-f08c-493e-8110-5ec88083fde2-config-data-custom\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.198813 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/864e9292-f08c-493e-8110-5ec88083fde2-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.198916 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.198933 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-sys\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.198953 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: 
\"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.199011 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.199966 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/864e9292-f08c-493e-8110-5ec88083fde2-run\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.206872 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/864e9292-f08c-493e-8110-5ec88083fde2-config-data-custom\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.207245 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/864e9292-f08c-493e-8110-5ec88083fde2-config-data\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.210310 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/864e9292-f08c-493e-8110-5ec88083fde2-scripts\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.213200 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/864e9292-f08c-493e-8110-5ec88083fde2-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.222070 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2bqd\" (UniqueName: \"kubernetes.io/projected/864e9292-f08c-493e-8110-5ec88083fde2-kube-api-access-k2bqd\") pod \"cinder-backup-0\" (UID: \"864e9292-f08c-493e-8110-5ec88083fde2\") " pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.283049 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.301675 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.301766 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/ce4112ef-fcb6-4722-acd0-45bf409867a7-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.301815 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.301817 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.301914 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ce4112ef-fcb6-4722-acd0-45bf409867a7-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.301958 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-dev\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.301983 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce4112ef-fcb6-4722-acd0-45bf409867a7-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.302006 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ce4112ef-fcb6-4722-acd0-45bf409867a7-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.302042 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ce4112ef-fcb6-4722-acd0-45bf409867a7-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.302054 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dev\" (UniqueName: \"kubernetes.io/host-path/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-dev\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.302064 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce4112ef-fcb6-4722-acd0-45bf409867a7-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.302125 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce4112ef-fcb6-4722-acd0-45bf409867a7-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.302330 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.302406 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-sys\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.302427 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.302486 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce4112ef-fcb6-4722-acd0-45bf409867a7-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.302524 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-sys\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.302560 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.302561 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " 
pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.302626 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8jn8\" (UniqueName: \"kubernetes.io/projected/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-kube-api-access-v8jn8\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.302736 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.302782 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m7jf\" (UniqueName: \"kubernetes.io/projected/ce4112ef-fcb6-4722-acd0-45bf409867a7-kube-api-access-2m7jf\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.302810 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ce4112ef-fcb6-4722-acd0-45bf409867a7-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.302851 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-run\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.302892 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce4112ef-fcb6-4722-acd0-45bf409867a7-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.302930 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.302944 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.302978 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.302989 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.303008 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.303044 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.303157 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.303203 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ce4112ef-fcb6-4722-acd0-45bf409867a7-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.303211 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.303230 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ce4112ef-fcb6-4722-acd0-45bf409867a7-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.303160 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-run\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.303319 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/ce4112ef-fcb6-4722-acd0-45bf409867a7-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.303363 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc 
kubenswrapper[4903]: I1202 23:45:32.303420 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.303460 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ce4112ef-fcb6-4722-acd0-45bf409867a7-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.306528 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.307361 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.308522 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.309373 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.326457 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8jn8\" (UniqueName: \"kubernetes.io/projected/55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8-kube-api-access-v8jn8\") pod \"cinder-volume-nfs-0\" (UID: \"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8\") " pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.365440 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.414764 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce4112ef-fcb6-4722-acd0-45bf409867a7-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.415303 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ce4112ef-fcb6-4722-acd0-45bf409867a7-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.415330 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ce4112ef-fcb6-4722-acd0-45bf409867a7-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.415359 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/ce4112ef-fcb6-4722-acd0-45bf409867a7-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.415426 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ce4112ef-fcb6-4722-acd0-45bf409867a7-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.415473 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/ce4112ef-fcb6-4722-acd0-45bf409867a7-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.415511 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ce4112ef-fcb6-4722-acd0-45bf409867a7-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.415543 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce4112ef-fcb6-4722-acd0-45bf409867a7-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.415560 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ce4112ef-fcb6-4722-acd0-45bf409867a7-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.415582 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/ce4112ef-fcb6-4722-acd0-45bf409867a7-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.415641 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ce4112ef-fcb6-4722-acd0-45bf409867a7-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.415676 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce4112ef-fcb6-4722-acd0-45bf409867a7-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.415728 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce4112ef-fcb6-4722-acd0-45bf409867a7-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.415779 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m7jf\" (UniqueName: \"kubernetes.io/projected/ce4112ef-fcb6-4722-acd0-45bf409867a7-kube-api-access-2m7jf\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.415814 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ce4112ef-fcb6-4722-acd0-45bf409867a7-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.415966 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ce4112ef-fcb6-4722-acd0-45bf409867a7-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.423701 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ce4112ef-fcb6-4722-acd0-45bf409867a7-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.423764 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ce4112ef-fcb6-4722-acd0-45bf409867a7-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.423834 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ce4112ef-fcb6-4722-acd0-45bf409867a7-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 
23:45:32.423878 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/ce4112ef-fcb6-4722-acd0-45bf409867a7-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.423953 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ce4112ef-fcb6-4722-acd0-45bf409867a7-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.424024 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce4112ef-fcb6-4722-acd0-45bf409867a7-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.424152 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ce4112ef-fcb6-4722-acd0-45bf409867a7-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.424295 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/ce4112ef-fcb6-4722-acd0-45bf409867a7-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.424575 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ce4112ef-fcb6-4722-acd0-45bf409867a7-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.426323 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce4112ef-fcb6-4722-acd0-45bf409867a7-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.428135 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce4112ef-fcb6-4722-acd0-45bf409867a7-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.433373 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce4112ef-fcb6-4722-acd0-45bf409867a7-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.433933 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce4112ef-fcb6-4722-acd0-45bf409867a7-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " 
pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.439215 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m7jf\" (UniqueName: \"kubernetes.io/projected/ce4112ef-fcb6-4722-acd0-45bf409867a7-kube-api-access-2m7jf\") pod \"cinder-volume-nfs-2-0\" (UID: \"ce4112ef-fcb6-4722-acd0-45bf409867a7\") " pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.452699 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:32 crc kubenswrapper[4903]: I1202 23:45:32.975999 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 02 23:45:33 crc kubenswrapper[4903]: I1202 23:45:33.058299 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Dec 02 23:45:33 crc kubenswrapper[4903]: I1202 23:45:33.849773 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-0"] Dec 02 23:45:33 crc kubenswrapper[4903]: W1202 23:45:33.868627 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55f68daa_fcb3_4c4f_8ae4_84af8ac8b5a8.slice/crio-b3be6e7f11e1d39085e36b11c0ff43ef77ef52a1dfea6e5bc71643c2782851b8 WatchSource:0}: Error finding container b3be6e7f11e1d39085e36b11c0ff43ef77ef52a1dfea6e5bc71643c2782851b8: Status 404 returned error can't find the container with id b3be6e7f11e1d39085e36b11c0ff43ef77ef52a1dfea6e5bc71643c2782851b8 Dec 02 23:45:33 crc kubenswrapper[4903]: I1202 23:45:33.947548 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"ce4112ef-fcb6-4722-acd0-45bf409867a7","Type":"ContainerStarted","Data":"52dd4282e9e1625f6d6811d9f382fe4b3831358907271f9db103970db99fc134"} Dec 02 23:45:33 crc kubenswrapper[4903]: I1202 23:45:33.947755 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"ce4112ef-fcb6-4722-acd0-45bf409867a7","Type":"ContainerStarted","Data":"9008a5f4ce79d0e7f8a1cbddbfa1b45d1527b1a0591148bf26546024f80b9c19"} Dec 02 23:45:33 crc kubenswrapper[4903]: I1202 23:45:33.950559 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"864e9292-f08c-493e-8110-5ec88083fde2","Type":"ContainerStarted","Data":"b1a27a3317886283ad5387d51d0624baedf0212d152b2f39d05b8212ddaa1969"} Dec 02 23:45:33 crc kubenswrapper[4903]: I1202 23:45:33.950603 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"864e9292-f08c-493e-8110-5ec88083fde2","Type":"ContainerStarted","Data":"7664630a40e277725280782e36d5f38607b84ca70419b8bead66cb6010fadc09"} Dec 02 23:45:33 crc kubenswrapper[4903]: I1202 23:45:33.952824 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8","Type":"ContainerStarted","Data":"b3be6e7f11e1d39085e36b11c0ff43ef77ef52a1dfea6e5bc71643c2782851b8"} Dec 02 23:45:34 crc kubenswrapper[4903]: I1202 23:45:34.963628 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"ce4112ef-fcb6-4722-acd0-45bf409867a7","Type":"ContainerStarted","Data":"baf0230b11fcfb761aa7015f28360d73022a7fbf38d772d4fa94463b205e9d20"} Dec 02 23:45:34 crc kubenswrapper[4903]: I1202 23:45:34.965732 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-backup-0" event={"ID":"864e9292-f08c-493e-8110-5ec88083fde2","Type":"ContainerStarted","Data":"ab06987a3291bf91b192f79f14f72a4b74c8c6129c0b550fe7a583d0cabac47c"} Dec 02 23:45:34 crc kubenswrapper[4903]: I1202 23:45:34.967369 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8","Type":"ContainerStarted","Data":"ea0faf50acf319434baafefc08fd0cfc7d44450fc1702c667b8e285792790907"} Dec 02 23:45:34 crc kubenswrapper[4903]: I1202 23:45:34.967404 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8","Type":"ContainerStarted","Data":"a4c0789b679cc74e5e6ad3d840a26d34437f28561c739f3cc27ece20cd2226af"} Dec 02 23:45:34 crc kubenswrapper[4903]: I1202 23:45:34.995758 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-nfs-2-0" podStartSLOduration=2.746749273 podStartE2EDuration="2.995737481s" podCreationTimestamp="2025-12-02 23:45:32 +0000 UTC" firstStartedPulling="2025-12-02 23:45:33.119298514 +0000 UTC m=+2871.827852807" lastFinishedPulling="2025-12-02 23:45:33.368286722 +0000 UTC m=+2872.076841015" observedRunningTime="2025-12-02 23:45:34.988039685 +0000 UTC m=+2873.696593968" watchObservedRunningTime="2025-12-02 23:45:34.995737481 +0000 UTC m=+2873.704291764" Dec 02 23:45:35 crc kubenswrapper[4903]: I1202 23:45:35.021518 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=3.819632866 podStartE2EDuration="4.021499224s" podCreationTimestamp="2025-12-02 23:45:31 +0000 UTC" firstStartedPulling="2025-12-02 23:45:32.94778084 +0000 UTC m=+2871.656335123" lastFinishedPulling="2025-12-02 23:45:33.149647198 +0000 UTC m=+2871.858201481" observedRunningTime="2025-12-02 23:45:35.01762393 +0000 UTC m=+2873.726178243" watchObservedRunningTime="2025-12-02 23:45:35.021499224 +0000 UTC m=+2873.730053507" Dec 02 23:45:35 crc kubenswrapper[4903]: I1202 23:45:35.043799 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-nfs-0" podStartSLOduration=3.043781402 podStartE2EDuration="3.043781402s" podCreationTimestamp="2025-12-02 23:45:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:45:35.037609793 +0000 UTC m=+2873.746164076" watchObservedRunningTime="2025-12-02 23:45:35.043781402 +0000 UTC m=+2873.752335685" Dec 02 23:45:37 crc kubenswrapper[4903]: I1202 23:45:37.283991 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Dec 02 23:45:37 crc kubenswrapper[4903]: I1202 23:45:37.366361 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:37 crc kubenswrapper[4903]: I1202 23:45:37.453834 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:42 crc kubenswrapper[4903]: I1202 23:45:42.465107 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Dec 02 23:45:42 crc kubenswrapper[4903]: I1202 23:45:42.601100 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-nfs-0" Dec 02 23:45:42 crc kubenswrapper[4903]: I1202 23:45:42.701545 4903 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/cinder-volume-nfs-2-0" Dec 02 23:45:53 crc kubenswrapper[4903]: I1202 23:45:53.069819 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:45:53 crc kubenswrapper[4903]: I1202 23:45:53.070420 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:46:04 crc kubenswrapper[4903]: I1202 23:46:04.802141 4903 scope.go:117] "RemoveContainer" containerID="6f081c20905982f479963eebae3e8a9cad72aace1dd769dfa88e3bf606f82a7d" Dec 02 23:46:23 crc kubenswrapper[4903]: I1202 23:46:23.069771 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:46:23 crc kubenswrapper[4903]: I1202 23:46:23.070483 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:46:23 crc kubenswrapper[4903]: I1202 23:46:23.070549 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" Dec 02 23:46:23 crc kubenswrapper[4903]: I1202 23:46:23.071730 4903 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9f37abbef27e9d2f37fb101505e46a6c8bdcf9247409bdb0ed2ae35a6f381181"} pod="openshift-machine-config-operator/machine-config-daemon-snl4q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 23:46:23 crc kubenswrapper[4903]: I1202 23:46:23.071817 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" containerID="cri-o://9f37abbef27e9d2f37fb101505e46a6c8bdcf9247409bdb0ed2ae35a6f381181" gracePeriod=600 Dec 02 23:46:23 crc kubenswrapper[4903]: E1202 23:46:23.198976 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:46:23 crc kubenswrapper[4903]: I1202 23:46:23.569067 4903 generic.go:334] "Generic (PLEG): container finished" podID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerID="9f37abbef27e9d2f37fb101505e46a6c8bdcf9247409bdb0ed2ae35a6f381181" exitCode=0 Dec 02 23:46:23 
crc kubenswrapper[4903]: I1202 23:46:23.569151 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerDied","Data":"9f37abbef27e9d2f37fb101505e46a6c8bdcf9247409bdb0ed2ae35a6f381181"} Dec 02 23:46:23 crc kubenswrapper[4903]: I1202 23:46:23.569212 4903 scope.go:117] "RemoveContainer" containerID="334e7f1c386c5773dacb43719b040835a12d8d082105a64c0872016257ba6a7f" Dec 02 23:46:23 crc kubenswrapper[4903]: I1202 23:46:23.570383 4903 scope.go:117] "RemoveContainer" containerID="9f37abbef27e9d2f37fb101505e46a6c8bdcf9247409bdb0ed2ae35a6f381181" Dec 02 23:46:23 crc kubenswrapper[4903]: E1202 23:46:23.571081 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:46:35 crc kubenswrapper[4903]: I1202 23:46:35.449303 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 23:46:35 crc kubenswrapper[4903]: I1202 23:46:35.451859 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545" containerName="prometheus" containerID="cri-o://e5762bb66e03348701d2a45ed223e84ee7c6590aa82a31ad97eebe991e8fd2fe" gracePeriod=600 Dec 02 23:46:35 crc kubenswrapper[4903]: I1202 23:46:35.451943 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545" containerName="config-reloader" containerID="cri-o://8358bd55dd83527eb8684510873e5c25372509a4b34c6580e1ae13a32f83d446" gracePeriod=600 Dec 02 23:46:35 crc kubenswrapper[4903]: I1202 23:46:35.451943 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545" containerName="thanos-sidecar" containerID="cri-o://39be476150955e285e007798c55b0bbe2128a59297efd1e8a254d8e30ceb94ef" gracePeriod=600 Dec 02 23:46:35 crc kubenswrapper[4903]: I1202 23:46:35.687015 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.133:9090/-/ready\": dial tcp 10.217.0.133:9090: connect: connection refused" Dec 02 23:46:35 crc kubenswrapper[4903]: I1202 23:46:35.720236 4903 generic.go:334] "Generic (PLEG): container finished" podID="2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545" containerID="39be476150955e285e007798c55b0bbe2128a59297efd1e8a254d8e30ceb94ef" exitCode=0 Dec 02 23:46:35 crc kubenswrapper[4903]: I1202 23:46:35.720272 4903 generic.go:334] "Generic (PLEG): container finished" podID="2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545" containerID="e5762bb66e03348701d2a45ed223e84ee7c6590aa82a31ad97eebe991e8fd2fe" exitCode=0 Dec 02 23:46:35 crc kubenswrapper[4903]: I1202 23:46:35.720300 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545","Type":"ContainerDied","Data":"39be476150955e285e007798c55b0bbe2128a59297efd1e8a254d8e30ceb94ef"} Dec 02 23:46:35 crc kubenswrapper[4903]: I1202 23:46:35.720329 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545","Type":"ContainerDied","Data":"e5762bb66e03348701d2a45ed223e84ee7c6590aa82a31ad97eebe991e8fd2fe"} Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.542019 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.613713 4903 scope.go:117] "RemoveContainer" containerID="9f37abbef27e9d2f37fb101505e46a6c8bdcf9247409bdb0ed2ae35a6f381181" Dec 02 23:46:36 crc kubenswrapper[4903]: E1202 23:46:36.613927 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.713135 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-config\") pod \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.713178 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.713199 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-config-out\") pod \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.713242 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jf5qq\" (UniqueName: \"kubernetes.io/projected/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-kube-api-access-jf5qq\") pod \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.713281 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-prometheus-metric-storage-rulefiles-0\") pod \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.713322 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-tls-assets\") pod \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\" (UID: 
\"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.713806 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-791bde7a-5990-4917-baab-d6fca61a913e\") pod \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.713954 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-web-config\") pod \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.713973 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-secret-combined-ca-bundle\") pod \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.714026 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-thanos-prometheus-http-client-file\") pod \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.714057 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\" (UID: \"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545\") " Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.714108 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545" (UID: "2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.714593 4903 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.721261 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-config-out" (OuterVolumeSpecName: "config-out") pod "2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545" (UID: "2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.721427 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-kube-api-access-jf5qq" (OuterVolumeSpecName: "kube-api-access-jf5qq") pod "2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545" (UID: "2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545"). InnerVolumeSpecName "kube-api-access-jf5qq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.722336 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545" (UID: "2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.722784 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545" (UID: "2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.722908 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-config" (OuterVolumeSpecName: "config") pod "2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545" (UID: "2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.723547 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545" (UID: "2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.730587 4903 generic.go:334] "Generic (PLEG): container finished" podID="2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545" containerID="8358bd55dd83527eb8684510873e5c25372509a4b34c6580e1ae13a32f83d446" exitCode=0 Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.730774 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545","Type":"ContainerDied","Data":"8358bd55dd83527eb8684510873e5c25372509a4b34c6580e1ae13a32f83d446"} Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.730933 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545","Type":"ContainerDied","Data":"5805e491e5e822df3dbaafd155c8ca1195f35a6c04a0983274b30fcf7ab033a0"} Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.730964 4903 scope.go:117] "RemoveContainer" containerID="39be476150955e285e007798c55b0bbe2128a59297efd1e8a254d8e30ceb94ef" Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.730873 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.737761 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545" (UID: "2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.737806 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545" (UID: "2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545"). InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.744202 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-791bde7a-5990-4917-baab-d6fca61a913e" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545" (UID: "2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545"). InnerVolumeSpecName "pvc-791bde7a-5990-4917-baab-d6fca61a913e". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.817442 4903 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-791bde7a-5990-4917-baab-d6fca61a913e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-791bde7a-5990-4917-baab-d6fca61a913e\") on node \"crc\" " Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.817480 4903 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.817491 4903 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.817502 4903 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.817514 4903 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.817524 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.817533 4903 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-config-out\") on node \"crc\" DevicePath \"\"" Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.817542 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jf5qq\" (UniqueName: \"kubernetes.io/projected/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-kube-api-access-jf5qq\") on node \"crc\" DevicePath \"\"" Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.817552 4903 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-tls-assets\") on node \"crc\" DevicePath \"\"" Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.830939 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-web-config" (OuterVolumeSpecName: "web-config") pod "2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545" (UID: "2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.851167 4903 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.855710 4903 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-791bde7a-5990-4917-baab-d6fca61a913e" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-791bde7a-5990-4917-baab-d6fca61a913e") on node "crc" Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.919718 4903 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545-web-config\") on node \"crc\" DevicePath \"\"" Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.919747 4903 reconciler_common.go:293] "Volume detached for volume \"pvc-791bde7a-5990-4917-baab-d6fca61a913e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-791bde7a-5990-4917-baab-d6fca61a913e\") on node \"crc\" DevicePath \"\"" Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.924380 4903 scope.go:117] "RemoveContainer" containerID="8358bd55dd83527eb8684510873e5c25372509a4b34c6580e1ae13a32f83d446" Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.945213 4903 scope.go:117] "RemoveContainer" containerID="e5762bb66e03348701d2a45ed223e84ee7c6590aa82a31ad97eebe991e8fd2fe" Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.963222 4903 scope.go:117] "RemoveContainer" containerID="b276be0eecc997bb7261d664fd7a40971a28064d93bb92a533e28680e43fb6ba" Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.984327 4903 scope.go:117] "RemoveContainer" containerID="39be476150955e285e007798c55b0bbe2128a59297efd1e8a254d8e30ceb94ef" Dec 02 23:46:36 crc kubenswrapper[4903]: E1202 23:46:36.984806 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39be476150955e285e007798c55b0bbe2128a59297efd1e8a254d8e30ceb94ef\": container with ID starting with 39be476150955e285e007798c55b0bbe2128a59297efd1e8a254d8e30ceb94ef not found: ID does not exist" containerID="39be476150955e285e007798c55b0bbe2128a59297efd1e8a254d8e30ceb94ef" Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.984849 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39be476150955e285e007798c55b0bbe2128a59297efd1e8a254d8e30ceb94ef"} err="failed to get container status \"39be476150955e285e007798c55b0bbe2128a59297efd1e8a254d8e30ceb94ef\": rpc error: code = NotFound desc = could not find container \"39be476150955e285e007798c55b0bbe2128a59297efd1e8a254d8e30ceb94ef\": container with ID starting with 39be476150955e285e007798c55b0bbe2128a59297efd1e8a254d8e30ceb94ef not found: ID does not exist" Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.984880 4903 scope.go:117] "RemoveContainer" containerID="8358bd55dd83527eb8684510873e5c25372509a4b34c6580e1ae13a32f83d446" Dec 02 23:46:36 crc kubenswrapper[4903]: E1202 23:46:36.985119 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8358bd55dd83527eb8684510873e5c25372509a4b34c6580e1ae13a32f83d446\": container with ID starting with 8358bd55dd83527eb8684510873e5c25372509a4b34c6580e1ae13a32f83d446 not found: ID does not exist" containerID="8358bd55dd83527eb8684510873e5c25372509a4b34c6580e1ae13a32f83d446" Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.985138 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8358bd55dd83527eb8684510873e5c25372509a4b34c6580e1ae13a32f83d446"} err="failed to get container status 
\"8358bd55dd83527eb8684510873e5c25372509a4b34c6580e1ae13a32f83d446\": rpc error: code = NotFound desc = could not find container \"8358bd55dd83527eb8684510873e5c25372509a4b34c6580e1ae13a32f83d446\": container with ID starting with 8358bd55dd83527eb8684510873e5c25372509a4b34c6580e1ae13a32f83d446 not found: ID does not exist" Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.985150 4903 scope.go:117] "RemoveContainer" containerID="e5762bb66e03348701d2a45ed223e84ee7c6590aa82a31ad97eebe991e8fd2fe" Dec 02 23:46:36 crc kubenswrapper[4903]: E1202 23:46:36.985331 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5762bb66e03348701d2a45ed223e84ee7c6590aa82a31ad97eebe991e8fd2fe\": container with ID starting with e5762bb66e03348701d2a45ed223e84ee7c6590aa82a31ad97eebe991e8fd2fe not found: ID does not exist" containerID="e5762bb66e03348701d2a45ed223e84ee7c6590aa82a31ad97eebe991e8fd2fe" Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.985384 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5762bb66e03348701d2a45ed223e84ee7c6590aa82a31ad97eebe991e8fd2fe"} err="failed to get container status \"e5762bb66e03348701d2a45ed223e84ee7c6590aa82a31ad97eebe991e8fd2fe\": rpc error: code = NotFound desc = could not find container \"e5762bb66e03348701d2a45ed223e84ee7c6590aa82a31ad97eebe991e8fd2fe\": container with ID starting with e5762bb66e03348701d2a45ed223e84ee7c6590aa82a31ad97eebe991e8fd2fe not found: ID does not exist" Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.985397 4903 scope.go:117] "RemoveContainer" containerID="b276be0eecc997bb7261d664fd7a40971a28064d93bb92a533e28680e43fb6ba" Dec 02 23:46:36 crc kubenswrapper[4903]: E1202 23:46:36.985627 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b276be0eecc997bb7261d664fd7a40971a28064d93bb92a533e28680e43fb6ba\": container with ID starting with b276be0eecc997bb7261d664fd7a40971a28064d93bb92a533e28680e43fb6ba not found: ID does not exist" containerID="b276be0eecc997bb7261d664fd7a40971a28064d93bb92a533e28680e43fb6ba" Dec 02 23:46:36 crc kubenswrapper[4903]: I1202 23:46:36.985667 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b276be0eecc997bb7261d664fd7a40971a28064d93bb92a533e28680e43fb6ba"} err="failed to get container status \"b276be0eecc997bb7261d664fd7a40971a28064d93bb92a533e28680e43fb6ba\": rpc error: code = NotFound desc = could not find container \"b276be0eecc997bb7261d664fd7a40971a28064d93bb92a533e28680e43fb6ba\": container with ID starting with b276be0eecc997bb7261d664fd7a40971a28064d93bb92a533e28680e43fb6ba not found: ID does not exist" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.066665 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.077403 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.106479 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 23:46:37 crc kubenswrapper[4903]: E1202 23:46:37.106959 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545" containerName="prometheus" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.106978 4903 
state_mem.go:107] "Deleted CPUSet assignment" podUID="2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545" containerName="prometheus" Dec 02 23:46:37 crc kubenswrapper[4903]: E1202 23:46:37.106994 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545" containerName="config-reloader" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.107000 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545" containerName="config-reloader" Dec 02 23:46:37 crc kubenswrapper[4903]: E1202 23:46:37.107028 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545" containerName="init-config-reloader" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.107035 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545" containerName="init-config-reloader" Dec 02 23:46:37 crc kubenswrapper[4903]: E1202 23:46:37.107057 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545" containerName="thanos-sidecar" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.107065 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545" containerName="thanos-sidecar" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.107269 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545" containerName="prometheus" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.107284 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545" containerName="thanos-sidecar" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.107300 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545" containerName="config-reloader" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.109087 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.110938 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.111106 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.111410 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.115998 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-5p4fs" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.116145 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.122067 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.123843 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.248324 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61ac9e39-e707-4da2-881e-d9412cf9c136-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"61ac9e39-e707-4da2-881e-d9412cf9c136\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.248385 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/61ac9e39-e707-4da2-881e-d9412cf9c136-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"61ac9e39-e707-4da2-881e-d9412cf9c136\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.248408 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/61ac9e39-e707-4da2-881e-d9412cf9c136-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"61ac9e39-e707-4da2-881e-d9412cf9c136\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.248494 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/61ac9e39-e707-4da2-881e-d9412cf9c136-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"61ac9e39-e707-4da2-881e-d9412cf9c136\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.248512 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/61ac9e39-e707-4da2-881e-d9412cf9c136-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod 
\"prometheus-metric-storage-0\" (UID: \"61ac9e39-e707-4da2-881e-d9412cf9c136\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.248541 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/61ac9e39-e707-4da2-881e-d9412cf9c136-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"61ac9e39-e707-4da2-881e-d9412cf9c136\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.248583 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gttb\" (UniqueName: \"kubernetes.io/projected/61ac9e39-e707-4da2-881e-d9412cf9c136-kube-api-access-7gttb\") pod \"prometheus-metric-storage-0\" (UID: \"61ac9e39-e707-4da2-881e-d9412cf9c136\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.248612 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-791bde7a-5990-4917-baab-d6fca61a913e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-791bde7a-5990-4917-baab-d6fca61a913e\") pod \"prometheus-metric-storage-0\" (UID: \"61ac9e39-e707-4da2-881e-d9412cf9c136\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.248640 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/61ac9e39-e707-4da2-881e-d9412cf9c136-config\") pod \"prometheus-metric-storage-0\" (UID: \"61ac9e39-e707-4da2-881e-d9412cf9c136\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.248697 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/61ac9e39-e707-4da2-881e-d9412cf9c136-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"61ac9e39-e707-4da2-881e-d9412cf9c136\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.248731 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/61ac9e39-e707-4da2-881e-d9412cf9c136-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"61ac9e39-e707-4da2-881e-d9412cf9c136\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.351006 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/61ac9e39-e707-4da2-881e-d9412cf9c136-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"61ac9e39-e707-4da2-881e-d9412cf9c136\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.351050 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/61ac9e39-e707-4da2-881e-d9412cf9c136-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"61ac9e39-e707-4da2-881e-d9412cf9c136\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.351089 4903 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/61ac9e39-e707-4da2-881e-d9412cf9c136-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"61ac9e39-e707-4da2-881e-d9412cf9c136\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.351141 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gttb\" (UniqueName: \"kubernetes.io/projected/61ac9e39-e707-4da2-881e-d9412cf9c136-kube-api-access-7gttb\") pod \"prometheus-metric-storage-0\" (UID: \"61ac9e39-e707-4da2-881e-d9412cf9c136\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.351181 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-791bde7a-5990-4917-baab-d6fca61a913e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-791bde7a-5990-4917-baab-d6fca61a913e\") pod \"prometheus-metric-storage-0\" (UID: \"61ac9e39-e707-4da2-881e-d9412cf9c136\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.351211 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/61ac9e39-e707-4da2-881e-d9412cf9c136-config\") pod \"prometheus-metric-storage-0\" (UID: \"61ac9e39-e707-4da2-881e-d9412cf9c136\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.351234 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/61ac9e39-e707-4da2-881e-d9412cf9c136-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"61ac9e39-e707-4da2-881e-d9412cf9c136\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.351251 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/61ac9e39-e707-4da2-881e-d9412cf9c136-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"61ac9e39-e707-4da2-881e-d9412cf9c136\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.351275 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61ac9e39-e707-4da2-881e-d9412cf9c136-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"61ac9e39-e707-4da2-881e-d9412cf9c136\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.351297 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/61ac9e39-e707-4da2-881e-d9412cf9c136-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"61ac9e39-e707-4da2-881e-d9412cf9c136\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.351316 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/61ac9e39-e707-4da2-881e-d9412cf9c136-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"61ac9e39-e707-4da2-881e-d9412cf9c136\") " 
pod="openstack/prometheus-metric-storage-0" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.351797 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/61ac9e39-e707-4da2-881e-d9412cf9c136-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"61ac9e39-e707-4da2-881e-d9412cf9c136\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.355977 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/61ac9e39-e707-4da2-881e-d9412cf9c136-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"61ac9e39-e707-4da2-881e-d9412cf9c136\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.356400 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/61ac9e39-e707-4da2-881e-d9412cf9c136-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"61ac9e39-e707-4da2-881e-d9412cf9c136\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.356794 4903 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.356832 4903 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-791bde7a-5990-4917-baab-d6fca61a913e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-791bde7a-5990-4917-baab-d6fca61a913e\") pod \"prometheus-metric-storage-0\" (UID: \"61ac9e39-e707-4da2-881e-d9412cf9c136\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/389fc60ed9b89584c09faa75d07c0667b0d3839786e48ead64fa3957a7dc98cb/globalmount\"" pod="openstack/prometheus-metric-storage-0" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.356882 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61ac9e39-e707-4da2-881e-d9412cf9c136-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"61ac9e39-e707-4da2-881e-d9412cf9c136\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.356976 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/61ac9e39-e707-4da2-881e-d9412cf9c136-config\") pod \"prometheus-metric-storage-0\" (UID: \"61ac9e39-e707-4da2-881e-d9412cf9c136\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.357982 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/61ac9e39-e707-4da2-881e-d9412cf9c136-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"61ac9e39-e707-4da2-881e-d9412cf9c136\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.360504 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/61ac9e39-e707-4da2-881e-d9412cf9c136-thanos-prometheus-http-client-file\") pod 
\"prometheus-metric-storage-0\" (UID: \"61ac9e39-e707-4da2-881e-d9412cf9c136\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.361292 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/61ac9e39-e707-4da2-881e-d9412cf9c136-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"61ac9e39-e707-4da2-881e-d9412cf9c136\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.361794 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/61ac9e39-e707-4da2-881e-d9412cf9c136-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"61ac9e39-e707-4da2-881e-d9412cf9c136\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.373261 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gttb\" (UniqueName: \"kubernetes.io/projected/61ac9e39-e707-4da2-881e-d9412cf9c136-kube-api-access-7gttb\") pod \"prometheus-metric-storage-0\" (UID: \"61ac9e39-e707-4da2-881e-d9412cf9c136\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.398548 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-791bde7a-5990-4917-baab-d6fca61a913e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-791bde7a-5990-4917-baab-d6fca61a913e\") pod \"prometheus-metric-storage-0\" (UID: \"61ac9e39-e707-4da2-881e-d9412cf9c136\") " pod="openstack/prometheus-metric-storage-0" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.463967 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.632202 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545" path="/var/lib/kubelet/pods/2cbfa4bd-d10d-4cd7-9208-fe8e1af2b545/volumes" Dec 02 23:46:37 crc kubenswrapper[4903]: I1202 23:46:37.973089 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 23:46:37 crc kubenswrapper[4903]: W1202 23:46:37.977555 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61ac9e39_e707_4da2_881e_d9412cf9c136.slice/crio-65981214c6933cd6a3d747f35a7ecfe37226a46a0e0fde0435304484cf1f4092 WatchSource:0}: Error finding container 65981214c6933cd6a3d747f35a7ecfe37226a46a0e0fde0435304484cf1f4092: Status 404 returned error can't find the container with id 65981214c6933cd6a3d747f35a7ecfe37226a46a0e0fde0435304484cf1f4092 Dec 02 23:46:38 crc kubenswrapper[4903]: I1202 23:46:38.760187 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"61ac9e39-e707-4da2-881e-d9412cf9c136","Type":"ContainerStarted","Data":"65981214c6933cd6a3d747f35a7ecfe37226a46a0e0fde0435304484cf1f4092"} Dec 02 23:46:43 crc kubenswrapper[4903]: I1202 23:46:43.823543 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"61ac9e39-e707-4da2-881e-d9412cf9c136","Type":"ContainerStarted","Data":"d145fed7e5a4246558fd2fae40fc64230c2db0c9085bf408d0bdc86e38af8b9e"} Dec 02 23:46:51 crc kubenswrapper[4903]: I1202 23:46:51.627460 4903 scope.go:117] "RemoveContainer" containerID="9f37abbef27e9d2f37fb101505e46a6c8bdcf9247409bdb0ed2ae35a6f381181" Dec 02 23:46:51 crc kubenswrapper[4903]: E1202 23:46:51.629399 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:46:51 crc kubenswrapper[4903]: I1202 23:46:51.931320 4903 generic.go:334] "Generic (PLEG): container finished" podID="61ac9e39-e707-4da2-881e-d9412cf9c136" containerID="d145fed7e5a4246558fd2fae40fc64230c2db0c9085bf408d0bdc86e38af8b9e" exitCode=0 Dec 02 23:46:51 crc kubenswrapper[4903]: I1202 23:46:51.931423 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"61ac9e39-e707-4da2-881e-d9412cf9c136","Type":"ContainerDied","Data":"d145fed7e5a4246558fd2fae40fc64230c2db0c9085bf408d0bdc86e38af8b9e"} Dec 02 23:46:52 crc kubenswrapper[4903]: I1202 23:46:52.945915 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"61ac9e39-e707-4da2-881e-d9412cf9c136","Type":"ContainerStarted","Data":"3c12ad385497c62df05ac9a4e4c37a909cd41aeefba1675fa6aa0c1024513d0f"} Dec 02 23:46:56 crc kubenswrapper[4903]: I1202 23:46:56.990717 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"61ac9e39-e707-4da2-881e-d9412cf9c136","Type":"ContainerStarted","Data":"f390aabd5c5c8a054e1899a7ee5040d9561a5e3299409bf3888ae68441798510"} Dec 02 23:46:56 crc 
kubenswrapper[4903]: I1202 23:46:56.991258 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"61ac9e39-e707-4da2-881e-d9412cf9c136","Type":"ContainerStarted","Data":"fca4d108afcced731b9068fabb427014b594f53fa5f2767f081bf9a34ef6aa3b"} Dec 02 23:46:57 crc kubenswrapper[4903]: I1202 23:46:57.020377 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=20.020351477 podStartE2EDuration="20.020351477s" podCreationTimestamp="2025-12-02 23:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 23:46:57.017640751 +0000 UTC m=+2955.726195054" watchObservedRunningTime="2025-12-02 23:46:57.020351477 +0000 UTC m=+2955.728905790" Dec 02 23:46:57 crc kubenswrapper[4903]: I1202 23:46:57.464267 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 02 23:47:05 crc kubenswrapper[4903]: I1202 23:47:05.612997 4903 scope.go:117] "RemoveContainer" containerID="9f37abbef27e9d2f37fb101505e46a6c8bdcf9247409bdb0ed2ae35a6f381181" Dec 02 23:47:05 crc kubenswrapper[4903]: E1202 23:47:05.613622 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:47:07 crc kubenswrapper[4903]: I1202 23:47:07.464853 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 02 23:47:07 crc kubenswrapper[4903]: I1202 23:47:07.471617 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 02 23:47:08 crc kubenswrapper[4903]: I1202 23:47:08.113497 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 02 23:47:20 crc kubenswrapper[4903]: I1202 23:47:20.613591 4903 scope.go:117] "RemoveContainer" containerID="9f37abbef27e9d2f37fb101505e46a6c8bdcf9247409bdb0ed2ae35a6f381181" Dec 02 23:47:20 crc kubenswrapper[4903]: E1202 23:47:20.614801 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:47:30 crc kubenswrapper[4903]: I1202 23:47:30.903275 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 02 23:47:30 crc kubenswrapper[4903]: I1202 23:47:30.906190 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 02 23:47:30 crc kubenswrapper[4903]: I1202 23:47:30.912149 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 02 23:47:30 crc kubenswrapper[4903]: I1202 23:47:30.912451 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 02 23:47:30 crc kubenswrapper[4903]: I1202 23:47:30.912922 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-h4xff" Dec 02 23:47:30 crc kubenswrapper[4903]: I1202 23:47:30.913145 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 02 23:47:30 crc kubenswrapper[4903]: I1202 23:47:30.947456 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 02 23:47:30 crc kubenswrapper[4903]: I1202 23:47:30.979265 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0a6ff673-e552-4ffc-94a5-5b780fa219c0-config-data\") pod \"tempest-tests-tempest\" (UID: \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\") " pod="openstack/tempest-tests-tempest" Dec 02 23:47:30 crc kubenswrapper[4903]: I1202 23:47:30.979340 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0a6ff673-e552-4ffc-94a5-5b780fa219c0-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\") " pod="openstack/tempest-tests-tempest" Dec 02 23:47:30 crc kubenswrapper[4903]: I1202 23:47:30.979410 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/0a6ff673-e552-4ffc-94a5-5b780fa219c0-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\") " pod="openstack/tempest-tests-tempest" Dec 02 23:47:30 crc kubenswrapper[4903]: I1202 23:47:30.979501 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\") " pod="openstack/tempest-tests-tempest" Dec 02 23:47:30 crc kubenswrapper[4903]: I1202 23:47:30.979536 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/0a6ff673-e552-4ffc-94a5-5b780fa219c0-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\") " pod="openstack/tempest-tests-tempest" Dec 02 23:47:30 crc kubenswrapper[4903]: I1202 23:47:30.979585 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knf45\" (UniqueName: \"kubernetes.io/projected/0a6ff673-e552-4ffc-94a5-5b780fa219c0-kube-api-access-knf45\") pod \"tempest-tests-tempest\" (UID: \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\") " pod="openstack/tempest-tests-tempest" Dec 02 23:47:30 crc kubenswrapper[4903]: I1202 23:47:30.979679 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/0a6ff673-e552-4ffc-94a5-5b780fa219c0-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\") " pod="openstack/tempest-tests-tempest" Dec 02 23:47:30 crc kubenswrapper[4903]: I1202 23:47:30.979783 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0a6ff673-e552-4ffc-94a5-5b780fa219c0-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\") " pod="openstack/tempest-tests-tempest" Dec 02 23:47:30 crc kubenswrapper[4903]: I1202 23:47:30.979850 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0a6ff673-e552-4ffc-94a5-5b780fa219c0-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\") " pod="openstack/tempest-tests-tempest" Dec 02 23:47:31 crc kubenswrapper[4903]: I1202 23:47:31.083201 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0a6ff673-e552-4ffc-94a5-5b780fa219c0-config-data\") pod \"tempest-tests-tempest\" (UID: \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\") " pod="openstack/tempest-tests-tempest" Dec 02 23:47:31 crc kubenswrapper[4903]: I1202 23:47:31.084877 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0a6ff673-e552-4ffc-94a5-5b780fa219c0-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\") " pod="openstack/tempest-tests-tempest" Dec 02 23:47:31 crc kubenswrapper[4903]: I1202 23:47:31.087242 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0a6ff673-e552-4ffc-94a5-5b780fa219c0-config-data\") pod \"tempest-tests-tempest\" (UID: \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\") " pod="openstack/tempest-tests-tempest" Dec 02 23:47:31 crc kubenswrapper[4903]: I1202 23:47:31.084973 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/0a6ff673-e552-4ffc-94a5-5b780fa219c0-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\") " pod="openstack/tempest-tests-tempest" Dec 02 23:47:31 crc kubenswrapper[4903]: I1202 23:47:31.087581 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/0a6ff673-e552-4ffc-94a5-5b780fa219c0-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\") " pod="openstack/tempest-tests-tempest" Dec 02 23:47:31 crc kubenswrapper[4903]: I1202 23:47:31.087776 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\") " pod="openstack/tempest-tests-tempest" Dec 02 23:47:31 crc kubenswrapper[4903]: I1202 23:47:31.089440 4903 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod 
\"tempest-tests-tempest\" (UID: \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/tempest-tests-tempest" Dec 02 23:47:31 crc kubenswrapper[4903]: I1202 23:47:31.087819 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/0a6ff673-e552-4ffc-94a5-5b780fa219c0-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\") " pod="openstack/tempest-tests-tempest" Dec 02 23:47:31 crc kubenswrapper[4903]: I1202 23:47:31.106019 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knf45\" (UniqueName: \"kubernetes.io/projected/0a6ff673-e552-4ffc-94a5-5b780fa219c0-kube-api-access-knf45\") pod \"tempest-tests-tempest\" (UID: \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\") " pod="openstack/tempest-tests-tempest" Dec 02 23:47:31 crc kubenswrapper[4903]: I1202 23:47:31.106394 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/0a6ff673-e552-4ffc-94a5-5b780fa219c0-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\") " pod="openstack/tempest-tests-tempest" Dec 02 23:47:31 crc kubenswrapper[4903]: I1202 23:47:31.106473 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a6ff673-e552-4ffc-94a5-5b780fa219c0-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\") " pod="openstack/tempest-tests-tempest" Dec 02 23:47:31 crc kubenswrapper[4903]: I1202 23:47:31.110122 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0a6ff673-e552-4ffc-94a5-5b780fa219c0-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\") " pod="openstack/tempest-tests-tempest" Dec 02 23:47:31 crc kubenswrapper[4903]: I1202 23:47:31.111843 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0a6ff673-e552-4ffc-94a5-5b780fa219c0-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\") " pod="openstack/tempest-tests-tempest" Dec 02 23:47:31 crc kubenswrapper[4903]: I1202 23:47:31.112082 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0a6ff673-e552-4ffc-94a5-5b780fa219c0-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\") " pod="openstack/tempest-tests-tempest" Dec 02 23:47:31 crc kubenswrapper[4903]: I1202 23:47:31.113477 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0a6ff673-e552-4ffc-94a5-5b780fa219c0-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\") " pod="openstack/tempest-tests-tempest" Dec 02 23:47:31 crc kubenswrapper[4903]: I1202 23:47:31.115047 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0a6ff673-e552-4ffc-94a5-5b780fa219c0-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\") " 
pod="openstack/tempest-tests-tempest" Dec 02 23:47:31 crc kubenswrapper[4903]: I1202 23:47:31.116515 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a6ff673-e552-4ffc-94a5-5b780fa219c0-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\") " pod="openstack/tempest-tests-tempest" Dec 02 23:47:31 crc kubenswrapper[4903]: I1202 23:47:31.139501 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knf45\" (UniqueName: \"kubernetes.io/projected/0a6ff673-e552-4ffc-94a5-5b780fa219c0-kube-api-access-knf45\") pod \"tempest-tests-tempest\" (UID: \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\") " pod="openstack/tempest-tests-tempest" Dec 02 23:47:31 crc kubenswrapper[4903]: I1202 23:47:31.172668 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\") " pod="openstack/tempest-tests-tempest" Dec 02 23:47:31 crc kubenswrapper[4903]: I1202 23:47:31.250079 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 02 23:47:31 crc kubenswrapper[4903]: I1202 23:47:31.770092 4903 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 23:47:31 crc kubenswrapper[4903]: I1202 23:47:31.772796 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 02 23:47:32 crc kubenswrapper[4903]: I1202 23:47:32.451972 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"0a6ff673-e552-4ffc-94a5-5b780fa219c0","Type":"ContainerStarted","Data":"7bf52e045ced2e8037d1c0450b6df1545e851abb844570403a5c8f8968b363cc"} Dec 02 23:47:34 crc kubenswrapper[4903]: I1202 23:47:34.613083 4903 scope.go:117] "RemoveContainer" containerID="9f37abbef27e9d2f37fb101505e46a6c8bdcf9247409bdb0ed2ae35a6f381181" Dec 02 23:47:34 crc kubenswrapper[4903]: E1202 23:47:34.614027 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:47:42 crc kubenswrapper[4903]: I1202 23:47:42.090180 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 02 23:47:43 crc kubenswrapper[4903]: I1202 23:47:43.591676 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"0a6ff673-e552-4ffc-94a5-5b780fa219c0","Type":"ContainerStarted","Data":"cf50554a8f4382b15dbbb7a5f5dfc6d8a0d19bdf7f160546bebe11395ec28781"} Dec 02 23:47:43 crc kubenswrapper[4903]: I1202 23:47:43.663841 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.346652512 podStartE2EDuration="14.663812138s" podCreationTimestamp="2025-12-02 23:47:29 +0000 UTC" firstStartedPulling="2025-12-02 23:47:31.769840015 +0000 UTC m=+2990.478394298" lastFinishedPulling="2025-12-02 23:47:42.086999631 +0000 UTC m=+3000.795553924" 
observedRunningTime="2025-12-02 23:47:43.612696392 +0000 UTC m=+3002.321250705" watchObservedRunningTime="2025-12-02 23:47:43.663812138 +0000 UTC m=+3002.372366431" Dec 02 23:47:47 crc kubenswrapper[4903]: I1202 23:47:47.613641 4903 scope.go:117] "RemoveContainer" containerID="9f37abbef27e9d2f37fb101505e46a6c8bdcf9247409bdb0ed2ae35a6f381181" Dec 02 23:47:47 crc kubenswrapper[4903]: E1202 23:47:47.614794 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:47:59 crc kubenswrapper[4903]: I1202 23:47:59.613582 4903 scope.go:117] "RemoveContainer" containerID="9f37abbef27e9d2f37fb101505e46a6c8bdcf9247409bdb0ed2ae35a6f381181" Dec 02 23:47:59 crc kubenswrapper[4903]: E1202 23:47:59.617060 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:48:10 crc kubenswrapper[4903]: I1202 23:48:10.612601 4903 scope.go:117] "RemoveContainer" containerID="9f37abbef27e9d2f37fb101505e46a6c8bdcf9247409bdb0ed2ae35a6f381181" Dec 02 23:48:10 crc kubenswrapper[4903]: E1202 23:48:10.613565 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:48:24 crc kubenswrapper[4903]: I1202 23:48:24.612636 4903 scope.go:117] "RemoveContainer" containerID="9f37abbef27e9d2f37fb101505e46a6c8bdcf9247409bdb0ed2ae35a6f381181" Dec 02 23:48:24 crc kubenswrapper[4903]: E1202 23:48:24.614061 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:48:35 crc kubenswrapper[4903]: I1202 23:48:35.612959 4903 scope.go:117] "RemoveContainer" containerID="9f37abbef27e9d2f37fb101505e46a6c8bdcf9247409bdb0ed2ae35a6f381181" Dec 02 23:48:35 crc kubenswrapper[4903]: E1202 23:48:35.613991 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:48:50 crc 
kubenswrapper[4903]: I1202 23:48:50.612920 4903 scope.go:117] "RemoveContainer" containerID="9f37abbef27e9d2f37fb101505e46a6c8bdcf9247409bdb0ed2ae35a6f381181" Dec 02 23:48:50 crc kubenswrapper[4903]: E1202 23:48:50.614130 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:49:05 crc kubenswrapper[4903]: I1202 23:49:05.613963 4903 scope.go:117] "RemoveContainer" containerID="9f37abbef27e9d2f37fb101505e46a6c8bdcf9247409bdb0ed2ae35a6f381181" Dec 02 23:49:05 crc kubenswrapper[4903]: E1202 23:49:05.616263 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:49:18 crc kubenswrapper[4903]: I1202 23:49:18.613248 4903 scope.go:117] "RemoveContainer" containerID="9f37abbef27e9d2f37fb101505e46a6c8bdcf9247409bdb0ed2ae35a6f381181" Dec 02 23:49:18 crc kubenswrapper[4903]: E1202 23:49:18.614398 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:49:31 crc kubenswrapper[4903]: I1202 23:49:31.622464 4903 scope.go:117] "RemoveContainer" containerID="9f37abbef27e9d2f37fb101505e46a6c8bdcf9247409bdb0ed2ae35a6f381181" Dec 02 23:49:31 crc kubenswrapper[4903]: E1202 23:49:31.624343 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:49:43 crc kubenswrapper[4903]: I1202 23:49:43.614272 4903 scope.go:117] "RemoveContainer" containerID="9f37abbef27e9d2f37fb101505e46a6c8bdcf9247409bdb0ed2ae35a6f381181" Dec 02 23:49:43 crc kubenswrapper[4903]: E1202 23:49:43.615579 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:49:55 crc kubenswrapper[4903]: I1202 23:49:55.963642 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-77qrz"] Dec 02 23:49:55 crc kubenswrapper[4903]: 
I1202 23:49:55.968783 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-77qrz" Dec 02 23:49:55 crc kubenswrapper[4903]: I1202 23:49:55.980551 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-77qrz"] Dec 02 23:49:56 crc kubenswrapper[4903]: I1202 23:49:56.057982 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/681edd6b-3737-40d8-b842-7321e02185a0-utilities\") pod \"community-operators-77qrz\" (UID: \"681edd6b-3737-40d8-b842-7321e02185a0\") " pod="openshift-marketplace/community-operators-77qrz" Dec 02 23:49:56 crc kubenswrapper[4903]: I1202 23:49:56.058292 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/681edd6b-3737-40d8-b842-7321e02185a0-catalog-content\") pod \"community-operators-77qrz\" (UID: \"681edd6b-3737-40d8-b842-7321e02185a0\") " pod="openshift-marketplace/community-operators-77qrz" Dec 02 23:49:56 crc kubenswrapper[4903]: I1202 23:49:56.058453 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7qdp\" (UniqueName: \"kubernetes.io/projected/681edd6b-3737-40d8-b842-7321e02185a0-kube-api-access-m7qdp\") pod \"community-operators-77qrz\" (UID: \"681edd6b-3737-40d8-b842-7321e02185a0\") " pod="openshift-marketplace/community-operators-77qrz" Dec 02 23:49:56 crc kubenswrapper[4903]: I1202 23:49:56.160119 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/681edd6b-3737-40d8-b842-7321e02185a0-catalog-content\") pod \"community-operators-77qrz\" (UID: \"681edd6b-3737-40d8-b842-7321e02185a0\") " pod="openshift-marketplace/community-operators-77qrz" Dec 02 23:49:56 crc kubenswrapper[4903]: I1202 23:49:56.160224 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7qdp\" (UniqueName: \"kubernetes.io/projected/681edd6b-3737-40d8-b842-7321e02185a0-kube-api-access-m7qdp\") pod \"community-operators-77qrz\" (UID: \"681edd6b-3737-40d8-b842-7321e02185a0\") " pod="openshift-marketplace/community-operators-77qrz" Dec 02 23:49:56 crc kubenswrapper[4903]: I1202 23:49:56.160342 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/681edd6b-3737-40d8-b842-7321e02185a0-utilities\") pod \"community-operators-77qrz\" (UID: \"681edd6b-3737-40d8-b842-7321e02185a0\") " pod="openshift-marketplace/community-operators-77qrz" Dec 02 23:49:56 crc kubenswrapper[4903]: I1202 23:49:56.160797 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/681edd6b-3737-40d8-b842-7321e02185a0-catalog-content\") pod \"community-operators-77qrz\" (UID: \"681edd6b-3737-40d8-b842-7321e02185a0\") " pod="openshift-marketplace/community-operators-77qrz" Dec 02 23:49:56 crc kubenswrapper[4903]: I1202 23:49:56.160821 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/681edd6b-3737-40d8-b842-7321e02185a0-utilities\") pod \"community-operators-77qrz\" (UID: \"681edd6b-3737-40d8-b842-7321e02185a0\") " pod="openshift-marketplace/community-operators-77qrz" Dec 02 23:49:56 crc 
kubenswrapper[4903]: I1202 23:49:56.187566 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7qdp\" (UniqueName: \"kubernetes.io/projected/681edd6b-3737-40d8-b842-7321e02185a0-kube-api-access-m7qdp\") pod \"community-operators-77qrz\" (UID: \"681edd6b-3737-40d8-b842-7321e02185a0\") " pod="openshift-marketplace/community-operators-77qrz" Dec 02 23:49:56 crc kubenswrapper[4903]: I1202 23:49:56.302233 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-77qrz" Dec 02 23:49:56 crc kubenswrapper[4903]: I1202 23:49:56.907319 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-77qrz"] Dec 02 23:49:57 crc kubenswrapper[4903]: I1202 23:49:57.185928 4903 generic.go:334] "Generic (PLEG): container finished" podID="681edd6b-3737-40d8-b842-7321e02185a0" containerID="542f5e4acc5f4a757a66f16546e13fcc2b2a902b8b3c3532ae868b492f299592" exitCode=0 Dec 02 23:49:57 crc kubenswrapper[4903]: I1202 23:49:57.186025 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-77qrz" event={"ID":"681edd6b-3737-40d8-b842-7321e02185a0","Type":"ContainerDied","Data":"542f5e4acc5f4a757a66f16546e13fcc2b2a902b8b3c3532ae868b492f299592"} Dec 02 23:49:57 crc kubenswrapper[4903]: I1202 23:49:57.186325 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-77qrz" event={"ID":"681edd6b-3737-40d8-b842-7321e02185a0","Type":"ContainerStarted","Data":"a6eeb6815e446948bf1e1c1c727f992fe47c2485c9e8e5b98d03d58cb2509b21"} Dec 02 23:49:58 crc kubenswrapper[4903]: I1202 23:49:58.199151 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-77qrz" event={"ID":"681edd6b-3737-40d8-b842-7321e02185a0","Type":"ContainerStarted","Data":"ea2ade32d6a2fdfdf47a89b2939ad0dedfd41f077f37513c82b7e26f9882b9f4"} Dec 02 23:49:58 crc kubenswrapper[4903]: I1202 23:49:58.613754 4903 scope.go:117] "RemoveContainer" containerID="9f37abbef27e9d2f37fb101505e46a6c8bdcf9247409bdb0ed2ae35a6f381181" Dec 02 23:49:58 crc kubenswrapper[4903]: E1202 23:49:58.614533 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:49:59 crc kubenswrapper[4903]: I1202 23:49:59.215160 4903 generic.go:334] "Generic (PLEG): container finished" podID="681edd6b-3737-40d8-b842-7321e02185a0" containerID="ea2ade32d6a2fdfdf47a89b2939ad0dedfd41f077f37513c82b7e26f9882b9f4" exitCode=0 Dec 02 23:49:59 crc kubenswrapper[4903]: I1202 23:49:59.215233 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-77qrz" event={"ID":"681edd6b-3737-40d8-b842-7321e02185a0","Type":"ContainerDied","Data":"ea2ade32d6a2fdfdf47a89b2939ad0dedfd41f077f37513c82b7e26f9882b9f4"} Dec 02 23:50:00 crc kubenswrapper[4903]: I1202 23:50:00.227461 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-77qrz" event={"ID":"681edd6b-3737-40d8-b842-7321e02185a0","Type":"ContainerStarted","Data":"2cdc58ea884e769bcbde48b4f9c47cfdb593cc669b65e94b5d605a5e8b591ba3"} Dec 02 
23:50:00 crc kubenswrapper[4903]: I1202 23:50:00.248493 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-77qrz" podStartSLOduration=2.682645055 podStartE2EDuration="5.248476787s" podCreationTimestamp="2025-12-02 23:49:55 +0000 UTC" firstStartedPulling="2025-12-02 23:49:57.188451396 +0000 UTC m=+3135.897005689" lastFinishedPulling="2025-12-02 23:49:59.754283128 +0000 UTC m=+3138.462837421" observedRunningTime="2025-12-02 23:50:00.244940281 +0000 UTC m=+3138.953494564" watchObservedRunningTime="2025-12-02 23:50:00.248476787 +0000 UTC m=+3138.957031070" Dec 02 23:50:06 crc kubenswrapper[4903]: I1202 23:50:06.303358 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-77qrz" Dec 02 23:50:06 crc kubenswrapper[4903]: I1202 23:50:06.304062 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-77qrz" Dec 02 23:50:06 crc kubenswrapper[4903]: I1202 23:50:06.375205 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-77qrz" Dec 02 23:50:07 crc kubenswrapper[4903]: I1202 23:50:07.349591 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-77qrz" Dec 02 23:50:07 crc kubenswrapper[4903]: I1202 23:50:07.397124 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-77qrz"] Dec 02 23:50:09 crc kubenswrapper[4903]: I1202 23:50:09.327207 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-77qrz" podUID="681edd6b-3737-40d8-b842-7321e02185a0" containerName="registry-server" containerID="cri-o://2cdc58ea884e769bcbde48b4f9c47cfdb593cc669b65e94b5d605a5e8b591ba3" gracePeriod=2 Dec 02 23:50:09 crc kubenswrapper[4903]: I1202 23:50:09.613292 4903 scope.go:117] "RemoveContainer" containerID="9f37abbef27e9d2f37fb101505e46a6c8bdcf9247409bdb0ed2ae35a6f381181" Dec 02 23:50:09 crc kubenswrapper[4903]: E1202 23:50:09.613879 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 02 23:50:09 crc kubenswrapper[4903]: I1202 23:50:09.893240 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-77qrz" Dec 02 23:50:10 crc kubenswrapper[4903]: I1202 23:50:10.073999 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7qdp\" (UniqueName: \"kubernetes.io/projected/681edd6b-3737-40d8-b842-7321e02185a0-kube-api-access-m7qdp\") pod \"681edd6b-3737-40d8-b842-7321e02185a0\" (UID: \"681edd6b-3737-40d8-b842-7321e02185a0\") " Dec 02 23:50:10 crc kubenswrapper[4903]: I1202 23:50:10.074077 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/681edd6b-3737-40d8-b842-7321e02185a0-utilities\") pod \"681edd6b-3737-40d8-b842-7321e02185a0\" (UID: \"681edd6b-3737-40d8-b842-7321e02185a0\") " Dec 02 23:50:10 crc kubenswrapper[4903]: I1202 23:50:10.074223 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/681edd6b-3737-40d8-b842-7321e02185a0-catalog-content\") pod \"681edd6b-3737-40d8-b842-7321e02185a0\" (UID: \"681edd6b-3737-40d8-b842-7321e02185a0\") " Dec 02 23:50:10 crc kubenswrapper[4903]: I1202 23:50:10.074951 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/681edd6b-3737-40d8-b842-7321e02185a0-utilities" (OuterVolumeSpecName: "utilities") pod "681edd6b-3737-40d8-b842-7321e02185a0" (UID: "681edd6b-3737-40d8-b842-7321e02185a0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:50:10 crc kubenswrapper[4903]: I1202 23:50:10.095158 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/681edd6b-3737-40d8-b842-7321e02185a0-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:50:10 crc kubenswrapper[4903]: I1202 23:50:10.104002 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/681edd6b-3737-40d8-b842-7321e02185a0-kube-api-access-m7qdp" (OuterVolumeSpecName: "kube-api-access-m7qdp") pod "681edd6b-3737-40d8-b842-7321e02185a0" (UID: "681edd6b-3737-40d8-b842-7321e02185a0"). InnerVolumeSpecName "kube-api-access-m7qdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:50:10 crc kubenswrapper[4903]: I1202 23:50:10.119393 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/681edd6b-3737-40d8-b842-7321e02185a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "681edd6b-3737-40d8-b842-7321e02185a0" (UID: "681edd6b-3737-40d8-b842-7321e02185a0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:50:10 crc kubenswrapper[4903]: I1202 23:50:10.197492 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/681edd6b-3737-40d8-b842-7321e02185a0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:50:10 crc kubenswrapper[4903]: I1202 23:50:10.197555 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7qdp\" (UniqueName: \"kubernetes.io/projected/681edd6b-3737-40d8-b842-7321e02185a0-kube-api-access-m7qdp\") on node \"crc\" DevicePath \"\"" Dec 02 23:50:10 crc kubenswrapper[4903]: I1202 23:50:10.343058 4903 generic.go:334] "Generic (PLEG): container finished" podID="681edd6b-3737-40d8-b842-7321e02185a0" containerID="2cdc58ea884e769bcbde48b4f9c47cfdb593cc669b65e94b5d605a5e8b591ba3" exitCode=0 Dec 02 23:50:10 crc kubenswrapper[4903]: I1202 23:50:10.343126 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-77qrz" event={"ID":"681edd6b-3737-40d8-b842-7321e02185a0","Type":"ContainerDied","Data":"2cdc58ea884e769bcbde48b4f9c47cfdb593cc669b65e94b5d605a5e8b591ba3"} Dec 02 23:50:10 crc kubenswrapper[4903]: I1202 23:50:10.343491 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-77qrz" event={"ID":"681edd6b-3737-40d8-b842-7321e02185a0","Type":"ContainerDied","Data":"a6eeb6815e446948bf1e1c1c727f992fe47c2485c9e8e5b98d03d58cb2509b21"} Dec 02 23:50:10 crc kubenswrapper[4903]: I1202 23:50:10.343163 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-77qrz" Dec 02 23:50:10 crc kubenswrapper[4903]: I1202 23:50:10.343530 4903 scope.go:117] "RemoveContainer" containerID="2cdc58ea884e769bcbde48b4f9c47cfdb593cc669b65e94b5d605a5e8b591ba3" Dec 02 23:50:10 crc kubenswrapper[4903]: I1202 23:50:10.379951 4903 scope.go:117] "RemoveContainer" containerID="ea2ade32d6a2fdfdf47a89b2939ad0dedfd41f077f37513c82b7e26f9882b9f4" Dec 02 23:50:10 crc kubenswrapper[4903]: I1202 23:50:10.390406 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-77qrz"] Dec 02 23:50:10 crc kubenswrapper[4903]: I1202 23:50:10.403237 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-77qrz"] Dec 02 23:50:10 crc kubenswrapper[4903]: I1202 23:50:10.428944 4903 scope.go:117] "RemoveContainer" containerID="542f5e4acc5f4a757a66f16546e13fcc2b2a902b8b3c3532ae868b492f299592" Dec 02 23:50:10 crc kubenswrapper[4903]: I1202 23:50:10.460683 4903 scope.go:117] "RemoveContainer" containerID="2cdc58ea884e769bcbde48b4f9c47cfdb593cc669b65e94b5d605a5e8b591ba3" Dec 02 23:50:10 crc kubenswrapper[4903]: E1202 23:50:10.461165 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cdc58ea884e769bcbde48b4f9c47cfdb593cc669b65e94b5d605a5e8b591ba3\": container with ID starting with 2cdc58ea884e769bcbde48b4f9c47cfdb593cc669b65e94b5d605a5e8b591ba3 not found: ID does not exist" containerID="2cdc58ea884e769bcbde48b4f9c47cfdb593cc669b65e94b5d605a5e8b591ba3" Dec 02 23:50:10 crc kubenswrapper[4903]: I1202 23:50:10.461216 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cdc58ea884e769bcbde48b4f9c47cfdb593cc669b65e94b5d605a5e8b591ba3"} err="failed to get container status 
\"2cdc58ea884e769bcbde48b4f9c47cfdb593cc669b65e94b5d605a5e8b591ba3\": rpc error: code = NotFound desc = could not find container \"2cdc58ea884e769bcbde48b4f9c47cfdb593cc669b65e94b5d605a5e8b591ba3\": container with ID starting with 2cdc58ea884e769bcbde48b4f9c47cfdb593cc669b65e94b5d605a5e8b591ba3 not found: ID does not exist" Dec 02 23:50:10 crc kubenswrapper[4903]: I1202 23:50:10.461251 4903 scope.go:117] "RemoveContainer" containerID="ea2ade32d6a2fdfdf47a89b2939ad0dedfd41f077f37513c82b7e26f9882b9f4" Dec 02 23:50:10 crc kubenswrapper[4903]: E1202 23:50:10.461540 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea2ade32d6a2fdfdf47a89b2939ad0dedfd41f077f37513c82b7e26f9882b9f4\": container with ID starting with ea2ade32d6a2fdfdf47a89b2939ad0dedfd41f077f37513c82b7e26f9882b9f4 not found: ID does not exist" containerID="ea2ade32d6a2fdfdf47a89b2939ad0dedfd41f077f37513c82b7e26f9882b9f4" Dec 02 23:50:10 crc kubenswrapper[4903]: I1202 23:50:10.461576 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea2ade32d6a2fdfdf47a89b2939ad0dedfd41f077f37513c82b7e26f9882b9f4"} err="failed to get container status \"ea2ade32d6a2fdfdf47a89b2939ad0dedfd41f077f37513c82b7e26f9882b9f4\": rpc error: code = NotFound desc = could not find container \"ea2ade32d6a2fdfdf47a89b2939ad0dedfd41f077f37513c82b7e26f9882b9f4\": container with ID starting with ea2ade32d6a2fdfdf47a89b2939ad0dedfd41f077f37513c82b7e26f9882b9f4 not found: ID does not exist" Dec 02 23:50:10 crc kubenswrapper[4903]: I1202 23:50:10.461597 4903 scope.go:117] "RemoveContainer" containerID="542f5e4acc5f4a757a66f16546e13fcc2b2a902b8b3c3532ae868b492f299592" Dec 02 23:50:10 crc kubenswrapper[4903]: E1202 23:50:10.461919 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"542f5e4acc5f4a757a66f16546e13fcc2b2a902b8b3c3532ae868b492f299592\": container with ID starting with 542f5e4acc5f4a757a66f16546e13fcc2b2a902b8b3c3532ae868b492f299592 not found: ID does not exist" containerID="542f5e4acc5f4a757a66f16546e13fcc2b2a902b8b3c3532ae868b492f299592" Dec 02 23:50:10 crc kubenswrapper[4903]: I1202 23:50:10.461964 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"542f5e4acc5f4a757a66f16546e13fcc2b2a902b8b3c3532ae868b492f299592"} err="failed to get container status \"542f5e4acc5f4a757a66f16546e13fcc2b2a902b8b3c3532ae868b492f299592\": rpc error: code = NotFound desc = could not find container \"542f5e4acc5f4a757a66f16546e13fcc2b2a902b8b3c3532ae868b492f299592\": container with ID starting with 542f5e4acc5f4a757a66f16546e13fcc2b2a902b8b3c3532ae868b492f299592 not found: ID does not exist" Dec 02 23:50:11 crc kubenswrapper[4903]: I1202 23:50:11.633812 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="681edd6b-3737-40d8-b842-7321e02185a0" path="/var/lib/kubelet/pods/681edd6b-3737-40d8-b842-7321e02185a0/volumes" Dec 02 23:50:21 crc kubenswrapper[4903]: I1202 23:50:21.613323 4903 scope.go:117] "RemoveContainer" containerID="9f37abbef27e9d2f37fb101505e46a6c8bdcf9247409bdb0ed2ae35a6f381181" Dec 02 23:50:21 crc kubenswrapper[4903]: E1202 23:50:21.614613 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f"
Dec 02 23:50:32 crc kubenswrapper[4903]: I1202 23:50:32.613115 4903 scope.go:117] "RemoveContainer" containerID="9f37abbef27e9d2f37fb101505e46a6c8bdcf9247409bdb0ed2ae35a6f381181"
Dec 02 23:50:32 crc kubenswrapper[4903]: E1202 23:50:32.614048 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f"
Dec 02 23:50:44 crc kubenswrapper[4903]: I1202 23:50:44.613120 4903 scope.go:117] "RemoveContainer" containerID="9f37abbef27e9d2f37fb101505e46a6c8bdcf9247409bdb0ed2ae35a6f381181"
Dec 02 23:50:44 crc kubenswrapper[4903]: E1202 23:50:44.614391 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f"
Dec 02 23:50:57 crc kubenswrapper[4903]: I1202 23:50:57.612834 4903 scope.go:117] "RemoveContainer" containerID="9f37abbef27e9d2f37fb101505e46a6c8bdcf9247409bdb0ed2ae35a6f381181"
Dec 02 23:50:57 crc kubenswrapper[4903]: E1202 23:50:57.613755 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f"
Dec 02 23:51:11 crc kubenswrapper[4903]: I1202 23:51:11.620494 4903 scope.go:117] "RemoveContainer" containerID="9f37abbef27e9d2f37fb101505e46a6c8bdcf9247409bdb0ed2ae35a6f381181"
Dec 02 23:51:11 crc kubenswrapper[4903]: E1202 23:51:11.621363 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f"
Dec 02 23:51:26 crc kubenswrapper[4903]: I1202 23:51:26.613780 4903 scope.go:117] "RemoveContainer" containerID="9f37abbef27e9d2f37fb101505e46a6c8bdcf9247409bdb0ed2ae35a6f381181"
Dec 02 23:51:27 crc kubenswrapper[4903]: I1202 23:51:27.442277 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerStarted","Data":"a41c8c681fb2e583e9d4b67028fb1c72207cb1da52192e4d6d4a326f52eb2028"}
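The alternating "RemoveContainer" / "CrashLoopBackOff" pairs above repeat on every sync pass until the restart back-off expires, after which the ContainerStarted event at 23:51:27 appears. The quoted "back-off 5m0s" is the back-off cap. A toy sketch of that schedule, assuming the commonly cited kubelet defaults (10s initial delay doubling per failed restart, capped at 5m) rather than anything read from this cluster's configuration:

```go
package main

import (
	"fmt"
	"time"
)

// crashLoopBackOff returns the wait before restart attempt n (0-based),
// assuming kubelet-style parameters: 10s initial delay, doubled after
// each failed restart, capped at 5m (the "back-off 5m0s" in the log).
func crashLoopBackOff(n int) time.Duration {
	const (
		initialDelay = 10 * time.Second
		maxDelay     = 5 * time.Minute
	)
	d := initialDelay
	for i := 0; i < n; i++ {
		d *= 2
		if d >= maxDelay {
			return maxDelay
		}
	}
	return d
}

func main() {
	// Attempts 5 and later hit the 5m cap that the log keeps printing.
	for n := 0; n <= 6; n++ {
		fmt.Printf("after crash %d: wait %v\n", n, crashLoopBackOff(n))
	}
}
```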
Dec 02 23:53:53 crc kubenswrapper[4903]: I1202 23:53:53.069629 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 23:53:53 crc kubenswrapper[4903]: I1202 23:53:53.070292 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 23:54:23 crc kubenswrapper[4903]: I1202 23:54:23.070281 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 23:54:23 crc kubenswrapper[4903]: I1202 23:54:23.071028 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 23:54:53 crc kubenswrapper[4903]: I1202 23:54:53.069443 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 23:54:53 crc kubenswrapper[4903]: I1202 23:54:53.072036 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 23:54:53 crc kubenswrapper[4903]: I1202 23:54:53.072343 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-snl4q"
Dec 02 23:54:53 crc kubenswrapper[4903]: I1202 23:54:53.073864 4903 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a41c8c681fb2e583e9d4b67028fb1c72207cb1da52192e4d6d4a326f52eb2028"} pod="openshift-machine-config-operator/machine-config-daemon-snl4q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 02 23:54:53 crc kubenswrapper[4903]: I1202 23:54:53.074146 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" containerID="cri-o://a41c8c681fb2e583e9d4b67028fb1c72207cb1da52192e4d6d4a326f52eb2028" gracePeriod=600
Dec 02 23:54:53 crc kubenswrapper[4903]: I1202 23:54:53.809474 4903 generic.go:334] "Generic (PLEG): container finished" podID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerID="a41c8c681fb2e583e9d4b67028fb1c72207cb1da52192e4d6d4a326f52eb2028" exitCode=0
Dec 02 23:54:53 crc kubenswrapper[4903]: I1202 23:54:53.809538 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerDied","Data":"a41c8c681fb2e583e9d4b67028fb1c72207cb1da52192e4d6d4a326f52eb2028"} Dec 02 23:54:53 crc kubenswrapper[4903]: I1202 23:54:53.809893 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerStarted","Data":"9b0d8e1ad42f16d25d8ace883915db986228e905bf7242685a201d23108ad65b"} Dec 02 23:54:53 crc kubenswrapper[4903]: I1202 23:54:53.809929 4903 scope.go:117] "RemoveContainer" containerID="9f37abbef27e9d2f37fb101505e46a6c8bdcf9247409bdb0ed2ae35a6f381181" Dec 02 23:55:02 crc kubenswrapper[4903]: I1202 23:55:02.899691 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5qxmr"] Dec 02 23:55:02 crc kubenswrapper[4903]: E1202 23:55:02.900862 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="681edd6b-3737-40d8-b842-7321e02185a0" containerName="extract-content" Dec 02 23:55:02 crc kubenswrapper[4903]: I1202 23:55:02.900879 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="681edd6b-3737-40d8-b842-7321e02185a0" containerName="extract-content" Dec 02 23:55:02 crc kubenswrapper[4903]: E1202 23:55:02.900913 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="681edd6b-3737-40d8-b842-7321e02185a0" containerName="registry-server" Dec 02 23:55:02 crc kubenswrapper[4903]: I1202 23:55:02.900922 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="681edd6b-3737-40d8-b842-7321e02185a0" containerName="registry-server" Dec 02 23:55:02 crc kubenswrapper[4903]: E1202 23:55:02.900945 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="681edd6b-3737-40d8-b842-7321e02185a0" containerName="extract-utilities" Dec 02 23:55:02 crc kubenswrapper[4903]: I1202 23:55:02.900953 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="681edd6b-3737-40d8-b842-7321e02185a0" containerName="extract-utilities" Dec 02 23:55:02 crc kubenswrapper[4903]: I1202 23:55:02.901230 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="681edd6b-3737-40d8-b842-7321e02185a0" containerName="registry-server" Dec 02 23:55:02 crc kubenswrapper[4903]: I1202 23:55:02.903168 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5qxmr" Dec 02 23:55:02 crc kubenswrapper[4903]: I1202 23:55:02.913141 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5qxmr"] Dec 02 23:55:02 crc kubenswrapper[4903]: I1202 23:55:02.942860 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpwxx\" (UniqueName: \"kubernetes.io/projected/2b66c64a-c140-4dfb-b737-4f940f8ecf17-kube-api-access-lpwxx\") pod \"redhat-operators-5qxmr\" (UID: \"2b66c64a-c140-4dfb-b737-4f940f8ecf17\") " pod="openshift-marketplace/redhat-operators-5qxmr" Dec 02 23:55:02 crc kubenswrapper[4903]: I1202 23:55:02.943012 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b66c64a-c140-4dfb-b737-4f940f8ecf17-catalog-content\") pod \"redhat-operators-5qxmr\" (UID: \"2b66c64a-c140-4dfb-b737-4f940f8ecf17\") " pod="openshift-marketplace/redhat-operators-5qxmr" Dec 02 23:55:02 crc kubenswrapper[4903]: I1202 23:55:02.943086 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b66c64a-c140-4dfb-b737-4f940f8ecf17-utilities\") pod \"redhat-operators-5qxmr\" (UID: \"2b66c64a-c140-4dfb-b737-4f940f8ecf17\") " pod="openshift-marketplace/redhat-operators-5qxmr" Dec 02 23:55:03 crc kubenswrapper[4903]: I1202 23:55:03.045633 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpwxx\" (UniqueName: \"kubernetes.io/projected/2b66c64a-c140-4dfb-b737-4f940f8ecf17-kube-api-access-lpwxx\") pod \"redhat-operators-5qxmr\" (UID: \"2b66c64a-c140-4dfb-b737-4f940f8ecf17\") " pod="openshift-marketplace/redhat-operators-5qxmr" Dec 02 23:55:03 crc kubenswrapper[4903]: I1202 23:55:03.046055 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b66c64a-c140-4dfb-b737-4f940f8ecf17-catalog-content\") pod \"redhat-operators-5qxmr\" (UID: \"2b66c64a-c140-4dfb-b737-4f940f8ecf17\") " pod="openshift-marketplace/redhat-operators-5qxmr" Dec 02 23:55:03 crc kubenswrapper[4903]: I1202 23:55:03.046111 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b66c64a-c140-4dfb-b737-4f940f8ecf17-utilities\") pod \"redhat-operators-5qxmr\" (UID: \"2b66c64a-c140-4dfb-b737-4f940f8ecf17\") " pod="openshift-marketplace/redhat-operators-5qxmr" Dec 02 23:55:03 crc kubenswrapper[4903]: I1202 23:55:03.046567 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b66c64a-c140-4dfb-b737-4f940f8ecf17-catalog-content\") pod \"redhat-operators-5qxmr\" (UID: \"2b66c64a-c140-4dfb-b737-4f940f8ecf17\") " pod="openshift-marketplace/redhat-operators-5qxmr" Dec 02 23:55:03 crc kubenswrapper[4903]: I1202 23:55:03.046790 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b66c64a-c140-4dfb-b737-4f940f8ecf17-utilities\") pod \"redhat-operators-5qxmr\" (UID: \"2b66c64a-c140-4dfb-b737-4f940f8ecf17\") " pod="openshift-marketplace/redhat-operators-5qxmr" Dec 02 23:55:03 crc kubenswrapper[4903]: I1202 23:55:03.066929 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-lpwxx\" (UniqueName: \"kubernetes.io/projected/2b66c64a-c140-4dfb-b737-4f940f8ecf17-kube-api-access-lpwxx\") pod \"redhat-operators-5qxmr\" (UID: \"2b66c64a-c140-4dfb-b737-4f940f8ecf17\") " pod="openshift-marketplace/redhat-operators-5qxmr" Dec 02 23:55:03 crc kubenswrapper[4903]: I1202 23:55:03.259007 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5qxmr" Dec 02 23:55:03 crc kubenswrapper[4903]: I1202 23:55:03.744162 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5qxmr"] Dec 02 23:55:03 crc kubenswrapper[4903]: W1202 23:55:03.750332 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b66c64a_c140_4dfb_b737_4f940f8ecf17.slice/crio-7bdf005fc8a03c6b228963995f3f2baab07bece36e1825838aa6ec4ce75a0253 WatchSource:0}: Error finding container 7bdf005fc8a03c6b228963995f3f2baab07bece36e1825838aa6ec4ce75a0253: Status 404 returned error can't find the container with id 7bdf005fc8a03c6b228963995f3f2baab07bece36e1825838aa6ec4ce75a0253 Dec 02 23:55:03 crc kubenswrapper[4903]: I1202 23:55:03.960754 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qxmr" event={"ID":"2b66c64a-c140-4dfb-b737-4f940f8ecf17","Type":"ContainerStarted","Data":"dce34b6fdadd51b63a67b2ab0cc98d4c5a0480411f9999f7e3a3e9535bd58b12"} Dec 02 23:55:03 crc kubenswrapper[4903]: I1202 23:55:03.961800 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qxmr" event={"ID":"2b66c64a-c140-4dfb-b737-4f940f8ecf17","Type":"ContainerStarted","Data":"7bdf005fc8a03c6b228963995f3f2baab07bece36e1825838aa6ec4ce75a0253"} Dec 02 23:55:04 crc kubenswrapper[4903]: I1202 23:55:04.973750 4903 generic.go:334] "Generic (PLEG): container finished" podID="2b66c64a-c140-4dfb-b737-4f940f8ecf17" containerID="dce34b6fdadd51b63a67b2ab0cc98d4c5a0480411f9999f7e3a3e9535bd58b12" exitCode=0 Dec 02 23:55:04 crc kubenswrapper[4903]: I1202 23:55:04.974076 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qxmr" event={"ID":"2b66c64a-c140-4dfb-b737-4f940f8ecf17","Type":"ContainerDied","Data":"dce34b6fdadd51b63a67b2ab0cc98d4c5a0480411f9999f7e3a3e9535bd58b12"} Dec 02 23:55:04 crc kubenswrapper[4903]: I1202 23:55:04.977483 4903 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 23:55:05 crc kubenswrapper[4903]: I1202 23:55:05.984701 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qxmr" event={"ID":"2b66c64a-c140-4dfb-b737-4f940f8ecf17","Type":"ContainerStarted","Data":"46df250d4ae329ff96562ed0e7806116baa75d800f61e551ec3e7b7261925211"} Dec 02 23:55:09 crc kubenswrapper[4903]: I1202 23:55:09.021523 4903 generic.go:334] "Generic (PLEG): container finished" podID="2b66c64a-c140-4dfb-b737-4f940f8ecf17" containerID="46df250d4ae329ff96562ed0e7806116baa75d800f61e551ec3e7b7261925211" exitCode=0 Dec 02 23:55:09 crc kubenswrapper[4903]: I1202 23:55:09.021733 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qxmr" event={"ID":"2b66c64a-c140-4dfb-b737-4f940f8ecf17","Type":"ContainerDied","Data":"46df250d4ae329ff96562ed0e7806116baa75d800f61e551ec3e7b7261925211"} Dec 02 23:55:11 crc kubenswrapper[4903]: I1202 23:55:11.056433 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-5qxmr" event={"ID":"2b66c64a-c140-4dfb-b737-4f940f8ecf17","Type":"ContainerStarted","Data":"59755dda2d8f48026c3609bd501ec61229900dbc5527ed1aaeb2b332b8a27762"} Dec 02 23:55:12 crc kubenswrapper[4903]: I1202 23:55:12.088308 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5qxmr" podStartSLOduration=5.101097663 podStartE2EDuration="10.088289886s" podCreationTimestamp="2025-12-02 23:55:02 +0000 UTC" firstStartedPulling="2025-12-02 23:55:04.977040605 +0000 UTC m=+3443.685594918" lastFinishedPulling="2025-12-02 23:55:09.964232848 +0000 UTC m=+3448.672787141" observedRunningTime="2025-12-02 23:55:12.08268087 +0000 UTC m=+3450.791235153" watchObservedRunningTime="2025-12-02 23:55:12.088289886 +0000 UTC m=+3450.796844169" Dec 02 23:55:13 crc kubenswrapper[4903]: I1202 23:55:13.259979 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5qxmr" Dec 02 23:55:13 crc kubenswrapper[4903]: I1202 23:55:13.260274 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5qxmr" Dec 02 23:55:14 crc kubenswrapper[4903]: I1202 23:55:14.319477 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5qxmr" podUID="2b66c64a-c140-4dfb-b737-4f940f8ecf17" containerName="registry-server" probeResult="failure" output=< Dec 02 23:55:14 crc kubenswrapper[4903]: timeout: failed to connect service ":50051" within 1s Dec 02 23:55:14 crc kubenswrapper[4903]: > Dec 02 23:55:17 crc kubenswrapper[4903]: I1202 23:55:17.237146 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-97z8x"] Dec 02 23:55:17 crc kubenswrapper[4903]: I1202 23:55:17.240611 4903 util.go:30] "No sandbox for pod can be found. 
Dec 02 23:55:17 crc kubenswrapper[4903]: I1202 23:55:17.240611 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-97z8x"
Dec 02 23:55:17 crc kubenswrapper[4903]: I1202 23:55:17.271801 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-97z8x"]
Dec 02 23:55:17 crc kubenswrapper[4903]: I1202 23:55:17.370055 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56-catalog-content\") pod \"redhat-marketplace-97z8x\" (UID: \"0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56\") " pod="openshift-marketplace/redhat-marketplace-97z8x"
Dec 02 23:55:17 crc kubenswrapper[4903]: I1202 23:55:17.370159 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56-utilities\") pod \"redhat-marketplace-97z8x\" (UID: \"0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56\") " pod="openshift-marketplace/redhat-marketplace-97z8x"
Dec 02 23:55:17 crc kubenswrapper[4903]: I1202 23:55:17.370290 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trht6\" (UniqueName: \"kubernetes.io/projected/0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56-kube-api-access-trht6\") pod \"redhat-marketplace-97z8x\" (UID: \"0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56\") " pod="openshift-marketplace/redhat-marketplace-97z8x"
Dec 02 23:55:17 crc kubenswrapper[4903]: I1202 23:55:17.471898 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56-utilities\") pod \"redhat-marketplace-97z8x\" (UID: \"0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56\") " pod="openshift-marketplace/redhat-marketplace-97z8x"
Dec 02 23:55:17 crc kubenswrapper[4903]: I1202 23:55:17.472404 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trht6\" (UniqueName: \"kubernetes.io/projected/0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56-kube-api-access-trht6\") pod \"redhat-marketplace-97z8x\" (UID: \"0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56\") " pod="openshift-marketplace/redhat-marketplace-97z8x"
Dec 02 23:55:17 crc kubenswrapper[4903]: I1202 23:55:17.472414 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56-utilities\") pod \"redhat-marketplace-97z8x\" (UID: \"0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56\") " pod="openshift-marketplace/redhat-marketplace-97z8x"
Dec 02 23:55:17 crc kubenswrapper[4903]: I1202 23:55:17.472441 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56-catalog-content\") pod \"redhat-marketplace-97z8x\" (UID: \"0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56\") " pod="openshift-marketplace/redhat-marketplace-97z8x"
Dec 02 23:55:17 crc kubenswrapper[4903]: I1202 23:55:17.472985 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56-catalog-content\") pod \"redhat-marketplace-97z8x\" (UID: \"0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56\") " pod="openshift-marketplace/redhat-marketplace-97z8x"
Dec 02 23:55:17 crc kubenswrapper[4903]: I1202 23:55:17.493924 4903 operation_generator.go:637] "MountVolume.SetUp
succeeded for volume \"kube-api-access-trht6\" (UniqueName: \"kubernetes.io/projected/0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56-kube-api-access-trht6\") pod \"redhat-marketplace-97z8x\" (UID: \"0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56\") " pod="openshift-marketplace/redhat-marketplace-97z8x" Dec 02 23:55:17 crc kubenswrapper[4903]: I1202 23:55:17.583303 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-97z8x" Dec 02 23:55:18 crc kubenswrapper[4903]: I1202 23:55:18.152047 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-97z8x"] Dec 02 23:55:19 crc kubenswrapper[4903]: I1202 23:55:19.147356 4903 generic.go:334] "Generic (PLEG): container finished" podID="0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56" containerID="632506155f5116a767d2d57ecc110d56b3c8bb552cf360d7ea8feaf06bc1fdcc" exitCode=0 Dec 02 23:55:19 crc kubenswrapper[4903]: I1202 23:55:19.147424 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-97z8x" event={"ID":"0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56","Type":"ContainerDied","Data":"632506155f5116a767d2d57ecc110d56b3c8bb552cf360d7ea8feaf06bc1fdcc"} Dec 02 23:55:19 crc kubenswrapper[4903]: I1202 23:55:19.147745 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-97z8x" event={"ID":"0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56","Type":"ContainerStarted","Data":"0c8edb8b5eb5a12af66a37ca9663a13f4a64053af731b28eae73d37a4063084d"} Dec 02 23:55:20 crc kubenswrapper[4903]: I1202 23:55:20.028020 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xgtj8"] Dec 02 23:55:20 crc kubenswrapper[4903]: I1202 23:55:20.032902 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xgtj8" Dec 02 23:55:20 crc kubenswrapper[4903]: I1202 23:55:20.041687 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xgtj8"] Dec 02 23:55:20 crc kubenswrapper[4903]: I1202 23:55:20.131090 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j6sq\" (UniqueName: \"kubernetes.io/projected/07de4ce7-1170-40d2-a107-6d03e155d295-kube-api-access-7j6sq\") pod \"certified-operators-xgtj8\" (UID: \"07de4ce7-1170-40d2-a107-6d03e155d295\") " pod="openshift-marketplace/certified-operators-xgtj8" Dec 02 23:55:20 crc kubenswrapper[4903]: I1202 23:55:20.131157 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07de4ce7-1170-40d2-a107-6d03e155d295-utilities\") pod \"certified-operators-xgtj8\" (UID: \"07de4ce7-1170-40d2-a107-6d03e155d295\") " pod="openshift-marketplace/certified-operators-xgtj8" Dec 02 23:55:20 crc kubenswrapper[4903]: I1202 23:55:20.131440 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07de4ce7-1170-40d2-a107-6d03e155d295-catalog-content\") pod \"certified-operators-xgtj8\" (UID: \"07de4ce7-1170-40d2-a107-6d03e155d295\") " pod="openshift-marketplace/certified-operators-xgtj8" Dec 02 23:55:20 crc kubenswrapper[4903]: I1202 23:55:20.157003 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-97z8x" event={"ID":"0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56","Type":"ContainerStarted","Data":"0e7f17f0a189f26dc38af196506e564cbb97d3b162ebbf2da523adf064f11f74"} Dec 02 23:55:20 crc kubenswrapper[4903]: I1202 23:55:20.233952 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j6sq\" (UniqueName: \"kubernetes.io/projected/07de4ce7-1170-40d2-a107-6d03e155d295-kube-api-access-7j6sq\") pod \"certified-operators-xgtj8\" (UID: \"07de4ce7-1170-40d2-a107-6d03e155d295\") " pod="openshift-marketplace/certified-operators-xgtj8" Dec 02 23:55:20 crc kubenswrapper[4903]: I1202 23:55:20.234008 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07de4ce7-1170-40d2-a107-6d03e155d295-utilities\") pod \"certified-operators-xgtj8\" (UID: \"07de4ce7-1170-40d2-a107-6d03e155d295\") " pod="openshift-marketplace/certified-operators-xgtj8" Dec 02 23:55:20 crc kubenswrapper[4903]: I1202 23:55:20.234109 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07de4ce7-1170-40d2-a107-6d03e155d295-catalog-content\") pod \"certified-operators-xgtj8\" (UID: \"07de4ce7-1170-40d2-a107-6d03e155d295\") " pod="openshift-marketplace/certified-operators-xgtj8" Dec 02 23:55:20 crc kubenswrapper[4903]: I1202 23:55:20.234593 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07de4ce7-1170-40d2-a107-6d03e155d295-catalog-content\") pod \"certified-operators-xgtj8\" (UID: \"07de4ce7-1170-40d2-a107-6d03e155d295\") " pod="openshift-marketplace/certified-operators-xgtj8" Dec 02 23:55:20 crc kubenswrapper[4903]: I1202 23:55:20.235372 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/07de4ce7-1170-40d2-a107-6d03e155d295-utilities\") pod \"certified-operators-xgtj8\" (UID: \"07de4ce7-1170-40d2-a107-6d03e155d295\") " pod="openshift-marketplace/certified-operators-xgtj8" Dec 02 23:55:20 crc kubenswrapper[4903]: I1202 23:55:20.261929 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j6sq\" (UniqueName: \"kubernetes.io/projected/07de4ce7-1170-40d2-a107-6d03e155d295-kube-api-access-7j6sq\") pod \"certified-operators-xgtj8\" (UID: \"07de4ce7-1170-40d2-a107-6d03e155d295\") " pod="openshift-marketplace/certified-operators-xgtj8" Dec 02 23:55:20 crc kubenswrapper[4903]: I1202 23:55:20.358842 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xgtj8" Dec 02 23:55:20 crc kubenswrapper[4903]: I1202 23:55:20.894748 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xgtj8"] Dec 02 23:55:20 crc kubenswrapper[4903]: W1202 23:55:20.898869 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07de4ce7_1170_40d2_a107_6d03e155d295.slice/crio-bca5a0fff6ac8b3d1283c14127956a79dc79d2ffaab55cb7601fa6b68acdbcc1 WatchSource:0}: Error finding container bca5a0fff6ac8b3d1283c14127956a79dc79d2ffaab55cb7601fa6b68acdbcc1: Status 404 returned error can't find the container with id bca5a0fff6ac8b3d1283c14127956a79dc79d2ffaab55cb7601fa6b68acdbcc1 Dec 02 23:55:21 crc kubenswrapper[4903]: I1202 23:55:21.168808 4903 generic.go:334] "Generic (PLEG): container finished" podID="07de4ce7-1170-40d2-a107-6d03e155d295" containerID="0f939b85b0be0bfff1130c9b1094645c83388cc8f20e273b1791e5a9a9a178c5" exitCode=0 Dec 02 23:55:21 crc kubenswrapper[4903]: I1202 23:55:21.169011 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xgtj8" event={"ID":"07de4ce7-1170-40d2-a107-6d03e155d295","Type":"ContainerDied","Data":"0f939b85b0be0bfff1130c9b1094645c83388cc8f20e273b1791e5a9a9a178c5"} Dec 02 23:55:21 crc kubenswrapper[4903]: I1202 23:55:21.169570 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xgtj8" event={"ID":"07de4ce7-1170-40d2-a107-6d03e155d295","Type":"ContainerStarted","Data":"bca5a0fff6ac8b3d1283c14127956a79dc79d2ffaab55cb7601fa6b68acdbcc1"} Dec 02 23:55:21 crc kubenswrapper[4903]: I1202 23:55:21.175397 4903 generic.go:334] "Generic (PLEG): container finished" podID="0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56" containerID="0e7f17f0a189f26dc38af196506e564cbb97d3b162ebbf2da523adf064f11f74" exitCode=0 Dec 02 23:55:21 crc kubenswrapper[4903]: I1202 23:55:21.175424 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-97z8x" event={"ID":"0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56","Type":"ContainerDied","Data":"0e7f17f0a189f26dc38af196506e564cbb97d3b162ebbf2da523adf064f11f74"} Dec 02 23:55:22 crc kubenswrapper[4903]: I1202 23:55:22.188112 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-97z8x" event={"ID":"0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56","Type":"ContainerStarted","Data":"83b90108535ecfc72b0c10b9abce116625bed9062504eaa8548768ac366f88e0"} Dec 02 23:55:22 crc kubenswrapper[4903]: I1202 23:55:22.229851 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-97z8x" podStartSLOduration=2.771760629 
podStartE2EDuration="5.229827514s" podCreationTimestamp="2025-12-02 23:55:17 +0000 UTC" firstStartedPulling="2025-12-02 23:55:19.151365458 +0000 UTC m=+3457.859919781" lastFinishedPulling="2025-12-02 23:55:21.609432373 +0000 UTC m=+3460.317986666" observedRunningTime="2025-12-02 23:55:22.223249905 +0000 UTC m=+3460.931804198" watchObservedRunningTime="2025-12-02 23:55:22.229827514 +0000 UTC m=+3460.938381817" Dec 02 23:55:23 crc kubenswrapper[4903]: I1202 23:55:23.200536 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xgtj8" event={"ID":"07de4ce7-1170-40d2-a107-6d03e155d295","Type":"ContainerStarted","Data":"dccb798a1b10fe04c1e447b1433585cc3ce8a82654e0f5ae787e374d2f957e2f"} Dec 02 23:55:23 crc kubenswrapper[4903]: I1202 23:55:23.319368 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5qxmr" Dec 02 23:55:23 crc kubenswrapper[4903]: I1202 23:55:23.380739 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5qxmr" Dec 02 23:55:24 crc kubenswrapper[4903]: I1202 23:55:24.211413 4903 generic.go:334] "Generic (PLEG): container finished" podID="07de4ce7-1170-40d2-a107-6d03e155d295" containerID="dccb798a1b10fe04c1e447b1433585cc3ce8a82654e0f5ae787e374d2f957e2f" exitCode=0 Dec 02 23:55:24 crc kubenswrapper[4903]: I1202 23:55:24.211554 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xgtj8" event={"ID":"07de4ce7-1170-40d2-a107-6d03e155d295","Type":"ContainerDied","Data":"dccb798a1b10fe04c1e447b1433585cc3ce8a82654e0f5ae787e374d2f957e2f"} Dec 02 23:55:25 crc kubenswrapper[4903]: I1202 23:55:25.231586 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xgtj8" event={"ID":"07de4ce7-1170-40d2-a107-6d03e155d295","Type":"ContainerStarted","Data":"2492dd2e1c3b1d4bdc7ce58f9ac3923496f5d3463d8fc1ae7ddaa4604d256f72"} Dec 02 23:55:25 crc kubenswrapper[4903]: I1202 23:55:25.260309 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xgtj8" podStartSLOduration=1.645235552 podStartE2EDuration="5.260292367s" podCreationTimestamp="2025-12-02 23:55:20 +0000 UTC" firstStartedPulling="2025-12-02 23:55:21.173071394 +0000 UTC m=+3459.881625677" lastFinishedPulling="2025-12-02 23:55:24.788128199 +0000 UTC m=+3463.496682492" observedRunningTime="2025-12-02 23:55:25.257817987 +0000 UTC m=+3463.966372290" watchObservedRunningTime="2025-12-02 23:55:25.260292367 +0000 UTC m=+3463.968846660" Dec 02 23:55:26 crc kubenswrapper[4903]: I1202 23:55:26.620496 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5qxmr"] Dec 02 23:55:26 crc kubenswrapper[4903]: I1202 23:55:26.621467 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5qxmr" podUID="2b66c64a-c140-4dfb-b737-4f940f8ecf17" containerName="registry-server" containerID="cri-o://59755dda2d8f48026c3609bd501ec61229900dbc5527ed1aaeb2b332b8a27762" gracePeriod=2 Dec 02 23:55:27 crc kubenswrapper[4903]: I1202 23:55:27.126366 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5qxmr" Dec 02 23:55:27 crc kubenswrapper[4903]: I1202 23:55:27.191072 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b66c64a-c140-4dfb-b737-4f940f8ecf17-catalog-content\") pod \"2b66c64a-c140-4dfb-b737-4f940f8ecf17\" (UID: \"2b66c64a-c140-4dfb-b737-4f940f8ecf17\") " Dec 02 23:55:27 crc kubenswrapper[4903]: I1202 23:55:27.191364 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b66c64a-c140-4dfb-b737-4f940f8ecf17-utilities\") pod \"2b66c64a-c140-4dfb-b737-4f940f8ecf17\" (UID: \"2b66c64a-c140-4dfb-b737-4f940f8ecf17\") " Dec 02 23:55:27 crc kubenswrapper[4903]: I1202 23:55:27.191426 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpwxx\" (UniqueName: \"kubernetes.io/projected/2b66c64a-c140-4dfb-b737-4f940f8ecf17-kube-api-access-lpwxx\") pod \"2b66c64a-c140-4dfb-b737-4f940f8ecf17\" (UID: \"2b66c64a-c140-4dfb-b737-4f940f8ecf17\") " Dec 02 23:55:27 crc kubenswrapper[4903]: I1202 23:55:27.192147 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b66c64a-c140-4dfb-b737-4f940f8ecf17-utilities" (OuterVolumeSpecName: "utilities") pod "2b66c64a-c140-4dfb-b737-4f940f8ecf17" (UID: "2b66c64a-c140-4dfb-b737-4f940f8ecf17"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:55:27 crc kubenswrapper[4903]: I1202 23:55:27.200973 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b66c64a-c140-4dfb-b737-4f940f8ecf17-kube-api-access-lpwxx" (OuterVolumeSpecName: "kube-api-access-lpwxx") pod "2b66c64a-c140-4dfb-b737-4f940f8ecf17" (UID: "2b66c64a-c140-4dfb-b737-4f940f8ecf17"). InnerVolumeSpecName "kube-api-access-lpwxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:55:27 crc kubenswrapper[4903]: I1202 23:55:27.261409 4903 generic.go:334] "Generic (PLEG): container finished" podID="2b66c64a-c140-4dfb-b737-4f940f8ecf17" containerID="59755dda2d8f48026c3609bd501ec61229900dbc5527ed1aaeb2b332b8a27762" exitCode=0 Dec 02 23:55:27 crc kubenswrapper[4903]: I1202 23:55:27.261449 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qxmr" event={"ID":"2b66c64a-c140-4dfb-b737-4f940f8ecf17","Type":"ContainerDied","Data":"59755dda2d8f48026c3609bd501ec61229900dbc5527ed1aaeb2b332b8a27762"} Dec 02 23:55:27 crc kubenswrapper[4903]: I1202 23:55:27.261496 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qxmr" event={"ID":"2b66c64a-c140-4dfb-b737-4f940f8ecf17","Type":"ContainerDied","Data":"7bdf005fc8a03c6b228963995f3f2baab07bece36e1825838aa6ec4ce75a0253"} Dec 02 23:55:27 crc kubenswrapper[4903]: I1202 23:55:27.261516 4903 scope.go:117] "RemoveContainer" containerID="59755dda2d8f48026c3609bd501ec61229900dbc5527ed1aaeb2b332b8a27762" Dec 02 23:55:27 crc kubenswrapper[4903]: I1202 23:55:27.261517 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5qxmr" Dec 02 23:55:27 crc kubenswrapper[4903]: I1202 23:55:27.284337 4903 scope.go:117] "RemoveContainer" containerID="46df250d4ae329ff96562ed0e7806116baa75d800f61e551ec3e7b7261925211" Dec 02 23:55:27 crc kubenswrapper[4903]: I1202 23:55:27.293990 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b66c64a-c140-4dfb-b737-4f940f8ecf17-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:55:27 crc kubenswrapper[4903]: I1202 23:55:27.294020 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpwxx\" (UniqueName: \"kubernetes.io/projected/2b66c64a-c140-4dfb-b737-4f940f8ecf17-kube-api-access-lpwxx\") on node \"crc\" DevicePath \"\"" Dec 02 23:55:27 crc kubenswrapper[4903]: I1202 23:55:27.324808 4903 scope.go:117] "RemoveContainer" containerID="dce34b6fdadd51b63a67b2ab0cc98d4c5a0480411f9999f7e3a3e9535bd58b12" Dec 02 23:55:27 crc kubenswrapper[4903]: I1202 23:55:27.341613 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b66c64a-c140-4dfb-b737-4f940f8ecf17-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b66c64a-c140-4dfb-b737-4f940f8ecf17" (UID: "2b66c64a-c140-4dfb-b737-4f940f8ecf17"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:55:27 crc kubenswrapper[4903]: I1202 23:55:27.370315 4903 scope.go:117] "RemoveContainer" containerID="59755dda2d8f48026c3609bd501ec61229900dbc5527ed1aaeb2b332b8a27762" Dec 02 23:55:27 crc kubenswrapper[4903]: E1202 23:55:27.370712 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59755dda2d8f48026c3609bd501ec61229900dbc5527ed1aaeb2b332b8a27762\": container with ID starting with 59755dda2d8f48026c3609bd501ec61229900dbc5527ed1aaeb2b332b8a27762 not found: ID does not exist" containerID="59755dda2d8f48026c3609bd501ec61229900dbc5527ed1aaeb2b332b8a27762" Dec 02 23:55:27 crc kubenswrapper[4903]: I1202 23:55:27.370759 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59755dda2d8f48026c3609bd501ec61229900dbc5527ed1aaeb2b332b8a27762"} err="failed to get container status \"59755dda2d8f48026c3609bd501ec61229900dbc5527ed1aaeb2b332b8a27762\": rpc error: code = NotFound desc = could not find container \"59755dda2d8f48026c3609bd501ec61229900dbc5527ed1aaeb2b332b8a27762\": container with ID starting with 59755dda2d8f48026c3609bd501ec61229900dbc5527ed1aaeb2b332b8a27762 not found: ID does not exist" Dec 02 23:55:27 crc kubenswrapper[4903]: I1202 23:55:27.370788 4903 scope.go:117] "RemoveContainer" containerID="46df250d4ae329ff96562ed0e7806116baa75d800f61e551ec3e7b7261925211" Dec 02 23:55:27 crc kubenswrapper[4903]: E1202 23:55:27.371203 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46df250d4ae329ff96562ed0e7806116baa75d800f61e551ec3e7b7261925211\": container with ID starting with 46df250d4ae329ff96562ed0e7806116baa75d800f61e551ec3e7b7261925211 not found: ID does not exist" containerID="46df250d4ae329ff96562ed0e7806116baa75d800f61e551ec3e7b7261925211" Dec 02 23:55:27 crc kubenswrapper[4903]: I1202 23:55:27.371235 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46df250d4ae329ff96562ed0e7806116baa75d800f61e551ec3e7b7261925211"} err="failed to get 
container status \"46df250d4ae329ff96562ed0e7806116baa75d800f61e551ec3e7b7261925211\": rpc error: code = NotFound desc = could not find container \"46df250d4ae329ff96562ed0e7806116baa75d800f61e551ec3e7b7261925211\": container with ID starting with 46df250d4ae329ff96562ed0e7806116baa75d800f61e551ec3e7b7261925211 not found: ID does not exist" Dec 02 23:55:27 crc kubenswrapper[4903]: I1202 23:55:27.371267 4903 scope.go:117] "RemoveContainer" containerID="dce34b6fdadd51b63a67b2ab0cc98d4c5a0480411f9999f7e3a3e9535bd58b12" Dec 02 23:55:27 crc kubenswrapper[4903]: E1202 23:55:27.371559 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dce34b6fdadd51b63a67b2ab0cc98d4c5a0480411f9999f7e3a3e9535bd58b12\": container with ID starting with dce34b6fdadd51b63a67b2ab0cc98d4c5a0480411f9999f7e3a3e9535bd58b12 not found: ID does not exist" containerID="dce34b6fdadd51b63a67b2ab0cc98d4c5a0480411f9999f7e3a3e9535bd58b12" Dec 02 23:55:27 crc kubenswrapper[4903]: I1202 23:55:27.371604 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dce34b6fdadd51b63a67b2ab0cc98d4c5a0480411f9999f7e3a3e9535bd58b12"} err="failed to get container status \"dce34b6fdadd51b63a67b2ab0cc98d4c5a0480411f9999f7e3a3e9535bd58b12\": rpc error: code = NotFound desc = could not find container \"dce34b6fdadd51b63a67b2ab0cc98d4c5a0480411f9999f7e3a3e9535bd58b12\": container with ID starting with dce34b6fdadd51b63a67b2ab0cc98d4c5a0480411f9999f7e3a3e9535bd58b12 not found: ID does not exist" Dec 02 23:55:27 crc kubenswrapper[4903]: I1202 23:55:27.395722 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b66c64a-c140-4dfb-b737-4f940f8ecf17-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:55:27 crc kubenswrapper[4903]: I1202 23:55:27.583798 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-97z8x" Dec 02 23:55:27 crc kubenswrapper[4903]: I1202 23:55:27.584063 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-97z8x" Dec 02 23:55:27 crc kubenswrapper[4903]: I1202 23:55:27.604820 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5qxmr"] Dec 02 23:55:27 crc kubenswrapper[4903]: I1202 23:55:27.630521 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5qxmr"] Dec 02 23:55:27 crc kubenswrapper[4903]: I1202 23:55:27.679572 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-97z8x" Dec 02 23:55:28 crc kubenswrapper[4903]: I1202 23:55:28.327924 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-97z8x" Dec 02 23:55:29 crc kubenswrapper[4903]: I1202 23:55:29.637298 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b66c64a-c140-4dfb-b737-4f940f8ecf17" path="/var/lib/kubelet/pods/2b66c64a-c140-4dfb-b737-4f940f8ecf17/volumes" Dec 02 23:55:30 crc kubenswrapper[4903]: I1202 23:55:30.359240 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xgtj8" Dec 02 23:55:30 crc kubenswrapper[4903]: I1202 23:55:30.359295 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-xgtj8" Dec 02 23:55:30 crc kubenswrapper[4903]: I1202 23:55:30.411442 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xgtj8" Dec 02 23:55:30 crc kubenswrapper[4903]: I1202 23:55:30.829544 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-97z8x"] Dec 02 23:55:30 crc kubenswrapper[4903]: I1202 23:55:30.829937 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-97z8x" podUID="0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56" containerName="registry-server" containerID="cri-o://83b90108535ecfc72b0c10b9abce116625bed9062504eaa8548768ac366f88e0" gracePeriod=2 Dec 02 23:55:32 crc kubenswrapper[4903]: I1202 23:55:32.111444 4903 generic.go:334] "Generic (PLEG): container finished" podID="0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56" containerID="83b90108535ecfc72b0c10b9abce116625bed9062504eaa8548768ac366f88e0" exitCode=0 Dec 02 23:55:32 crc kubenswrapper[4903]: I1202 23:55:32.112722 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-97z8x" event={"ID":"0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56","Type":"ContainerDied","Data":"83b90108535ecfc72b0c10b9abce116625bed9062504eaa8548768ac366f88e0"} Dec 02 23:55:32 crc kubenswrapper[4903]: I1202 23:55:32.169930 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xgtj8" Dec 02 23:55:32 crc kubenswrapper[4903]: I1202 23:55:32.342710 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-97z8x" Dec 02 23:55:32 crc kubenswrapper[4903]: I1202 23:55:32.511044 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56-catalog-content\") pod \"0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56\" (UID: \"0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56\") " Dec 02 23:55:32 crc kubenswrapper[4903]: I1202 23:55:32.511104 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trht6\" (UniqueName: \"kubernetes.io/projected/0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56-kube-api-access-trht6\") pod \"0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56\" (UID: \"0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56\") " Dec 02 23:55:32 crc kubenswrapper[4903]: I1202 23:55:32.511352 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56-utilities\") pod \"0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56\" (UID: \"0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56\") " Dec 02 23:55:32 crc kubenswrapper[4903]: I1202 23:55:32.511941 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56-utilities" (OuterVolumeSpecName: "utilities") pod "0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56" (UID: "0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:55:32 crc kubenswrapper[4903]: I1202 23:55:32.512498 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:55:32 crc kubenswrapper[4903]: I1202 23:55:32.517286 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56-kube-api-access-trht6" (OuterVolumeSpecName: "kube-api-access-trht6") pod "0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56" (UID: "0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56"). InnerVolumeSpecName "kube-api-access-trht6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:55:32 crc kubenswrapper[4903]: I1202 23:55:32.527138 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56" (UID: "0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:55:32 crc kubenswrapper[4903]: I1202 23:55:32.614863 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:55:32 crc kubenswrapper[4903]: I1202 23:55:32.615187 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trht6\" (UniqueName: \"kubernetes.io/projected/0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56-kube-api-access-trht6\") on node \"crc\" DevicePath \"\"" Dec 02 23:55:33 crc kubenswrapper[4903]: I1202 23:55:33.020144 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xgtj8"] Dec 02 23:55:33 crc kubenswrapper[4903]: I1202 23:55:33.125387 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-97z8x" event={"ID":"0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56","Type":"ContainerDied","Data":"0c8edb8b5eb5a12af66a37ca9663a13f4a64053af731b28eae73d37a4063084d"} Dec 02 23:55:33 crc kubenswrapper[4903]: I1202 23:55:33.125452 4903 scope.go:117] "RemoveContainer" containerID="83b90108535ecfc72b0c10b9abce116625bed9062504eaa8548768ac366f88e0" Dec 02 23:55:33 crc kubenswrapper[4903]: I1202 23:55:33.125448 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-97z8x" Dec 02 23:55:33 crc kubenswrapper[4903]: I1202 23:55:33.162281 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-97z8x"] Dec 02 23:55:33 crc kubenswrapper[4903]: I1202 23:55:33.164292 4903 scope.go:117] "RemoveContainer" containerID="0e7f17f0a189f26dc38af196506e564cbb97d3b162ebbf2da523adf064f11f74" Dec 02 23:55:33 crc kubenswrapper[4903]: I1202 23:55:33.170874 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-97z8x"] Dec 02 23:55:33 crc kubenswrapper[4903]: I1202 23:55:33.191715 4903 scope.go:117] "RemoveContainer" containerID="632506155f5116a767d2d57ecc110d56b3c8bb552cf360d7ea8feaf06bc1fdcc" Dec 02 23:55:33 crc kubenswrapper[4903]: I1202 23:55:33.632546 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56" path="/var/lib/kubelet/pods/0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56/volumes" Dec 02 23:55:34 crc kubenswrapper[4903]: I1202 23:55:34.141715 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xgtj8" podUID="07de4ce7-1170-40d2-a107-6d03e155d295" containerName="registry-server" containerID="cri-o://2492dd2e1c3b1d4bdc7ce58f9ac3923496f5d3463d8fc1ae7ddaa4604d256f72" gracePeriod=2 Dec 02 23:55:34 crc kubenswrapper[4903]: I1202 23:55:34.684956 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xgtj8" Dec 02 23:55:34 crc kubenswrapper[4903]: I1202 23:55:34.763102 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07de4ce7-1170-40d2-a107-6d03e155d295-utilities\") pod \"07de4ce7-1170-40d2-a107-6d03e155d295\" (UID: \"07de4ce7-1170-40d2-a107-6d03e155d295\") " Dec 02 23:55:34 crc kubenswrapper[4903]: I1202 23:55:34.763227 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7j6sq\" (UniqueName: \"kubernetes.io/projected/07de4ce7-1170-40d2-a107-6d03e155d295-kube-api-access-7j6sq\") pod \"07de4ce7-1170-40d2-a107-6d03e155d295\" (UID: \"07de4ce7-1170-40d2-a107-6d03e155d295\") " Dec 02 23:55:34 crc kubenswrapper[4903]: I1202 23:55:34.763295 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07de4ce7-1170-40d2-a107-6d03e155d295-catalog-content\") pod \"07de4ce7-1170-40d2-a107-6d03e155d295\" (UID: \"07de4ce7-1170-40d2-a107-6d03e155d295\") " Dec 02 23:55:34 crc kubenswrapper[4903]: I1202 23:55:34.764405 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07de4ce7-1170-40d2-a107-6d03e155d295-utilities" (OuterVolumeSpecName: "utilities") pod "07de4ce7-1170-40d2-a107-6d03e155d295" (UID: "07de4ce7-1170-40d2-a107-6d03e155d295"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:55:34 crc kubenswrapper[4903]: I1202 23:55:34.766070 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07de4ce7-1170-40d2-a107-6d03e155d295-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 23:55:34 crc kubenswrapper[4903]: I1202 23:55:34.768408 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07de4ce7-1170-40d2-a107-6d03e155d295-kube-api-access-7j6sq" (OuterVolumeSpecName: "kube-api-access-7j6sq") pod "07de4ce7-1170-40d2-a107-6d03e155d295" (UID: "07de4ce7-1170-40d2-a107-6d03e155d295"). InnerVolumeSpecName "kube-api-access-7j6sq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 23:55:34 crc kubenswrapper[4903]: I1202 23:55:34.808626 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07de4ce7-1170-40d2-a107-6d03e155d295-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "07de4ce7-1170-40d2-a107-6d03e155d295" (UID: "07de4ce7-1170-40d2-a107-6d03e155d295"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 23:55:34 crc kubenswrapper[4903]: I1202 23:55:34.867882 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7j6sq\" (UniqueName: \"kubernetes.io/projected/07de4ce7-1170-40d2-a107-6d03e155d295-kube-api-access-7j6sq\") on node \"crc\" DevicePath \"\"" Dec 02 23:55:34 crc kubenswrapper[4903]: I1202 23:55:34.867915 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07de4ce7-1170-40d2-a107-6d03e155d295-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 23:55:35 crc kubenswrapper[4903]: I1202 23:55:35.159141 4903 generic.go:334] "Generic (PLEG): container finished" podID="07de4ce7-1170-40d2-a107-6d03e155d295" containerID="2492dd2e1c3b1d4bdc7ce58f9ac3923496f5d3463d8fc1ae7ddaa4604d256f72" exitCode=0 Dec 02 23:55:35 crc kubenswrapper[4903]: I1202 23:55:35.159207 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xgtj8" event={"ID":"07de4ce7-1170-40d2-a107-6d03e155d295","Type":"ContainerDied","Data":"2492dd2e1c3b1d4bdc7ce58f9ac3923496f5d3463d8fc1ae7ddaa4604d256f72"} Dec 02 23:55:35 crc kubenswrapper[4903]: I1202 23:55:35.159259 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xgtj8" Dec 02 23:55:35 crc kubenswrapper[4903]: I1202 23:55:35.159298 4903 scope.go:117] "RemoveContainer" containerID="2492dd2e1c3b1d4bdc7ce58f9ac3923496f5d3463d8fc1ae7ddaa4604d256f72" Dec 02 23:55:35 crc kubenswrapper[4903]: I1202 23:55:35.159272 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xgtj8" event={"ID":"07de4ce7-1170-40d2-a107-6d03e155d295","Type":"ContainerDied","Data":"bca5a0fff6ac8b3d1283c14127956a79dc79d2ffaab55cb7601fa6b68acdbcc1"} Dec 02 23:55:35 crc kubenswrapper[4903]: I1202 23:55:35.191242 4903 scope.go:117] "RemoveContainer" containerID="dccb798a1b10fe04c1e447b1433585cc3ce8a82654e0f5ae787e374d2f957e2f" Dec 02 23:55:35 crc kubenswrapper[4903]: I1202 23:55:35.229217 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xgtj8"] Dec 02 23:55:35 crc kubenswrapper[4903]: I1202 23:55:35.235127 4903 scope.go:117] "RemoveContainer" containerID="0f939b85b0be0bfff1130c9b1094645c83388cc8f20e273b1791e5a9a9a178c5" Dec 02 23:55:35 crc kubenswrapper[4903]: I1202 23:55:35.245476 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xgtj8"] Dec 02 23:55:35 crc kubenswrapper[4903]: I1202 23:55:35.303072 4903 scope.go:117] "RemoveContainer" containerID="2492dd2e1c3b1d4bdc7ce58f9ac3923496f5d3463d8fc1ae7ddaa4604d256f72" Dec 02 23:55:35 crc kubenswrapper[4903]: E1202 23:55:35.303588 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2492dd2e1c3b1d4bdc7ce58f9ac3923496f5d3463d8fc1ae7ddaa4604d256f72\": container with ID starting with 2492dd2e1c3b1d4bdc7ce58f9ac3923496f5d3463d8fc1ae7ddaa4604d256f72 not found: ID does not exist" containerID="2492dd2e1c3b1d4bdc7ce58f9ac3923496f5d3463d8fc1ae7ddaa4604d256f72" Dec 02 23:55:35 crc kubenswrapper[4903]: I1202 23:55:35.303679 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2492dd2e1c3b1d4bdc7ce58f9ac3923496f5d3463d8fc1ae7ddaa4604d256f72"} err="failed to get container status \"2492dd2e1c3b1d4bdc7ce58f9ac3923496f5d3463d8fc1ae7ddaa4604d256f72\": rpc error: code = NotFound desc = could not find container \"2492dd2e1c3b1d4bdc7ce58f9ac3923496f5d3463d8fc1ae7ddaa4604d256f72\": container with ID starting with 2492dd2e1c3b1d4bdc7ce58f9ac3923496f5d3463d8fc1ae7ddaa4604d256f72 not found: ID does not exist" Dec 02 23:55:35 crc kubenswrapper[4903]: I1202 23:55:35.303724 4903 scope.go:117] "RemoveContainer" containerID="dccb798a1b10fe04c1e447b1433585cc3ce8a82654e0f5ae787e374d2f957e2f" Dec 02 23:55:35 crc kubenswrapper[4903]: E1202 23:55:35.304388 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dccb798a1b10fe04c1e447b1433585cc3ce8a82654e0f5ae787e374d2f957e2f\": container with ID starting with dccb798a1b10fe04c1e447b1433585cc3ce8a82654e0f5ae787e374d2f957e2f not found: ID does not exist" containerID="dccb798a1b10fe04c1e447b1433585cc3ce8a82654e0f5ae787e374d2f957e2f" Dec 02 23:55:35 crc kubenswrapper[4903]: I1202 23:55:35.304591 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dccb798a1b10fe04c1e447b1433585cc3ce8a82654e0f5ae787e374d2f957e2f"} err="failed to get container status \"dccb798a1b10fe04c1e447b1433585cc3ce8a82654e0f5ae787e374d2f957e2f\": rpc error: code = NotFound desc = could not find 
container \"dccb798a1b10fe04c1e447b1433585cc3ce8a82654e0f5ae787e374d2f957e2f\": container with ID starting with dccb798a1b10fe04c1e447b1433585cc3ce8a82654e0f5ae787e374d2f957e2f not found: ID does not exist" Dec 02 23:55:35 crc kubenswrapper[4903]: I1202 23:55:35.304777 4903 scope.go:117] "RemoveContainer" containerID="0f939b85b0be0bfff1130c9b1094645c83388cc8f20e273b1791e5a9a9a178c5" Dec 02 23:55:35 crc kubenswrapper[4903]: E1202 23:55:35.305319 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f939b85b0be0bfff1130c9b1094645c83388cc8f20e273b1791e5a9a9a178c5\": container with ID starting with 0f939b85b0be0bfff1130c9b1094645c83388cc8f20e273b1791e5a9a9a178c5 not found: ID does not exist" containerID="0f939b85b0be0bfff1130c9b1094645c83388cc8f20e273b1791e5a9a9a178c5" Dec 02 23:55:35 crc kubenswrapper[4903]: I1202 23:55:35.305460 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f939b85b0be0bfff1130c9b1094645c83388cc8f20e273b1791e5a9a9a178c5"} err="failed to get container status \"0f939b85b0be0bfff1130c9b1094645c83388cc8f20e273b1791e5a9a9a178c5\": rpc error: code = NotFound desc = could not find container \"0f939b85b0be0bfff1130c9b1094645c83388cc8f20e273b1791e5a9a9a178c5\": container with ID starting with 0f939b85b0be0bfff1130c9b1094645c83388cc8f20e273b1791e5a9a9a178c5 not found: ID does not exist" Dec 02 23:55:35 crc kubenswrapper[4903]: I1202 23:55:35.631703 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07de4ce7-1170-40d2-a107-6d03e155d295" path="/var/lib/kubelet/pods/07de4ce7-1170-40d2-a107-6d03e155d295/volumes" Dec 02 23:56:34 crc kubenswrapper[4903]: E1202 23:56:34.089900 4903 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.39:47402->38.102.83.39:43931: write tcp 38.102.83.39:47402->38.102.83.39:43931: write: connection reset by peer Dec 02 23:56:53 crc kubenswrapper[4903]: I1202 23:56:53.069476 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:56:53 crc kubenswrapper[4903]: I1202 23:56:53.070149 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:57:23 crc kubenswrapper[4903]: I1202 23:57:23.069616 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 23:57:23 crc kubenswrapper[4903]: I1202 23:57:23.070859 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 23:57:53 crc kubenswrapper[4903]: I1202 23:57:53.070039 4903 patch_prober.go:28] 
Dec 02 23:57:53 crc kubenswrapper[4903]: I1202 23:57:53.070039 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 23:57:53 crc kubenswrapper[4903]: I1202 23:57:53.070640 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 23:57:53 crc kubenswrapper[4903]: I1202 23:57:53.070749 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-snl4q"
Dec 02 23:57:53 crc kubenswrapper[4903]: I1202 23:57:53.071867 4903 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9b0d8e1ad42f16d25d8ace883915db986228e905bf7242685a201d23108ad65b"} pod="openshift-machine-config-operator/machine-config-daemon-snl4q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 02 23:57:53 crc kubenswrapper[4903]: I1202 23:57:53.071961 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" containerID="cri-o://9b0d8e1ad42f16d25d8ace883915db986228e905bf7242685a201d23108ad65b" gracePeriod=600
Dec 02 23:57:53 crc kubenswrapper[4903]: E1202 23:57:53.201245 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f"
Dec 02 23:57:53 crc kubenswrapper[4903]: I1202 23:57:53.866520 4903 generic.go:334] "Generic (PLEG): container finished" podID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerID="9b0d8e1ad42f16d25d8ace883915db986228e905bf7242685a201d23108ad65b" exitCode=0
Dec 02 23:57:53 crc kubenswrapper[4903]: I1202 23:57:53.866605 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerDied","Data":"9b0d8e1ad42f16d25d8ace883915db986228e905bf7242685a201d23108ad65b"}
Dec 02 23:57:53 crc kubenswrapper[4903]: I1202 23:57:53.867196 4903 scope.go:117] "RemoveContainer" containerID="a41c8c681fb2e583e9d4b67028fb1c72207cb1da52192e4d6d4a326f52eb2028"
Dec 02 23:57:53 crc kubenswrapper[4903]: I1202 23:57:53.867864 4903 scope.go:117] "RemoveContainer" containerID="9b0d8e1ad42f16d25d8ace883915db986228e905bf7242685a201d23108ad65b"
Dec 02 23:57:53 crc kubenswrapper[4903]: E1202 23:57:53.868745 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f"
Dec 02 23:58:05 crc kubenswrapper[4903]: I1202 23:58:05.613138 4903 scope.go:117] "RemoveContainer" containerID="9b0d8e1ad42f16d25d8ace883915db986228e905bf7242685a201d23108ad65b"
Dec 02 23:58:05 crc kubenswrapper[4903]: E1202 23:58:05.613951 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f"
Dec 02 23:58:17 crc kubenswrapper[4903]: I1202 23:58:17.613534 4903 scope.go:117] "RemoveContainer" containerID="9b0d8e1ad42f16d25d8ace883915db986228e905bf7242685a201d23108ad65b"
Dec 02 23:58:17 crc kubenswrapper[4903]: E1202 23:58:17.614729 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f"
Dec 02 23:58:31 crc kubenswrapper[4903]: I1202 23:58:31.620687 4903 scope.go:117] "RemoveContainer" containerID="9b0d8e1ad42f16d25d8ace883915db986228e905bf7242685a201d23108ad65b"
Dec 02 23:58:31 crc kubenswrapper[4903]: E1202 23:58:31.621610 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f"
Dec 02 23:58:44 crc kubenswrapper[4903]: I1202 23:58:44.613397 4903 scope.go:117] "RemoveContainer" containerID="9b0d8e1ad42f16d25d8ace883915db986228e905bf7242685a201d23108ad65b"
Dec 02 23:58:44 crc kubenswrapper[4903]: E1202 23:58:44.614263 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f"
Dec 02 23:58:57 crc kubenswrapper[4903]: I1202 23:58:57.612424 4903 scope.go:117] "RemoveContainer" containerID="9b0d8e1ad42f16d25d8ace883915db986228e905bf7242685a201d23108ad65b"
Dec 02 23:58:57 crc kubenswrapper[4903]: E1202 23:58:57.613200 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f"
Dec 02 23:59:10 crc kubenswrapper[4903]: I1202 23:59:10.613072 4903 scope.go:117] "RemoveContainer" containerID="9b0d8e1ad42f16d25d8ace883915db986228e905bf7242685a201d23108ad65b"
Dec 02 23:59:10 crc kubenswrapper[4903]: E1202 23:59:10.613999 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f"
Dec 02 23:59:21 crc kubenswrapper[4903]: I1202 23:59:21.632609 4903 scope.go:117] "RemoveContainer" containerID="9b0d8e1ad42f16d25d8ace883915db986228e905bf7242685a201d23108ad65b"
Dec 02 23:59:21 crc kubenswrapper[4903]: E1202 23:59:21.633798 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f"
Dec 02 23:59:36 crc kubenswrapper[4903]: I1202 23:59:36.612533 4903 scope.go:117] "RemoveContainer" containerID="9b0d8e1ad42f16d25d8ace883915db986228e905bf7242685a201d23108ad65b"
Dec 02 23:59:36 crc kubenswrapper[4903]: E1202 23:59:36.613606 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f"
Dec 02 23:59:50 crc kubenswrapper[4903]: I1202 23:59:50.612645 4903 scope.go:117] "RemoveContainer" containerID="9b0d8e1ad42f16d25d8ace883915db986228e905bf7242685a201d23108ad65b"
Dec 02 23:59:50 crc kubenswrapper[4903]: E1202 23:59:50.614463 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f"
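The kill at 23:57:53 is the liveness-probe restart; the repeated "back-off 5m0s" errors that follow are sync retries while the container sits in CrashLoopBackOff. The quoted 5m0s is the backoff ceiling: the delay doubles per restart from a 10s base up to a 5m cap (the kubelet defaults; assumed here rather than read from this node's configuration). A sketch of that schedule:

    package main

    import (
        "fmt"
        "time"
    )

    // crashLoopDelays models the kubelet-style backoff behind "back-off 5m0s":
    // a 10s initial delay that doubles per restart and saturates at 5m.
    func crashLoopDelays(restarts int) []time.Duration {
        const base, maxDelay = 10 * time.Second, 5 * time.Minute
        delays := make([]time.Duration, 0, restarts)
        d := base
        for i := 0; i < restarts; i++ {
            delays = append(delays, d)
            d *= 2
            if d > maxDelay {
                d = maxDelay
            }
        }
        return delays
    }

    func main() {
        // After a handful of failed restarts the delay pins at the 5m cap,
        // matching the message logged for machine-config-daemon-snl4q.
        fmt.Println(crashLoopDelays(7)) // [10s 20s 40s 1m20s 2m40s 5m0s 5m0s]
    }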
podUID="07de4ce7-1170-40d2-a107-6d03e155d295" containerName="extract-content" Dec 03 00:00:00 crc kubenswrapper[4903]: E1203 00:00:00.186316 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b66c64a-c140-4dfb-b737-4f940f8ecf17" containerName="extract-content" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.186324 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b66c64a-c140-4dfb-b737-4f940f8ecf17" containerName="extract-content" Dec 03 00:00:00 crc kubenswrapper[4903]: E1203 00:00:00.186336 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56" containerName="extract-utilities" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.186345 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56" containerName="extract-utilities" Dec 03 00:00:00 crc kubenswrapper[4903]: E1203 00:00:00.186355 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07de4ce7-1170-40d2-a107-6d03e155d295" containerName="extract-utilities" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.186363 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="07de4ce7-1170-40d2-a107-6d03e155d295" containerName="extract-utilities" Dec 03 00:00:00 crc kubenswrapper[4903]: E1203 00:00:00.186374 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07de4ce7-1170-40d2-a107-6d03e155d295" containerName="registry-server" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.186380 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="07de4ce7-1170-40d2-a107-6d03e155d295" containerName="registry-server" Dec 03 00:00:00 crc kubenswrapper[4903]: E1203 00:00:00.186395 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56" containerName="registry-server" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.186403 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56" containerName="registry-server" Dec 03 00:00:00 crc kubenswrapper[4903]: E1203 00:00:00.186419 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b66c64a-c140-4dfb-b737-4f940f8ecf17" containerName="registry-server" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.186426 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b66c64a-c140-4dfb-b737-4f940f8ecf17" containerName="registry-server" Dec 03 00:00:00 crc kubenswrapper[4903]: E1203 00:00:00.186446 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56" containerName="extract-content" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.186454 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56" containerName="extract-content" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.186703 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="07de4ce7-1170-40d2-a107-6d03e155d295" containerName="registry-server" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.186742 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b66c64a-c140-4dfb-b737-4f940f8ecf17" containerName="registry-server" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.186754 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e999fbd-3f15-4851-bfb0-5cb1cd3fbb56" containerName="registry-server" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.187555 4903 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29412000-79k4j" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.192118 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.192410 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.207587 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412000-9tfc2"] Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.209896 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412000-9tfc2" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.216263 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.217349 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.219953 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-purge-29412000-xgfnd"] Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.231117 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-purge-29412000-xgfnd" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.231578 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-purge-29412000-lw5h5"] Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.236313 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-purge-29412000-lw5h5" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.244715 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.244987 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.248352 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-purge-29412000-xgfnd"] Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.285956 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29412000-79k4j"] Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.305481 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-purge-29412000-lw5h5"] Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.323010 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412000-9tfc2"] Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.392206 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2131c673-5399-4093-92fd-c63b4ce2a8a5-scripts\") pod \"nova-cell1-db-purge-29412000-lw5h5\" (UID: \"2131c673-5399-4093-92fd-c63b4ce2a8a5\") " pod="openstack/nova-cell1-db-purge-29412000-lw5h5" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.392264 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2131c673-5399-4093-92fd-c63b4ce2a8a5-config-data\") pod \"nova-cell1-db-purge-29412000-lw5h5\" (UID: \"2131c673-5399-4093-92fd-c63b4ce2a8a5\") " pod="openstack/nova-cell1-db-purge-29412000-lw5h5" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.392300 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8499f90-daef-4c46-90ef-36aba9557136-combined-ca-bundle\") pod \"nova-cell0-db-purge-29412000-xgfnd\" (UID: \"f8499f90-daef-4c46-90ef-36aba9557136\") " pod="openstack/nova-cell0-db-purge-29412000-xgfnd" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.392323 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7cb38d1f-26ce-448f-8e8e-d694f3e98edd-config-volume\") pod \"collect-profiles-29412000-9tfc2\" (UID: \"7cb38d1f-26ce-448f-8e8e-d694f3e98edd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412000-9tfc2" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.392352 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2131c673-5399-4093-92fd-c63b4ce2a8a5-combined-ca-bundle\") pod \"nova-cell1-db-purge-29412000-lw5h5\" (UID: \"2131c673-5399-4093-92fd-c63b4ce2a8a5\") " pod="openstack/nova-cell1-db-purge-29412000-lw5h5" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.392368 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8499f90-daef-4c46-90ef-36aba9557136-scripts\") pod \"nova-cell0-db-purge-29412000-xgfnd\" (UID: 
\"f8499f90-daef-4c46-90ef-36aba9557136\") " pod="openstack/nova-cell0-db-purge-29412000-xgfnd" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.392414 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/db350f83-0e78-45b9-bf28-1fe5709e0378-serviceca\") pod \"image-pruner-29412000-79k4j\" (UID: \"db350f83-0e78-45b9-bf28-1fe5709e0378\") " pod="openshift-image-registry/image-pruner-29412000-79k4j" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.392451 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm6qv\" (UniqueName: \"kubernetes.io/projected/7cb38d1f-26ce-448f-8e8e-d694f3e98edd-kube-api-access-nm6qv\") pod \"collect-profiles-29412000-9tfc2\" (UID: \"7cb38d1f-26ce-448f-8e8e-d694f3e98edd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412000-9tfc2" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.392478 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwwmz\" (UniqueName: \"kubernetes.io/projected/db350f83-0e78-45b9-bf28-1fe5709e0378-kube-api-access-qwwmz\") pod \"image-pruner-29412000-79k4j\" (UID: \"db350f83-0e78-45b9-bf28-1fe5709e0378\") " pod="openshift-image-registry/image-pruner-29412000-79k4j" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.392496 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcz7m\" (UniqueName: \"kubernetes.io/projected/f8499f90-daef-4c46-90ef-36aba9557136-kube-api-access-hcz7m\") pod \"nova-cell0-db-purge-29412000-xgfnd\" (UID: \"f8499f90-daef-4c46-90ef-36aba9557136\") " pod="openstack/nova-cell0-db-purge-29412000-xgfnd" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.392520 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csb24\" (UniqueName: \"kubernetes.io/projected/2131c673-5399-4093-92fd-c63b4ce2a8a5-kube-api-access-csb24\") pod \"nova-cell1-db-purge-29412000-lw5h5\" (UID: \"2131c673-5399-4093-92fd-c63b4ce2a8a5\") " pod="openstack/nova-cell1-db-purge-29412000-lw5h5" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.392541 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7cb38d1f-26ce-448f-8e8e-d694f3e98edd-secret-volume\") pod \"collect-profiles-29412000-9tfc2\" (UID: \"7cb38d1f-26ce-448f-8e8e-d694f3e98edd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412000-9tfc2" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.392559 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8499f90-daef-4c46-90ef-36aba9557136-config-data\") pod \"nova-cell0-db-purge-29412000-xgfnd\" (UID: \"f8499f90-daef-4c46-90ef-36aba9557136\") " pod="openstack/nova-cell0-db-purge-29412000-xgfnd" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.494862 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2131c673-5399-4093-92fd-c63b4ce2a8a5-scripts\") pod \"nova-cell1-db-purge-29412000-lw5h5\" (UID: \"2131c673-5399-4093-92fd-c63b4ce2a8a5\") " pod="openstack/nova-cell1-db-purge-29412000-lw5h5" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 
00:00:00.494982 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2131c673-5399-4093-92fd-c63b4ce2a8a5-config-data\") pod \"nova-cell1-db-purge-29412000-lw5h5\" (UID: \"2131c673-5399-4093-92fd-c63b4ce2a8a5\") " pod="openstack/nova-cell1-db-purge-29412000-lw5h5" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.495049 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8499f90-daef-4c46-90ef-36aba9557136-combined-ca-bundle\") pod \"nova-cell0-db-purge-29412000-xgfnd\" (UID: \"f8499f90-daef-4c46-90ef-36aba9557136\") " pod="openstack/nova-cell0-db-purge-29412000-xgfnd" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.495096 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7cb38d1f-26ce-448f-8e8e-d694f3e98edd-config-volume\") pod \"collect-profiles-29412000-9tfc2\" (UID: \"7cb38d1f-26ce-448f-8e8e-d694f3e98edd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412000-9tfc2" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.495150 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2131c673-5399-4093-92fd-c63b4ce2a8a5-combined-ca-bundle\") pod \"nova-cell1-db-purge-29412000-lw5h5\" (UID: \"2131c673-5399-4093-92fd-c63b4ce2a8a5\") " pod="openstack/nova-cell1-db-purge-29412000-lw5h5" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.495184 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8499f90-daef-4c46-90ef-36aba9557136-scripts\") pod \"nova-cell0-db-purge-29412000-xgfnd\" (UID: \"f8499f90-daef-4c46-90ef-36aba9557136\") " pod="openstack/nova-cell0-db-purge-29412000-xgfnd" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.495267 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/db350f83-0e78-45b9-bf28-1fe5709e0378-serviceca\") pod \"image-pruner-29412000-79k4j\" (UID: \"db350f83-0e78-45b9-bf28-1fe5709e0378\") " pod="openshift-image-registry/image-pruner-29412000-79k4j" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.495340 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm6qv\" (UniqueName: \"kubernetes.io/projected/7cb38d1f-26ce-448f-8e8e-d694f3e98edd-kube-api-access-nm6qv\") pod \"collect-profiles-29412000-9tfc2\" (UID: \"7cb38d1f-26ce-448f-8e8e-d694f3e98edd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412000-9tfc2" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.495396 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwwmz\" (UniqueName: \"kubernetes.io/projected/db350f83-0e78-45b9-bf28-1fe5709e0378-kube-api-access-qwwmz\") pod \"image-pruner-29412000-79k4j\" (UID: \"db350f83-0e78-45b9-bf28-1fe5709e0378\") " pod="openshift-image-registry/image-pruner-29412000-79k4j" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.495438 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcz7m\" (UniqueName: \"kubernetes.io/projected/f8499f90-daef-4c46-90ef-36aba9557136-kube-api-access-hcz7m\") pod \"nova-cell0-db-purge-29412000-xgfnd\" (UID: 
\"f8499f90-daef-4c46-90ef-36aba9557136\") " pod="openstack/nova-cell0-db-purge-29412000-xgfnd" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.495478 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csb24\" (UniqueName: \"kubernetes.io/projected/2131c673-5399-4093-92fd-c63b4ce2a8a5-kube-api-access-csb24\") pod \"nova-cell1-db-purge-29412000-lw5h5\" (UID: \"2131c673-5399-4093-92fd-c63b4ce2a8a5\") " pod="openstack/nova-cell1-db-purge-29412000-lw5h5" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.495524 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7cb38d1f-26ce-448f-8e8e-d694f3e98edd-secret-volume\") pod \"collect-profiles-29412000-9tfc2\" (UID: \"7cb38d1f-26ce-448f-8e8e-d694f3e98edd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412000-9tfc2" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.495557 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8499f90-daef-4c46-90ef-36aba9557136-config-data\") pod \"nova-cell0-db-purge-29412000-xgfnd\" (UID: \"f8499f90-daef-4c46-90ef-36aba9557136\") " pod="openstack/nova-cell0-db-purge-29412000-xgfnd" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.496456 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7cb38d1f-26ce-448f-8e8e-d694f3e98edd-config-volume\") pod \"collect-profiles-29412000-9tfc2\" (UID: \"7cb38d1f-26ce-448f-8e8e-d694f3e98edd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412000-9tfc2" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.497850 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/db350f83-0e78-45b9-bf28-1fe5709e0378-serviceca\") pod \"image-pruner-29412000-79k4j\" (UID: \"db350f83-0e78-45b9-bf28-1fe5709e0378\") " pod="openshift-image-registry/image-pruner-29412000-79k4j" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.502812 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2131c673-5399-4093-92fd-c63b4ce2a8a5-config-data\") pod \"nova-cell1-db-purge-29412000-lw5h5\" (UID: \"2131c673-5399-4093-92fd-c63b4ce2a8a5\") " pod="openstack/nova-cell1-db-purge-29412000-lw5h5" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.504697 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8499f90-daef-4c46-90ef-36aba9557136-config-data\") pod \"nova-cell0-db-purge-29412000-xgfnd\" (UID: \"f8499f90-daef-4c46-90ef-36aba9557136\") " pod="openstack/nova-cell0-db-purge-29412000-xgfnd" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.509367 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2131c673-5399-4093-92fd-c63b4ce2a8a5-combined-ca-bundle\") pod \"nova-cell1-db-purge-29412000-lw5h5\" (UID: \"2131c673-5399-4093-92fd-c63b4ce2a8a5\") " pod="openstack/nova-cell1-db-purge-29412000-lw5h5" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.512833 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8499f90-daef-4c46-90ef-36aba9557136-scripts\") pod 
\"nova-cell0-db-purge-29412000-xgfnd\" (UID: \"f8499f90-daef-4c46-90ef-36aba9557136\") " pod="openstack/nova-cell0-db-purge-29412000-xgfnd" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.513883 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2131c673-5399-4093-92fd-c63b4ce2a8a5-scripts\") pod \"nova-cell1-db-purge-29412000-lw5h5\" (UID: \"2131c673-5399-4093-92fd-c63b4ce2a8a5\") " pod="openstack/nova-cell1-db-purge-29412000-lw5h5" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.518637 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8499f90-daef-4c46-90ef-36aba9557136-combined-ca-bundle\") pod \"nova-cell0-db-purge-29412000-xgfnd\" (UID: \"f8499f90-daef-4c46-90ef-36aba9557136\") " pod="openstack/nova-cell0-db-purge-29412000-xgfnd" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.519074 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7cb38d1f-26ce-448f-8e8e-d694f3e98edd-secret-volume\") pod \"collect-profiles-29412000-9tfc2\" (UID: \"7cb38d1f-26ce-448f-8e8e-d694f3e98edd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412000-9tfc2" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.523234 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm6qv\" (UniqueName: \"kubernetes.io/projected/7cb38d1f-26ce-448f-8e8e-d694f3e98edd-kube-api-access-nm6qv\") pod \"collect-profiles-29412000-9tfc2\" (UID: \"7cb38d1f-26ce-448f-8e8e-d694f3e98edd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412000-9tfc2" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.524306 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csb24\" (UniqueName: \"kubernetes.io/projected/2131c673-5399-4093-92fd-c63b4ce2a8a5-kube-api-access-csb24\") pod \"nova-cell1-db-purge-29412000-lw5h5\" (UID: \"2131c673-5399-4093-92fd-c63b4ce2a8a5\") " pod="openstack/nova-cell1-db-purge-29412000-lw5h5" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.525248 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwwmz\" (UniqueName: \"kubernetes.io/projected/db350f83-0e78-45b9-bf28-1fe5709e0378-kube-api-access-qwwmz\") pod \"image-pruner-29412000-79k4j\" (UID: \"db350f83-0e78-45b9-bf28-1fe5709e0378\") " pod="openshift-image-registry/image-pruner-29412000-79k4j" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.530222 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcz7m\" (UniqueName: \"kubernetes.io/projected/f8499f90-daef-4c46-90ef-36aba9557136-kube-api-access-hcz7m\") pod \"nova-cell0-db-purge-29412000-xgfnd\" (UID: \"f8499f90-daef-4c46-90ef-36aba9557136\") " pod="openstack/nova-cell0-db-purge-29412000-xgfnd" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.546420 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412000-9tfc2" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.591273 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-purge-29412000-xgfnd" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.602117 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-purge-29412000-lw5h5" Dec 03 00:00:00 crc kubenswrapper[4903]: I1203 00:00:00.817324 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29412000-79k4j" Dec 03 00:00:01 crc kubenswrapper[4903]: W1203 00:00:01.065778 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cb38d1f_26ce_448f_8e8e_d694f3e98edd.slice/crio-18f9776bec4198deb13f1c98932928b30713722c67d69a37edfd8c291163cc82 WatchSource:0}: Error finding container 18f9776bec4198deb13f1c98932928b30713722c67d69a37edfd8c291163cc82: Status 404 returned error can't find the container with id 18f9776bec4198deb13f1c98932928b30713722c67d69a37edfd8c291163cc82 Dec 03 00:00:01 crc kubenswrapper[4903]: I1203 00:00:01.068234 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412000-9tfc2"] Dec 03 00:00:01 crc kubenswrapper[4903]: I1203 00:00:01.193104 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-purge-29412000-xgfnd"] Dec 03 00:00:01 crc kubenswrapper[4903]: W1203 00:00:01.275686 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2131c673_5399_4093_92fd_c63b4ce2a8a5.slice/crio-381551c6d35dd56dac8632e74f4e31eae0176b90b0c384d163e4c6bbc7b445c9 WatchSource:0}: Error finding container 381551c6d35dd56dac8632e74f4e31eae0176b90b0c384d163e4c6bbc7b445c9: Status 404 returned error can't find the container with id 381551c6d35dd56dac8632e74f4e31eae0176b90b0c384d163e4c6bbc7b445c9 Dec 03 00:00:01 crc kubenswrapper[4903]: I1203 00:00:01.276210 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-purge-29412000-lw5h5"] Dec 03 00:00:01 crc kubenswrapper[4903]: I1203 00:00:01.358534 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-purge-29412000-lw5h5" event={"ID":"2131c673-5399-4093-92fd-c63b4ce2a8a5","Type":"ContainerStarted","Data":"381551c6d35dd56dac8632e74f4e31eae0176b90b0c384d163e4c6bbc7b445c9"} Dec 03 00:00:01 crc kubenswrapper[4903]: I1203 00:00:01.359743 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-purge-29412000-xgfnd" event={"ID":"f8499f90-daef-4c46-90ef-36aba9557136","Type":"ContainerStarted","Data":"6cc94f9aec53da7d97684ddc63d8a23d151fc4e440627c8fab230a7281f23688"} Dec 03 00:00:01 crc kubenswrapper[4903]: I1203 00:00:01.360723 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412000-9tfc2" event={"ID":"7cb38d1f-26ce-448f-8e8e-d694f3e98edd","Type":"ContainerStarted","Data":"73effbfb9985c1b1d0d0c81a1e44b9e6ab68d1edee0be538bcc7f6e8809a59b9"} Dec 03 00:00:01 crc kubenswrapper[4903]: I1203 00:00:01.360742 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412000-9tfc2" event={"ID":"7cb38d1f-26ce-448f-8e8e-d694f3e98edd","Type":"ContainerStarted","Data":"18f9776bec4198deb13f1c98932928b30713722c67d69a37edfd8c291163cc82"} Dec 03 00:00:01 crc kubenswrapper[4903]: I1203 00:00:01.368193 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29412000-79k4j"] Dec 03 00:00:01 crc kubenswrapper[4903]: I1203 00:00:01.643020 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412000-9tfc2" podStartSLOduration=1.643004601 podStartE2EDuration="1.643004601s" podCreationTimestamp="2025-12-03 00:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 00:00:01.376920622 +0000 UTC m=+3740.085474905" watchObservedRunningTime="2025-12-03 00:00:01.643004601 +0000 UTC m=+3740.351558884" Dec 03 00:00:02 crc kubenswrapper[4903]: I1203 00:00:02.371251 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-purge-29412000-xgfnd" event={"ID":"f8499f90-daef-4c46-90ef-36aba9557136","Type":"ContainerStarted","Data":"073c92e6b6c1e58c377952e643e6192fd690907d87c8967f9463ad3188f655f3"} Dec 03 00:00:02 crc kubenswrapper[4903]: I1203 00:00:02.373620 4903 generic.go:334] "Generic (PLEG): container finished" podID="7cb38d1f-26ce-448f-8e8e-d694f3e98edd" containerID="73effbfb9985c1b1d0d0c81a1e44b9e6ab68d1edee0be538bcc7f6e8809a59b9" exitCode=0 Dec 03 00:00:02 crc kubenswrapper[4903]: I1203 00:00:02.373680 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412000-9tfc2" event={"ID":"7cb38d1f-26ce-448f-8e8e-d694f3e98edd","Type":"ContainerDied","Data":"73effbfb9985c1b1d0d0c81a1e44b9e6ab68d1edee0be538bcc7f6e8809a59b9"} Dec 03 00:00:02 crc kubenswrapper[4903]: I1203 00:00:02.375368 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-purge-29412000-lw5h5" event={"ID":"2131c673-5399-4093-92fd-c63b4ce2a8a5","Type":"ContainerStarted","Data":"c744c9236c7b74cb6c93cf33e9481e3fba4cf54105853a18d5c36f2454a2857c"} Dec 03 00:00:02 crc kubenswrapper[4903]: I1203 00:00:02.385382 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29412000-79k4j" event={"ID":"db350f83-0e78-45b9-bf28-1fe5709e0378","Type":"ContainerStarted","Data":"0f87dfcddccfa04870d6875e33920a215bc9ac81ac936126ab935af803c7be01"} Dec 03 00:00:02 crc kubenswrapper[4903]: I1203 00:00:02.385440 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29412000-79k4j" event={"ID":"db350f83-0e78-45b9-bf28-1fe5709e0378","Type":"ContainerStarted","Data":"d5487ff453256baf8579ca20eb9a07610c09a54b9b6da9d5ba30fae197a2e88c"} Dec 03 00:00:02 crc kubenswrapper[4903]: I1203 00:00:02.403361 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-purge-29412000-xgfnd" podStartSLOduration=2.403339942 podStartE2EDuration="2.403339942s" podCreationTimestamp="2025-12-03 00:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 00:00:02.398839684 +0000 UTC m=+3741.107393957" watchObservedRunningTime="2025-12-03 00:00:02.403339942 +0000 UTC m=+3741.111894225" Dec 03 00:00:02 crc kubenswrapper[4903]: I1203 00:00:02.443773 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-purge-29412000-lw5h5" podStartSLOduration=2.4437399490000002 podStartE2EDuration="2.443739949s" podCreationTimestamp="2025-12-03 00:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 00:00:02.434511285 +0000 UTC m=+3741.143065588" watchObservedRunningTime="2025-12-03 00:00:02.443739949 +0000 UTC m=+3741.152294232" Dec 03 00:00:02 crc 
kubenswrapper[4903]: I1203 00:00:02.460184 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29412000-79k4j" podStartSLOduration=2.460167266 podStartE2EDuration="2.460167266s" podCreationTimestamp="2025-12-03 00:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 00:00:02.458850684 +0000 UTC m=+3741.167404967" watchObservedRunningTime="2025-12-03 00:00:02.460167266 +0000 UTC m=+3741.168721549" Dec 03 00:00:03 crc kubenswrapper[4903]: I1203 00:00:03.415471 4903 generic.go:334] "Generic (PLEG): container finished" podID="db350f83-0e78-45b9-bf28-1fe5709e0378" containerID="0f87dfcddccfa04870d6875e33920a215bc9ac81ac936126ab935af803c7be01" exitCode=0 Dec 03 00:00:03 crc kubenswrapper[4903]: I1203 00:00:03.419626 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29412000-79k4j" event={"ID":"db350f83-0e78-45b9-bf28-1fe5709e0378","Type":"ContainerDied","Data":"0f87dfcddccfa04870d6875e33920a215bc9ac81ac936126ab935af803c7be01"} Dec 03 00:00:03 crc kubenswrapper[4903]: I1203 00:00:03.865798 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412000-9tfc2" Dec 03 00:00:03 crc kubenswrapper[4903]: I1203 00:00:03.981628 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7cb38d1f-26ce-448f-8e8e-d694f3e98edd-secret-volume\") pod \"7cb38d1f-26ce-448f-8e8e-d694f3e98edd\" (UID: \"7cb38d1f-26ce-448f-8e8e-d694f3e98edd\") " Dec 03 00:00:03 crc kubenswrapper[4903]: I1203 00:00:03.981893 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7cb38d1f-26ce-448f-8e8e-d694f3e98edd-config-volume\") pod \"7cb38d1f-26ce-448f-8e8e-d694f3e98edd\" (UID: \"7cb38d1f-26ce-448f-8e8e-d694f3e98edd\") " Dec 03 00:00:03 crc kubenswrapper[4903]: I1203 00:00:03.982018 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nm6qv\" (UniqueName: \"kubernetes.io/projected/7cb38d1f-26ce-448f-8e8e-d694f3e98edd-kube-api-access-nm6qv\") pod \"7cb38d1f-26ce-448f-8e8e-d694f3e98edd\" (UID: \"7cb38d1f-26ce-448f-8e8e-d694f3e98edd\") " Dec 03 00:00:03 crc kubenswrapper[4903]: I1203 00:00:03.983182 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cb38d1f-26ce-448f-8e8e-d694f3e98edd-config-volume" (OuterVolumeSpecName: "config-volume") pod "7cb38d1f-26ce-448f-8e8e-d694f3e98edd" (UID: "7cb38d1f-26ce-448f-8e8e-d694f3e98edd"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 00:00:03 crc kubenswrapper[4903]: I1203 00:00:03.983850 4903 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7cb38d1f-26ce-448f-8e8e-d694f3e98edd-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 00:00:03 crc kubenswrapper[4903]: I1203 00:00:03.987807 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cb38d1f-26ce-448f-8e8e-d694f3e98edd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7cb38d1f-26ce-448f-8e8e-d694f3e98edd" (UID: "7cb38d1f-26ce-448f-8e8e-d694f3e98edd"). InnerVolumeSpecName "secret-volume". 
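The startup-latency entries above are straightforward arithmetic: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp (the pull timestamps are zero values here, so no image-pull time is subtracted). Recomputing the image-pruner figure from the logged values:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Values copied from the image-pruner-29412000-79k4j entry above.
        created, _ := time.Parse(time.RFC3339Nano, "2025-12-03T00:00:00Z")
        running, _ := time.Parse(time.RFC3339Nano, "2025-12-03T00:00:02.460167266Z")
        fmt.Println(running.Sub(created)) // 2.460167266s
    }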
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:00:04 crc kubenswrapper[4903]: I1203 00:00:04.004715 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cb38d1f-26ce-448f-8e8e-d694f3e98edd-kube-api-access-nm6qv" (OuterVolumeSpecName: "kube-api-access-nm6qv") pod "7cb38d1f-26ce-448f-8e8e-d694f3e98edd" (UID: "7cb38d1f-26ce-448f-8e8e-d694f3e98edd"). InnerVolumeSpecName "kube-api-access-nm6qv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:00:04 crc kubenswrapper[4903]: I1203 00:00:04.086782 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nm6qv\" (UniqueName: \"kubernetes.io/projected/7cb38d1f-26ce-448f-8e8e-d694f3e98edd-kube-api-access-nm6qv\") on node \"crc\" DevicePath \"\"" Dec 03 00:00:04 crc kubenswrapper[4903]: I1203 00:00:04.086840 4903 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7cb38d1f-26ce-448f-8e8e-d694f3e98edd-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 00:00:04 crc kubenswrapper[4903]: I1203 00:00:04.429874 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412000-9tfc2" Dec 03 00:00:04 crc kubenswrapper[4903]: I1203 00:00:04.431198 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412000-9tfc2" event={"ID":"7cb38d1f-26ce-448f-8e8e-d694f3e98edd","Type":"ContainerDied","Data":"18f9776bec4198deb13f1c98932928b30713722c67d69a37edfd8c291163cc82"} Dec 03 00:00:04 crc kubenswrapper[4903]: I1203 00:00:04.431253 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18f9776bec4198deb13f1c98932928b30713722c67d69a37edfd8c291163cc82" Dec 03 00:00:04 crc kubenswrapper[4903]: I1203 00:00:04.490764 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411955-phxmm"] Dec 03 00:00:04 crc kubenswrapper[4903]: I1203 00:00:04.500428 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411955-phxmm"] Dec 03 00:00:04 crc kubenswrapper[4903]: I1203 00:00:04.613853 4903 scope.go:117] "RemoveContainer" containerID="9b0d8e1ad42f16d25d8ace883915db986228e905bf7242685a201d23108ad65b" Dec 03 00:00:04 crc kubenswrapper[4903]: E1203 00:00:04.614151 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:00:04 crc kubenswrapper[4903]: I1203 00:00:04.799566 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29412000-79k4j" Dec 03 00:00:04 crc kubenswrapper[4903]: I1203 00:00:04.904851 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/db350f83-0e78-45b9-bf28-1fe5709e0378-serviceca\") pod \"db350f83-0e78-45b9-bf28-1fe5709e0378\" (UID: \"db350f83-0e78-45b9-bf28-1fe5709e0378\") " Dec 03 00:00:04 crc kubenswrapper[4903]: I1203 00:00:04.905028 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwwmz\" (UniqueName: \"kubernetes.io/projected/db350f83-0e78-45b9-bf28-1fe5709e0378-kube-api-access-qwwmz\") pod \"db350f83-0e78-45b9-bf28-1fe5709e0378\" (UID: \"db350f83-0e78-45b9-bf28-1fe5709e0378\") " Dec 03 00:00:04 crc kubenswrapper[4903]: I1203 00:00:04.905601 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db350f83-0e78-45b9-bf28-1fe5709e0378-serviceca" (OuterVolumeSpecName: "serviceca") pod "db350f83-0e78-45b9-bf28-1fe5709e0378" (UID: "db350f83-0e78-45b9-bf28-1fe5709e0378"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 00:00:04 crc kubenswrapper[4903]: I1203 00:00:04.914240 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db350f83-0e78-45b9-bf28-1fe5709e0378-kube-api-access-qwwmz" (OuterVolumeSpecName: "kube-api-access-qwwmz") pod "db350f83-0e78-45b9-bf28-1fe5709e0378" (UID: "db350f83-0e78-45b9-bf28-1fe5709e0378"). InnerVolumeSpecName "kube-api-access-qwwmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:00:05 crc kubenswrapper[4903]: I1203 00:00:05.007026 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwwmz\" (UniqueName: \"kubernetes.io/projected/db350f83-0e78-45b9-bf28-1fe5709e0378-kube-api-access-qwwmz\") on node \"crc\" DevicePath \"\"" Dec 03 00:00:05 crc kubenswrapper[4903]: I1203 00:00:05.007062 4903 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/db350f83-0e78-45b9-bf28-1fe5709e0378-serviceca\") on node \"crc\" DevicePath \"\"" Dec 03 00:00:05 crc kubenswrapper[4903]: I1203 00:00:05.452718 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29412000-79k4j" event={"ID":"db350f83-0e78-45b9-bf28-1fe5709e0378","Type":"ContainerDied","Data":"d5487ff453256baf8579ca20eb9a07610c09a54b9b6da9d5ba30fae197a2e88c"} Dec 03 00:00:05 crc kubenswrapper[4903]: I1203 00:00:05.452757 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5487ff453256baf8579ca20eb9a07610c09a54b9b6da9d5ba30fae197a2e88c" Dec 03 00:00:05 crc kubenswrapper[4903]: I1203 00:00:05.452821 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29412000-79k4j" Dec 03 00:00:05 crc kubenswrapper[4903]: I1203 00:00:05.626504 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a754468c-293c-4429-bbcf-3ecd9d1a87ee" path="/var/lib/kubelet/pods/a754468c-293c-4429-bbcf-3ecd9d1a87ee/volumes" Dec 03 00:00:05 crc kubenswrapper[4903]: E1203 00:00:05.649491 4903 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb350f83_0e78_45b9_bf28_1fe5709e0378.slice\": RecentStats: unable to find data in memory cache]" Dec 03 00:00:07 crc kubenswrapper[4903]: I1203 00:00:07.475664 4903 generic.go:334] "Generic (PLEG): container finished" podID="2131c673-5399-4093-92fd-c63b4ce2a8a5" containerID="c744c9236c7b74cb6c93cf33e9481e3fba4cf54105853a18d5c36f2454a2857c" exitCode=0 Dec 03 00:00:07 crc kubenswrapper[4903]: I1203 00:00:07.475703 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-purge-29412000-lw5h5" event={"ID":"2131c673-5399-4093-92fd-c63b4ce2a8a5","Type":"ContainerDied","Data":"c744c9236c7b74cb6c93cf33e9481e3fba4cf54105853a18d5c36f2454a2857c"} Dec 03 00:00:07 crc kubenswrapper[4903]: I1203 00:00:07.478030 4903 generic.go:334] "Generic (PLEG): container finished" podID="f8499f90-daef-4c46-90ef-36aba9557136" containerID="073c92e6b6c1e58c377952e643e6192fd690907d87c8967f9463ad3188f655f3" exitCode=0 Dec 03 00:00:07 crc kubenswrapper[4903]: I1203 00:00:07.478066 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-purge-29412000-xgfnd" event={"ID":"f8499f90-daef-4c46-90ef-36aba9557136","Type":"ContainerDied","Data":"073c92e6b6c1e58c377952e643e6192fd690907d87c8967f9463ad3188f655f3"} Dec 03 00:00:08 crc kubenswrapper[4903]: I1203 00:00:08.977885 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-purge-29412000-lw5h5" Dec 03 00:00:08 crc kubenswrapper[4903]: I1203 00:00:08.984417 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-purge-29412000-xgfnd" Dec 03 00:00:09 crc kubenswrapper[4903]: I1203 00:00:09.120620 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8499f90-daef-4c46-90ef-36aba9557136-config-data\") pod \"f8499f90-daef-4c46-90ef-36aba9557136\" (UID: \"f8499f90-daef-4c46-90ef-36aba9557136\") " Dec 03 00:00:09 crc kubenswrapper[4903]: I1203 00:00:09.120740 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csb24\" (UniqueName: \"kubernetes.io/projected/2131c673-5399-4093-92fd-c63b4ce2a8a5-kube-api-access-csb24\") pod \"2131c673-5399-4093-92fd-c63b4ce2a8a5\" (UID: \"2131c673-5399-4093-92fd-c63b4ce2a8a5\") " Dec 03 00:00:09 crc kubenswrapper[4903]: I1203 00:00:09.120849 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8499f90-daef-4c46-90ef-36aba9557136-scripts\") pod \"f8499f90-daef-4c46-90ef-36aba9557136\" (UID: \"f8499f90-daef-4c46-90ef-36aba9557136\") " Dec 03 00:00:09 crc kubenswrapper[4903]: I1203 00:00:09.120871 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2131c673-5399-4093-92fd-c63b4ce2a8a5-scripts\") pod \"2131c673-5399-4093-92fd-c63b4ce2a8a5\" (UID: \"2131c673-5399-4093-92fd-c63b4ce2a8a5\") " Dec 03 00:00:09 crc kubenswrapper[4903]: I1203 00:00:09.120926 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2131c673-5399-4093-92fd-c63b4ce2a8a5-combined-ca-bundle\") pod \"2131c673-5399-4093-92fd-c63b4ce2a8a5\" (UID: \"2131c673-5399-4093-92fd-c63b4ce2a8a5\") " Dec 03 00:00:09 crc kubenswrapper[4903]: I1203 00:00:09.121013 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcz7m\" (UniqueName: \"kubernetes.io/projected/f8499f90-daef-4c46-90ef-36aba9557136-kube-api-access-hcz7m\") pod \"f8499f90-daef-4c46-90ef-36aba9557136\" (UID: \"f8499f90-daef-4c46-90ef-36aba9557136\") " Dec 03 00:00:09 crc kubenswrapper[4903]: I1203 00:00:09.121105 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2131c673-5399-4093-92fd-c63b4ce2a8a5-config-data\") pod \"2131c673-5399-4093-92fd-c63b4ce2a8a5\" (UID: \"2131c673-5399-4093-92fd-c63b4ce2a8a5\") " Dec 03 00:00:09 crc kubenswrapper[4903]: I1203 00:00:09.121127 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8499f90-daef-4c46-90ef-36aba9557136-combined-ca-bundle\") pod \"f8499f90-daef-4c46-90ef-36aba9557136\" (UID: \"f8499f90-daef-4c46-90ef-36aba9557136\") " Dec 03 00:00:09 crc kubenswrapper[4903]: I1203 00:00:09.126600 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2131c673-5399-4093-92fd-c63b4ce2a8a5-scripts" (OuterVolumeSpecName: "scripts") pod "2131c673-5399-4093-92fd-c63b4ce2a8a5" (UID: "2131c673-5399-4093-92fd-c63b4ce2a8a5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:00:09 crc kubenswrapper[4903]: I1203 00:00:09.132801 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8499f90-daef-4c46-90ef-36aba9557136-scripts" (OuterVolumeSpecName: "scripts") pod "f8499f90-daef-4c46-90ef-36aba9557136" (UID: "f8499f90-daef-4c46-90ef-36aba9557136"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:00:09 crc kubenswrapper[4903]: I1203 00:00:09.134842 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8499f90-daef-4c46-90ef-36aba9557136-kube-api-access-hcz7m" (OuterVolumeSpecName: "kube-api-access-hcz7m") pod "f8499f90-daef-4c46-90ef-36aba9557136" (UID: "f8499f90-daef-4c46-90ef-36aba9557136"). InnerVolumeSpecName "kube-api-access-hcz7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:00:09 crc kubenswrapper[4903]: I1203 00:00:09.148817 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2131c673-5399-4093-92fd-c63b4ce2a8a5-kube-api-access-csb24" (OuterVolumeSpecName: "kube-api-access-csb24") pod "2131c673-5399-4093-92fd-c63b4ce2a8a5" (UID: "2131c673-5399-4093-92fd-c63b4ce2a8a5"). InnerVolumeSpecName "kube-api-access-csb24". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:00:09 crc kubenswrapper[4903]: I1203 00:00:09.159003 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2131c673-5399-4093-92fd-c63b4ce2a8a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2131c673-5399-4093-92fd-c63b4ce2a8a5" (UID: "2131c673-5399-4093-92fd-c63b4ce2a8a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:00:09 crc kubenswrapper[4903]: I1203 00:00:09.162354 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8499f90-daef-4c46-90ef-36aba9557136-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8499f90-daef-4c46-90ef-36aba9557136" (UID: "f8499f90-daef-4c46-90ef-36aba9557136"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:00:09 crc kubenswrapper[4903]: I1203 00:00:09.165790 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2131c673-5399-4093-92fd-c63b4ce2a8a5-config-data" (OuterVolumeSpecName: "config-data") pod "2131c673-5399-4093-92fd-c63b4ce2a8a5" (UID: "2131c673-5399-4093-92fd-c63b4ce2a8a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:00:09 crc kubenswrapper[4903]: I1203 00:00:09.183890 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8499f90-daef-4c46-90ef-36aba9557136-config-data" (OuterVolumeSpecName: "config-data") pod "f8499f90-daef-4c46-90ef-36aba9557136" (UID: "f8499f90-daef-4c46-90ef-36aba9557136"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:00:09 crc kubenswrapper[4903]: I1203 00:00:09.223371 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcz7m\" (UniqueName: \"kubernetes.io/projected/f8499f90-daef-4c46-90ef-36aba9557136-kube-api-access-hcz7m\") on node \"crc\" DevicePath \"\"" Dec 03 00:00:09 crc kubenswrapper[4903]: I1203 00:00:09.223407 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2131c673-5399-4093-92fd-c63b4ce2a8a5-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 00:00:09 crc kubenswrapper[4903]: I1203 00:00:09.223416 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8499f90-daef-4c46-90ef-36aba9557136-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 00:00:09 crc kubenswrapper[4903]: I1203 00:00:09.223424 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8499f90-daef-4c46-90ef-36aba9557136-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 00:00:09 crc kubenswrapper[4903]: I1203 00:00:09.223435 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csb24\" (UniqueName: \"kubernetes.io/projected/2131c673-5399-4093-92fd-c63b4ce2a8a5-kube-api-access-csb24\") on node \"crc\" DevicePath \"\"" Dec 03 00:00:09 crc kubenswrapper[4903]: I1203 00:00:09.223445 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8499f90-daef-4c46-90ef-36aba9557136-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 00:00:09 crc kubenswrapper[4903]: I1203 00:00:09.223453 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2131c673-5399-4093-92fd-c63b4ce2a8a5-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 00:00:09 crc kubenswrapper[4903]: I1203 00:00:09.223461 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2131c673-5399-4093-92fd-c63b4ce2a8a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 00:00:09 crc kubenswrapper[4903]: I1203 00:00:09.517087 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-purge-29412000-lw5h5" event={"ID":"2131c673-5399-4093-92fd-c63b4ce2a8a5","Type":"ContainerDied","Data":"381551c6d35dd56dac8632e74f4e31eae0176b90b0c384d163e4c6bbc7b445c9"} Dec 03 00:00:09 crc kubenswrapper[4903]: I1203 00:00:09.517373 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="381551c6d35dd56dac8632e74f4e31eae0176b90b0c384d163e4c6bbc7b445c9" Dec 03 00:00:09 crc kubenswrapper[4903]: I1203 00:00:09.517177 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-purge-29412000-lw5h5" Dec 03 00:00:09 crc kubenswrapper[4903]: I1203 00:00:09.520298 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-purge-29412000-xgfnd" event={"ID":"f8499f90-daef-4c46-90ef-36aba9557136","Type":"ContainerDied","Data":"6cc94f9aec53da7d97684ddc63d8a23d151fc4e440627c8fab230a7281f23688"} Dec 03 00:00:09 crc kubenswrapper[4903]: I1203 00:00:09.520359 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cc94f9aec53da7d97684ddc63d8a23d151fc4e440627c8fab230a7281f23688" Dec 03 00:00:09 crc kubenswrapper[4903]: I1203 00:00:09.520461 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-purge-29412000-xgfnd" Dec 03 00:00:18 crc kubenswrapper[4903]: I1203 00:00:18.612827 4903 scope.go:117] "RemoveContainer" containerID="9b0d8e1ad42f16d25d8ace883915db986228e905bf7242685a201d23108ad65b" Dec 03 00:00:18 crc kubenswrapper[4903]: E1203 00:00:18.614074 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:00:32 crc kubenswrapper[4903]: I1203 00:00:32.612942 4903 scope.go:117] "RemoveContainer" containerID="9b0d8e1ad42f16d25d8ace883915db986228e905bf7242685a201d23108ad65b" Dec 03 00:00:32 crc kubenswrapper[4903]: E1203 00:00:32.614210 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:00:47 crc kubenswrapper[4903]: I1203 00:00:47.613518 4903 scope.go:117] "RemoveContainer" containerID="9b0d8e1ad42f16d25d8ace883915db986228e905bf7242685a201d23108ad65b" Dec 03 00:00:47 crc kubenswrapper[4903]: E1203 00:00:47.614251 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.174825 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-purge-29412001-9nk2t"] Dec 03 00:01:00 crc kubenswrapper[4903]: E1203 00:01:00.175868 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2131c673-5399-4093-92fd-c63b4ce2a8a5" containerName="nova-manage" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.175882 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="2131c673-5399-4093-92fd-c63b4ce2a8a5" containerName="nova-manage" Dec 03 00:01:00 crc kubenswrapper[4903]: E1203 00:01:00.175902 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db350f83-0e78-45b9-bf28-1fe5709e0378" 
containerName="image-pruner" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.175910 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="db350f83-0e78-45b9-bf28-1fe5709e0378" containerName="image-pruner" Dec 03 00:01:00 crc kubenswrapper[4903]: E1203 00:01:00.175928 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cb38d1f-26ce-448f-8e8e-d694f3e98edd" containerName="collect-profiles" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.175936 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cb38d1f-26ce-448f-8e8e-d694f3e98edd" containerName="collect-profiles" Dec 03 00:01:00 crc kubenswrapper[4903]: E1203 00:01:00.175947 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8499f90-daef-4c46-90ef-36aba9557136" containerName="nova-manage" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.175955 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8499f90-daef-4c46-90ef-36aba9557136" containerName="nova-manage" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.176210 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="2131c673-5399-4093-92fd-c63b4ce2a8a5" containerName="nova-manage" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.176246 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8499f90-daef-4c46-90ef-36aba9557136" containerName="nova-manage" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.176263 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cb38d1f-26ce-448f-8e8e-d694f3e98edd" containerName="collect-profiles" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.176279 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="db350f83-0e78-45b9-bf28-1fe5709e0378" containerName="image-pruner" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.177097 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-purge-29412001-9nk2t" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.181487 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.183669 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-purge-29412001-pbz9p"] Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.184986 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-purge-29412001-pbz9p" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.194724 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29412001-6287z"] Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.195981 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29412001-6287z" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.218866 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-purge-29412001-pbz9p"] Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.218932 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-purge-29412001-9nk2t"] Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.256203 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c445dbad-15ca-4171-ac03-0fd37dbdd474-fernet-keys\") pod \"keystone-cron-29412001-6287z\" (UID: \"c445dbad-15ca-4171-ac03-0fd37dbdd474\") " pod="openstack/keystone-cron-29412001-6287z" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.256254 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl2f8\" (UniqueName: \"kubernetes.io/projected/ce1d9817-bff6-40a4-bc9b-fcbd1510739c-kube-api-access-tl2f8\") pod \"glance-db-purge-29412001-9nk2t\" (UID: \"ce1d9817-bff6-40a4-bc9b-fcbd1510739c\") " pod="openstack/glance-db-purge-29412001-9nk2t" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.256348 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbj85\" (UniqueName: \"kubernetes.io/projected/334ce527-c86f-4991-bb5a-bb31f27acee1-kube-api-access-lbj85\") pod \"cinder-db-purge-29412001-pbz9p\" (UID: \"334ce527-c86f-4991-bb5a-bb31f27acee1\") " pod="openstack/cinder-db-purge-29412001-pbz9p" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.256459 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c445dbad-15ca-4171-ac03-0fd37dbdd474-combined-ca-bundle\") pod \"keystone-cron-29412001-6287z\" (UID: \"c445dbad-15ca-4171-ac03-0fd37dbdd474\") " pod="openstack/keystone-cron-29412001-6287z" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.256492 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/334ce527-c86f-4991-bb5a-bb31f27acee1-config-data\") pod \"cinder-db-purge-29412001-pbz9p\" (UID: \"334ce527-c86f-4991-bb5a-bb31f27acee1\") " pod="openstack/cinder-db-purge-29412001-pbz9p" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.256516 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce1d9817-bff6-40a4-bc9b-fcbd1510739c-combined-ca-bundle\") pod \"glance-db-purge-29412001-9nk2t\" (UID: \"ce1d9817-bff6-40a4-bc9b-fcbd1510739c\") " pod="openstack/glance-db-purge-29412001-9nk2t" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.256547 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gtvc\" (UniqueName: \"kubernetes.io/projected/c445dbad-15ca-4171-ac03-0fd37dbdd474-kube-api-access-4gtvc\") pod \"keystone-cron-29412001-6287z\" (UID: \"c445dbad-15ca-4171-ac03-0fd37dbdd474\") " pod="openstack/keystone-cron-29412001-6287z" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.256606 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/334ce527-c86f-4991-bb5a-bb31f27acee1-combined-ca-bundle\") pod \"cinder-db-purge-29412001-pbz9p\" (UID: \"334ce527-c86f-4991-bb5a-bb31f27acee1\") " pod="openstack/cinder-db-purge-29412001-pbz9p" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.256688 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c445dbad-15ca-4171-ac03-0fd37dbdd474-config-data\") pod \"keystone-cron-29412001-6287z\" (UID: \"c445dbad-15ca-4171-ac03-0fd37dbdd474\") " pod="openstack/keystone-cron-29412001-6287z" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.256752 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce1d9817-bff6-40a4-bc9b-fcbd1510739c-config-data\") pod \"glance-db-purge-29412001-9nk2t\" (UID: \"ce1d9817-bff6-40a4-bc9b-fcbd1510739c\") " pod="openstack/glance-db-purge-29412001-9nk2t" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.256790 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/334ce527-c86f-4991-bb5a-bb31f27acee1-db-purge-config-data\") pod \"cinder-db-purge-29412001-pbz9p\" (UID: \"334ce527-c86f-4991-bb5a-bb31f27acee1\") " pod="openstack/cinder-db-purge-29412001-pbz9p" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.256823 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/ce1d9817-bff6-40a4-bc9b-fcbd1510739c-db-purge-config-data\") pod \"glance-db-purge-29412001-9nk2t\" (UID: \"ce1d9817-bff6-40a4-bc9b-fcbd1510739c\") " pod="openstack/glance-db-purge-29412001-9nk2t" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.298976 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29412001-6287z"] Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.359184 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c445dbad-15ca-4171-ac03-0fd37dbdd474-config-data\") pod \"keystone-cron-29412001-6287z\" (UID: \"c445dbad-15ca-4171-ac03-0fd37dbdd474\") " pod="openstack/keystone-cron-29412001-6287z" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.359278 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce1d9817-bff6-40a4-bc9b-fcbd1510739c-config-data\") pod \"glance-db-purge-29412001-9nk2t\" (UID: \"ce1d9817-bff6-40a4-bc9b-fcbd1510739c\") " pod="openstack/glance-db-purge-29412001-9nk2t" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.359315 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/334ce527-c86f-4991-bb5a-bb31f27acee1-db-purge-config-data\") pod \"cinder-db-purge-29412001-pbz9p\" (UID: \"334ce527-c86f-4991-bb5a-bb31f27acee1\") " pod="openstack/cinder-db-purge-29412001-pbz9p" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.359344 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/ce1d9817-bff6-40a4-bc9b-fcbd1510739c-db-purge-config-data\") pod \"glance-db-purge-29412001-9nk2t\" (UID: \"ce1d9817-bff6-40a4-bc9b-fcbd1510739c\") " 
pod="openstack/glance-db-purge-29412001-9nk2t" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.359381 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl2f8\" (UniqueName: \"kubernetes.io/projected/ce1d9817-bff6-40a4-bc9b-fcbd1510739c-kube-api-access-tl2f8\") pod \"glance-db-purge-29412001-9nk2t\" (UID: \"ce1d9817-bff6-40a4-bc9b-fcbd1510739c\") " pod="openstack/glance-db-purge-29412001-9nk2t" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.359412 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c445dbad-15ca-4171-ac03-0fd37dbdd474-fernet-keys\") pod \"keystone-cron-29412001-6287z\" (UID: \"c445dbad-15ca-4171-ac03-0fd37dbdd474\") " pod="openstack/keystone-cron-29412001-6287z" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.359476 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbj85\" (UniqueName: \"kubernetes.io/projected/334ce527-c86f-4991-bb5a-bb31f27acee1-kube-api-access-lbj85\") pod \"cinder-db-purge-29412001-pbz9p\" (UID: \"334ce527-c86f-4991-bb5a-bb31f27acee1\") " pod="openstack/cinder-db-purge-29412001-pbz9p" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.359563 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c445dbad-15ca-4171-ac03-0fd37dbdd474-combined-ca-bundle\") pod \"keystone-cron-29412001-6287z\" (UID: \"c445dbad-15ca-4171-ac03-0fd37dbdd474\") " pod="openstack/keystone-cron-29412001-6287z" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.359593 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/334ce527-c86f-4991-bb5a-bb31f27acee1-config-data\") pod \"cinder-db-purge-29412001-pbz9p\" (UID: \"334ce527-c86f-4991-bb5a-bb31f27acee1\") " pod="openstack/cinder-db-purge-29412001-pbz9p" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.359616 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce1d9817-bff6-40a4-bc9b-fcbd1510739c-combined-ca-bundle\") pod \"glance-db-purge-29412001-9nk2t\" (UID: \"ce1d9817-bff6-40a4-bc9b-fcbd1510739c\") " pod="openstack/glance-db-purge-29412001-9nk2t" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.359642 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gtvc\" (UniqueName: \"kubernetes.io/projected/c445dbad-15ca-4171-ac03-0fd37dbdd474-kube-api-access-4gtvc\") pod \"keystone-cron-29412001-6287z\" (UID: \"c445dbad-15ca-4171-ac03-0fd37dbdd474\") " pod="openstack/keystone-cron-29412001-6287z" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.359725 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/334ce527-c86f-4991-bb5a-bb31f27acee1-combined-ca-bundle\") pod \"cinder-db-purge-29412001-pbz9p\" (UID: \"334ce527-c86f-4991-bb5a-bb31f27acee1\") " pod="openstack/cinder-db-purge-29412001-pbz9p" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.368459 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/ce1d9817-bff6-40a4-bc9b-fcbd1510739c-db-purge-config-data\") pod \"glance-db-purge-29412001-9nk2t\" (UID: 
\"ce1d9817-bff6-40a4-bc9b-fcbd1510739c\") " pod="openstack/glance-db-purge-29412001-9nk2t" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.368822 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/334ce527-c86f-4991-bb5a-bb31f27acee1-combined-ca-bundle\") pod \"cinder-db-purge-29412001-pbz9p\" (UID: \"334ce527-c86f-4991-bb5a-bb31f27acee1\") " pod="openstack/cinder-db-purge-29412001-pbz9p" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.369015 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c445dbad-15ca-4171-ac03-0fd37dbdd474-combined-ca-bundle\") pod \"keystone-cron-29412001-6287z\" (UID: \"c445dbad-15ca-4171-ac03-0fd37dbdd474\") " pod="openstack/keystone-cron-29412001-6287z" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.369132 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c445dbad-15ca-4171-ac03-0fd37dbdd474-config-data\") pod \"keystone-cron-29412001-6287z\" (UID: \"c445dbad-15ca-4171-ac03-0fd37dbdd474\") " pod="openstack/keystone-cron-29412001-6287z" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.369156 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/334ce527-c86f-4991-bb5a-bb31f27acee1-config-data\") pod \"cinder-db-purge-29412001-pbz9p\" (UID: \"334ce527-c86f-4991-bb5a-bb31f27acee1\") " pod="openstack/cinder-db-purge-29412001-pbz9p" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.369726 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce1d9817-bff6-40a4-bc9b-fcbd1510739c-config-data\") pod \"glance-db-purge-29412001-9nk2t\" (UID: \"ce1d9817-bff6-40a4-bc9b-fcbd1510739c\") " pod="openstack/glance-db-purge-29412001-9nk2t" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.378981 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/334ce527-c86f-4991-bb5a-bb31f27acee1-db-purge-config-data\") pod \"cinder-db-purge-29412001-pbz9p\" (UID: \"334ce527-c86f-4991-bb5a-bb31f27acee1\") " pod="openstack/cinder-db-purge-29412001-pbz9p" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.379636 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce1d9817-bff6-40a4-bc9b-fcbd1510739c-combined-ca-bundle\") pod \"glance-db-purge-29412001-9nk2t\" (UID: \"ce1d9817-bff6-40a4-bc9b-fcbd1510739c\") " pod="openstack/glance-db-purge-29412001-9nk2t" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.382336 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c445dbad-15ca-4171-ac03-0fd37dbdd474-fernet-keys\") pod \"keystone-cron-29412001-6287z\" (UID: \"c445dbad-15ca-4171-ac03-0fd37dbdd474\") " pod="openstack/keystone-cron-29412001-6287z" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.383511 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gtvc\" (UniqueName: \"kubernetes.io/projected/c445dbad-15ca-4171-ac03-0fd37dbdd474-kube-api-access-4gtvc\") pod \"keystone-cron-29412001-6287z\" (UID: \"c445dbad-15ca-4171-ac03-0fd37dbdd474\") " pod="openstack/keystone-cron-29412001-6287z" Dec 03 
00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.384042 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbj85\" (UniqueName: \"kubernetes.io/projected/334ce527-c86f-4991-bb5a-bb31f27acee1-kube-api-access-lbj85\") pod \"cinder-db-purge-29412001-pbz9p\" (UID: \"334ce527-c86f-4991-bb5a-bb31f27acee1\") " pod="openstack/cinder-db-purge-29412001-pbz9p" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.384671 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl2f8\" (UniqueName: \"kubernetes.io/projected/ce1d9817-bff6-40a4-bc9b-fcbd1510739c-kube-api-access-tl2f8\") pod \"glance-db-purge-29412001-9nk2t\" (UID: \"ce1d9817-bff6-40a4-bc9b-fcbd1510739c\") " pod="openstack/glance-db-purge-29412001-9nk2t" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.560103 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-purge-29412001-9nk2t" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.584233 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-purge-29412001-pbz9p" Dec 03 00:01:00 crc kubenswrapper[4903]: I1203 00:01:00.602674 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29412001-6287z" Dec 03 00:01:01 crc kubenswrapper[4903]: I1203 00:01:01.112909 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-purge-29412001-pbz9p"] Dec 03 00:01:01 crc kubenswrapper[4903]: I1203 00:01:01.128633 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-purge-29412001-9nk2t"] Dec 03 00:01:01 crc kubenswrapper[4903]: I1203 00:01:01.189375 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-purge-29412001-pbz9p" event={"ID":"334ce527-c86f-4991-bb5a-bb31f27acee1","Type":"ContainerStarted","Data":"331e762350dd6b65d8987be211ef43dc5bfe83525c9c479052807c17779040fb"} Dec 03 00:01:01 crc kubenswrapper[4903]: I1203 00:01:01.191531 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-purge-29412001-9nk2t" event={"ID":"ce1d9817-bff6-40a4-bc9b-fcbd1510739c","Type":"ContainerStarted","Data":"731d31643ffa9b13a2011af1ce0eb7bdb38275d53d310cb3fe6a8e518372a89c"} Dec 03 00:01:01 crc kubenswrapper[4903]: I1203 00:01:01.195056 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29412001-6287z"] Dec 03 00:01:01 crc kubenswrapper[4903]: W1203 00:01:01.224457 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc445dbad_15ca_4171_ac03_0fd37dbdd474.slice/crio-8754888b0e1504469c1a05d273870d5505fc23e9aa9eddd01b9984c0c75ab4e8 WatchSource:0}: Error finding container 8754888b0e1504469c1a05d273870d5505fc23e9aa9eddd01b9984c0c75ab4e8: Status 404 returned error can't find the container with id 8754888b0e1504469c1a05d273870d5505fc23e9aa9eddd01b9984c0c75ab4e8 Dec 03 00:01:02 crc kubenswrapper[4903]: I1203 00:01:02.208603 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-purge-29412001-9nk2t" event={"ID":"ce1d9817-bff6-40a4-bc9b-fcbd1510739c","Type":"ContainerStarted","Data":"26f272f9bbc6b20d0b6218a5f80cf119285b2cb767d75403e51ca6403f19e509"} Dec 03 00:01:02 crc kubenswrapper[4903]: I1203 00:01:02.211354 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412001-6287z" 
event={"ID":"c445dbad-15ca-4171-ac03-0fd37dbdd474","Type":"ContainerStarted","Data":"2a857650e214842d5c7090d8af06e80a57a2b89a7fa51c65387aebf390a6fd8e"} Dec 03 00:01:02 crc kubenswrapper[4903]: I1203 00:01:02.211418 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412001-6287z" event={"ID":"c445dbad-15ca-4171-ac03-0fd37dbdd474","Type":"ContainerStarted","Data":"8754888b0e1504469c1a05d273870d5505fc23e9aa9eddd01b9984c0c75ab4e8"} Dec 03 00:01:02 crc kubenswrapper[4903]: I1203 00:01:02.225796 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-purge-29412001-pbz9p" event={"ID":"334ce527-c86f-4991-bb5a-bb31f27acee1","Type":"ContainerStarted","Data":"b8248a1cdd8acc6e80649abcf37df38632c813e52b21ca60a6e43fc0d83f5455"} Dec 03 00:01:02 crc kubenswrapper[4903]: I1203 00:01:02.234845 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-purge-29412001-9nk2t" podStartSLOduration=2.234822588 podStartE2EDuration="2.234822588s" podCreationTimestamp="2025-12-03 00:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 00:01:02.230912693 +0000 UTC m=+3800.939466976" watchObservedRunningTime="2025-12-03 00:01:02.234822588 +0000 UTC m=+3800.943376881" Dec 03 00:01:02 crc kubenswrapper[4903]: I1203 00:01:02.253408 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29412001-6287z" podStartSLOduration=2.2533900559999998 podStartE2EDuration="2.253390056s" podCreationTimestamp="2025-12-03 00:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 00:01:02.250335232 +0000 UTC m=+3800.958889525" watchObservedRunningTime="2025-12-03 00:01:02.253390056 +0000 UTC m=+3800.961944339" Dec 03 00:01:02 crc kubenswrapper[4903]: I1203 00:01:02.274738 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-purge-29412001-pbz9p" podStartSLOduration=2.274628379 podStartE2EDuration="2.274628379s" podCreationTimestamp="2025-12-03 00:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 00:01:02.264815112 +0000 UTC m=+3800.973369415" watchObservedRunningTime="2025-12-03 00:01:02.274628379 +0000 UTC m=+3800.983182652" Dec 03 00:01:02 crc kubenswrapper[4903]: I1203 00:01:02.613292 4903 scope.go:117] "RemoveContainer" containerID="9b0d8e1ad42f16d25d8ace883915db986228e905bf7242685a201d23108ad65b" Dec 03 00:01:02 crc kubenswrapper[4903]: E1203 00:01:02.614219 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:01:04 crc kubenswrapper[4903]: I1203 00:01:04.255153 4903 generic.go:334] "Generic (PLEG): container finished" podID="334ce527-c86f-4991-bb5a-bb31f27acee1" containerID="b8248a1cdd8acc6e80649abcf37df38632c813e52b21ca60a6e43fc0d83f5455" exitCode=0 Dec 03 00:01:04 crc kubenswrapper[4903]: I1203 00:01:04.255230 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-db-purge-29412001-pbz9p" event={"ID":"334ce527-c86f-4991-bb5a-bb31f27acee1","Type":"ContainerDied","Data":"b8248a1cdd8acc6e80649abcf37df38632c813e52b21ca60a6e43fc0d83f5455"} Dec 03 00:01:04 crc kubenswrapper[4903]: I1203 00:01:04.258062 4903 generic.go:334] "Generic (PLEG): container finished" podID="ce1d9817-bff6-40a4-bc9b-fcbd1510739c" containerID="26f272f9bbc6b20d0b6218a5f80cf119285b2cb767d75403e51ca6403f19e509" exitCode=0 Dec 03 00:01:04 crc kubenswrapper[4903]: I1203 00:01:04.258097 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-purge-29412001-9nk2t" event={"ID":"ce1d9817-bff6-40a4-bc9b-fcbd1510739c","Type":"ContainerDied","Data":"26f272f9bbc6b20d0b6218a5f80cf119285b2cb767d75403e51ca6403f19e509"} Dec 03 00:01:05 crc kubenswrapper[4903]: I1203 00:01:05.272989 4903 generic.go:334] "Generic (PLEG): container finished" podID="c445dbad-15ca-4171-ac03-0fd37dbdd474" containerID="2a857650e214842d5c7090d8af06e80a57a2b89a7fa51c65387aebf390a6fd8e" exitCode=0 Dec 03 00:01:05 crc kubenswrapper[4903]: I1203 00:01:05.273092 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412001-6287z" event={"ID":"c445dbad-15ca-4171-ac03-0fd37dbdd474","Type":"ContainerDied","Data":"2a857650e214842d5c7090d8af06e80a57a2b89a7fa51c65387aebf390a6fd8e"} Dec 03 00:01:05 crc kubenswrapper[4903]: I1203 00:01:05.347159 4903 scope.go:117] "RemoveContainer" containerID="edb9fe5ba3f738d56f330d22bafef890b67b04bcfcc67bee024fa0a03b655ff6" Dec 03 00:01:05 crc kubenswrapper[4903]: I1203 00:01:05.847896 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-purge-29412001-pbz9p" Dec 03 00:01:05 crc kubenswrapper[4903]: I1203 00:01:05.852634 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-purge-29412001-9nk2t" Dec 03 00:01:05 crc kubenswrapper[4903]: I1203 00:01:05.927666 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/334ce527-c86f-4991-bb5a-bb31f27acee1-config-data\") pod \"334ce527-c86f-4991-bb5a-bb31f27acee1\" (UID: \"334ce527-c86f-4991-bb5a-bb31f27acee1\") " Dec 03 00:01:05 crc kubenswrapper[4903]: I1203 00:01:05.927717 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/ce1d9817-bff6-40a4-bc9b-fcbd1510739c-db-purge-config-data\") pod \"ce1d9817-bff6-40a4-bc9b-fcbd1510739c\" (UID: \"ce1d9817-bff6-40a4-bc9b-fcbd1510739c\") " Dec 03 00:01:05 crc kubenswrapper[4903]: I1203 00:01:05.927773 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbj85\" (UniqueName: \"kubernetes.io/projected/334ce527-c86f-4991-bb5a-bb31f27acee1-kube-api-access-lbj85\") pod \"334ce527-c86f-4991-bb5a-bb31f27acee1\" (UID: \"334ce527-c86f-4991-bb5a-bb31f27acee1\") " Dec 03 00:01:05 crc kubenswrapper[4903]: I1203 00:01:05.928237 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tl2f8\" (UniqueName: \"kubernetes.io/projected/ce1d9817-bff6-40a4-bc9b-fcbd1510739c-kube-api-access-tl2f8\") pod \"ce1d9817-bff6-40a4-bc9b-fcbd1510739c\" (UID: \"ce1d9817-bff6-40a4-bc9b-fcbd1510739c\") " Dec 03 00:01:05 crc kubenswrapper[4903]: I1203 00:01:05.928318 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/334ce527-c86f-4991-bb5a-bb31f27acee1-combined-ca-bundle\") pod \"334ce527-c86f-4991-bb5a-bb31f27acee1\" (UID: \"334ce527-c86f-4991-bb5a-bb31f27acee1\") " Dec 03 00:01:05 crc kubenswrapper[4903]: I1203 00:01:05.928387 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/334ce527-c86f-4991-bb5a-bb31f27acee1-db-purge-config-data\") pod \"334ce527-c86f-4991-bb5a-bb31f27acee1\" (UID: \"334ce527-c86f-4991-bb5a-bb31f27acee1\") " Dec 03 00:01:05 crc kubenswrapper[4903]: I1203 00:01:05.928418 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce1d9817-bff6-40a4-bc9b-fcbd1510739c-combined-ca-bundle\") pod \"ce1d9817-bff6-40a4-bc9b-fcbd1510739c\" (UID: \"ce1d9817-bff6-40a4-bc9b-fcbd1510739c\") " Dec 03 00:01:05 crc kubenswrapper[4903]: I1203 00:01:05.928436 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce1d9817-bff6-40a4-bc9b-fcbd1510739c-config-data\") pod \"ce1d9817-bff6-40a4-bc9b-fcbd1510739c\" (UID: \"ce1d9817-bff6-40a4-bc9b-fcbd1510739c\") " Dec 03 00:01:05 crc kubenswrapper[4903]: I1203 00:01:05.933572 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce1d9817-bff6-40a4-bc9b-fcbd1510739c-kube-api-access-tl2f8" (OuterVolumeSpecName: "kube-api-access-tl2f8") pod "ce1d9817-bff6-40a4-bc9b-fcbd1510739c" (UID: "ce1d9817-bff6-40a4-bc9b-fcbd1510739c"). InnerVolumeSpecName "kube-api-access-tl2f8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:01:05 crc kubenswrapper[4903]: I1203 00:01:05.934130 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce1d9817-bff6-40a4-bc9b-fcbd1510739c-db-purge-config-data" (OuterVolumeSpecName: "db-purge-config-data") pod "ce1d9817-bff6-40a4-bc9b-fcbd1510739c" (UID: "ce1d9817-bff6-40a4-bc9b-fcbd1510739c"). InnerVolumeSpecName "db-purge-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:01:05 crc kubenswrapper[4903]: I1203 00:01:05.934253 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/334ce527-c86f-4991-bb5a-bb31f27acee1-db-purge-config-data" (OuterVolumeSpecName: "db-purge-config-data") pod "334ce527-c86f-4991-bb5a-bb31f27acee1" (UID: "334ce527-c86f-4991-bb5a-bb31f27acee1"). InnerVolumeSpecName "db-purge-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:01:05 crc kubenswrapper[4903]: I1203 00:01:05.935881 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/334ce527-c86f-4991-bb5a-bb31f27acee1-kube-api-access-lbj85" (OuterVolumeSpecName: "kube-api-access-lbj85") pod "334ce527-c86f-4991-bb5a-bb31f27acee1" (UID: "334ce527-c86f-4991-bb5a-bb31f27acee1"). InnerVolumeSpecName "kube-api-access-lbj85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:01:05 crc kubenswrapper[4903]: I1203 00:01:05.958839 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce1d9817-bff6-40a4-bc9b-fcbd1510739c-config-data" (OuterVolumeSpecName: "config-data") pod "ce1d9817-bff6-40a4-bc9b-fcbd1510739c" (UID: "ce1d9817-bff6-40a4-bc9b-fcbd1510739c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:01:05 crc kubenswrapper[4903]: I1203 00:01:05.966365 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/334ce527-c86f-4991-bb5a-bb31f27acee1-config-data" (OuterVolumeSpecName: "config-data") pod "334ce527-c86f-4991-bb5a-bb31f27acee1" (UID: "334ce527-c86f-4991-bb5a-bb31f27acee1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:01:05 crc kubenswrapper[4903]: I1203 00:01:05.970242 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce1d9817-bff6-40a4-bc9b-fcbd1510739c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce1d9817-bff6-40a4-bc9b-fcbd1510739c" (UID: "ce1d9817-bff6-40a4-bc9b-fcbd1510739c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:01:05 crc kubenswrapper[4903]: I1203 00:01:05.994585 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/334ce527-c86f-4991-bb5a-bb31f27acee1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "334ce527-c86f-4991-bb5a-bb31f27acee1" (UID: "334ce527-c86f-4991-bb5a-bb31f27acee1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:01:06 crc kubenswrapper[4903]: I1203 00:01:06.031021 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbj85\" (UniqueName: \"kubernetes.io/projected/334ce527-c86f-4991-bb5a-bb31f27acee1-kube-api-access-lbj85\") on node \"crc\" DevicePath \"\"" Dec 03 00:01:06 crc kubenswrapper[4903]: I1203 00:01:06.031051 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tl2f8\" (UniqueName: \"kubernetes.io/projected/ce1d9817-bff6-40a4-bc9b-fcbd1510739c-kube-api-access-tl2f8\") on node \"crc\" DevicePath \"\"" Dec 03 00:01:06 crc kubenswrapper[4903]: I1203 00:01:06.031062 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/334ce527-c86f-4991-bb5a-bb31f27acee1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 00:01:06 crc kubenswrapper[4903]: I1203 00:01:06.031070 4903 reconciler_common.go:293] "Volume detached for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/334ce527-c86f-4991-bb5a-bb31f27acee1-db-purge-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 00:01:06 crc kubenswrapper[4903]: I1203 00:01:06.031078 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce1d9817-bff6-40a4-bc9b-fcbd1510739c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 00:01:06 crc kubenswrapper[4903]: I1203 00:01:06.031088 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce1d9817-bff6-40a4-bc9b-fcbd1510739c-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 00:01:06 crc kubenswrapper[4903]: I1203 00:01:06.031096 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/334ce527-c86f-4991-bb5a-bb31f27acee1-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 00:01:06 crc kubenswrapper[4903]: I1203 00:01:06.031105 4903 reconciler_common.go:293] "Volume detached for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/ce1d9817-bff6-40a4-bc9b-fcbd1510739c-db-purge-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 00:01:06 crc kubenswrapper[4903]: I1203 00:01:06.285910 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-purge-29412001-pbz9p" event={"ID":"334ce527-c86f-4991-bb5a-bb31f27acee1","Type":"ContainerDied","Data":"331e762350dd6b65d8987be211ef43dc5bfe83525c9c479052807c17779040fb"} Dec 03 00:01:06 crc kubenswrapper[4903]: I1203 00:01:06.286174 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="331e762350dd6b65d8987be211ef43dc5bfe83525c9c479052807c17779040fb" Dec 03 00:01:06 crc kubenswrapper[4903]: I1203 00:01:06.285951 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-purge-29412001-pbz9p" Dec 03 00:01:06 crc kubenswrapper[4903]: I1203 00:01:06.290319 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-purge-29412001-9nk2t" Dec 03 00:01:06 crc kubenswrapper[4903]: I1203 00:01:06.291931 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-purge-29412001-9nk2t" event={"ID":"ce1d9817-bff6-40a4-bc9b-fcbd1510739c","Type":"ContainerDied","Data":"731d31643ffa9b13a2011af1ce0eb7bdb38275d53d310cb3fe6a8e518372a89c"} Dec 03 00:01:06 crc kubenswrapper[4903]: I1203 00:01:06.292018 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="731d31643ffa9b13a2011af1ce0eb7bdb38275d53d310cb3fe6a8e518372a89c" Dec 03 00:01:06 crc kubenswrapper[4903]: I1203 00:01:06.728053 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29412001-6287z" Dec 03 00:01:06 crc kubenswrapper[4903]: I1203 00:01:06.848754 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gtvc\" (UniqueName: \"kubernetes.io/projected/c445dbad-15ca-4171-ac03-0fd37dbdd474-kube-api-access-4gtvc\") pod \"c445dbad-15ca-4171-ac03-0fd37dbdd474\" (UID: \"c445dbad-15ca-4171-ac03-0fd37dbdd474\") " Dec 03 00:01:06 crc kubenswrapper[4903]: I1203 00:01:06.848830 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c445dbad-15ca-4171-ac03-0fd37dbdd474-config-data\") pod \"c445dbad-15ca-4171-ac03-0fd37dbdd474\" (UID: \"c445dbad-15ca-4171-ac03-0fd37dbdd474\") " Dec 03 00:01:06 crc kubenswrapper[4903]: I1203 00:01:06.848897 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c445dbad-15ca-4171-ac03-0fd37dbdd474-fernet-keys\") pod \"c445dbad-15ca-4171-ac03-0fd37dbdd474\" (UID: \"c445dbad-15ca-4171-ac03-0fd37dbdd474\") " Dec 03 00:01:06 crc kubenswrapper[4903]: I1203 00:01:06.848971 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c445dbad-15ca-4171-ac03-0fd37dbdd474-combined-ca-bundle\") pod \"c445dbad-15ca-4171-ac03-0fd37dbdd474\" (UID: \"c445dbad-15ca-4171-ac03-0fd37dbdd474\") " Dec 03 00:01:06 crc kubenswrapper[4903]: I1203 00:01:06.856836 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c445dbad-15ca-4171-ac03-0fd37dbdd474-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c445dbad-15ca-4171-ac03-0fd37dbdd474" (UID: "c445dbad-15ca-4171-ac03-0fd37dbdd474"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:01:06 crc kubenswrapper[4903]: I1203 00:01:06.859472 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c445dbad-15ca-4171-ac03-0fd37dbdd474-kube-api-access-4gtvc" (OuterVolumeSpecName: "kube-api-access-4gtvc") pod "c445dbad-15ca-4171-ac03-0fd37dbdd474" (UID: "c445dbad-15ca-4171-ac03-0fd37dbdd474"). InnerVolumeSpecName "kube-api-access-4gtvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:01:06 crc kubenswrapper[4903]: I1203 00:01:06.904723 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c445dbad-15ca-4171-ac03-0fd37dbdd474-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c445dbad-15ca-4171-ac03-0fd37dbdd474" (UID: "c445dbad-15ca-4171-ac03-0fd37dbdd474"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:01:06 crc kubenswrapper[4903]: I1203 00:01:06.951346 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c445dbad-15ca-4171-ac03-0fd37dbdd474-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 00:01:06 crc kubenswrapper[4903]: I1203 00:01:06.951382 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gtvc\" (UniqueName: \"kubernetes.io/projected/c445dbad-15ca-4171-ac03-0fd37dbdd474-kube-api-access-4gtvc\") on node \"crc\" DevicePath \"\"" Dec 03 00:01:06 crc kubenswrapper[4903]: I1203 00:01:06.951392 4903 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c445dbad-15ca-4171-ac03-0fd37dbdd474-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 00:01:06 crc kubenswrapper[4903]: I1203 00:01:06.954010 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c445dbad-15ca-4171-ac03-0fd37dbdd474-config-data" (OuterVolumeSpecName: "config-data") pod "c445dbad-15ca-4171-ac03-0fd37dbdd474" (UID: "c445dbad-15ca-4171-ac03-0fd37dbdd474"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:01:07 crc kubenswrapper[4903]: I1203 00:01:07.053729 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c445dbad-15ca-4171-ac03-0fd37dbdd474-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 00:01:07 crc kubenswrapper[4903]: I1203 00:01:07.309989 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412001-6287z" event={"ID":"c445dbad-15ca-4171-ac03-0fd37dbdd474","Type":"ContainerDied","Data":"8754888b0e1504469c1a05d273870d5505fc23e9aa9eddd01b9984c0c75ab4e8"} Dec 03 00:01:07 crc kubenswrapper[4903]: I1203 00:01:07.310036 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8754888b0e1504469c1a05d273870d5505fc23e9aa9eddd01b9984c0c75ab4e8" Dec 03 00:01:07 crc kubenswrapper[4903]: I1203 00:01:07.310018 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29412001-6287z" Dec 03 00:01:07 crc kubenswrapper[4903]: E1203 00:01:07.540291 4903 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc445dbad_15ca_4171_ac03_0fd37dbdd474.slice\": RecentStats: unable to find data in memory cache]" Dec 03 00:01:14 crc kubenswrapper[4903]: I1203 00:01:14.706491 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h94x7"] Dec 03 00:01:14 crc kubenswrapper[4903]: E1203 00:01:14.707519 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="334ce527-c86f-4991-bb5a-bb31f27acee1" containerName="cinder-db-purge" Dec 03 00:01:14 crc kubenswrapper[4903]: I1203 00:01:14.707540 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="334ce527-c86f-4991-bb5a-bb31f27acee1" containerName="cinder-db-purge" Dec 03 00:01:14 crc kubenswrapper[4903]: E1203 00:01:14.707576 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c445dbad-15ca-4171-ac03-0fd37dbdd474" containerName="keystone-cron" Dec 03 00:01:14 crc kubenswrapper[4903]: I1203 00:01:14.707584 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="c445dbad-15ca-4171-ac03-0fd37dbdd474" containerName="keystone-cron" Dec 03 00:01:14 crc kubenswrapper[4903]: E1203 00:01:14.707594 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce1d9817-bff6-40a4-bc9b-fcbd1510739c" containerName="glance-dbpurge" Dec 03 00:01:14 crc kubenswrapper[4903]: I1203 00:01:14.707602 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce1d9817-bff6-40a4-bc9b-fcbd1510739c" containerName="glance-dbpurge" Dec 03 00:01:14 crc kubenswrapper[4903]: I1203 00:01:14.707985 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="334ce527-c86f-4991-bb5a-bb31f27acee1" containerName="cinder-db-purge" Dec 03 00:01:14 crc kubenswrapper[4903]: I1203 00:01:14.708011 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="c445dbad-15ca-4171-ac03-0fd37dbdd474" containerName="keystone-cron" Dec 03 00:01:14 crc kubenswrapper[4903]: I1203 00:01:14.708029 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce1d9817-bff6-40a4-bc9b-fcbd1510739c" containerName="glance-dbpurge" Dec 03 00:01:14 crc kubenswrapper[4903]: I1203 00:01:14.709887 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h94x7" Dec 03 00:01:14 crc kubenswrapper[4903]: I1203 00:01:14.719758 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h94x7"] Dec 03 00:01:14 crc kubenswrapper[4903]: I1203 00:01:14.830917 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd0ff361-72b4-418a-b605-dffc67174ebe-utilities\") pod \"community-operators-h94x7\" (UID: \"bd0ff361-72b4-418a-b605-dffc67174ebe\") " pod="openshift-marketplace/community-operators-h94x7" Dec 03 00:01:14 crc kubenswrapper[4903]: I1203 00:01:14.831366 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55d5d\" (UniqueName: \"kubernetes.io/projected/bd0ff361-72b4-418a-b605-dffc67174ebe-kube-api-access-55d5d\") pod \"community-operators-h94x7\" (UID: \"bd0ff361-72b4-418a-b605-dffc67174ebe\") " pod="openshift-marketplace/community-operators-h94x7" Dec 03 00:01:14 crc kubenswrapper[4903]: I1203 00:01:14.831507 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd0ff361-72b4-418a-b605-dffc67174ebe-catalog-content\") pod \"community-operators-h94x7\" (UID: \"bd0ff361-72b4-418a-b605-dffc67174ebe\") " pod="openshift-marketplace/community-operators-h94x7" Dec 03 00:01:14 crc kubenswrapper[4903]: I1203 00:01:14.933324 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55d5d\" (UniqueName: \"kubernetes.io/projected/bd0ff361-72b4-418a-b605-dffc67174ebe-kube-api-access-55d5d\") pod \"community-operators-h94x7\" (UID: \"bd0ff361-72b4-418a-b605-dffc67174ebe\") " pod="openshift-marketplace/community-operators-h94x7" Dec 03 00:01:14 crc kubenswrapper[4903]: I1203 00:01:14.933481 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd0ff361-72b4-418a-b605-dffc67174ebe-catalog-content\") pod \"community-operators-h94x7\" (UID: \"bd0ff361-72b4-418a-b605-dffc67174ebe\") " pod="openshift-marketplace/community-operators-h94x7" Dec 03 00:01:14 crc kubenswrapper[4903]: I1203 00:01:14.933548 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd0ff361-72b4-418a-b605-dffc67174ebe-utilities\") pod \"community-operators-h94x7\" (UID: \"bd0ff361-72b4-418a-b605-dffc67174ebe\") " pod="openshift-marketplace/community-operators-h94x7" Dec 03 00:01:14 crc kubenswrapper[4903]: I1203 00:01:14.934243 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd0ff361-72b4-418a-b605-dffc67174ebe-catalog-content\") pod \"community-operators-h94x7\" (UID: \"bd0ff361-72b4-418a-b605-dffc67174ebe\") " pod="openshift-marketplace/community-operators-h94x7" Dec 03 00:01:14 crc kubenswrapper[4903]: I1203 00:01:14.934251 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd0ff361-72b4-418a-b605-dffc67174ebe-utilities\") pod \"community-operators-h94x7\" (UID: \"bd0ff361-72b4-418a-b605-dffc67174ebe\") " pod="openshift-marketplace/community-operators-h94x7" Dec 03 00:01:14 crc kubenswrapper[4903]: I1203 00:01:14.954394 4903 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-55d5d\" (UniqueName: \"kubernetes.io/projected/bd0ff361-72b4-418a-b605-dffc67174ebe-kube-api-access-55d5d\") pod \"community-operators-h94x7\" (UID: \"bd0ff361-72b4-418a-b605-dffc67174ebe\") " pod="openshift-marketplace/community-operators-h94x7" Dec 03 00:01:15 crc kubenswrapper[4903]: I1203 00:01:15.050036 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h94x7" Dec 03 00:01:15 crc kubenswrapper[4903]: I1203 00:01:15.670158 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h94x7"] Dec 03 00:01:16 crc kubenswrapper[4903]: I1203 00:01:16.402872 4903 generic.go:334] "Generic (PLEG): container finished" podID="bd0ff361-72b4-418a-b605-dffc67174ebe" containerID="f2d4fa09ef0d7f96131650e40e507b1883726d75c7962ced2e7a21725715a7f3" exitCode=0 Dec 03 00:01:16 crc kubenswrapper[4903]: I1203 00:01:16.402919 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h94x7" event={"ID":"bd0ff361-72b4-418a-b605-dffc67174ebe","Type":"ContainerDied","Data":"f2d4fa09ef0d7f96131650e40e507b1883726d75c7962ced2e7a21725715a7f3"} Dec 03 00:01:16 crc kubenswrapper[4903]: I1203 00:01:16.403171 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h94x7" event={"ID":"bd0ff361-72b4-418a-b605-dffc67174ebe","Type":"ContainerStarted","Data":"de4cf5394bc05e99adba649aababb6c76f180106ffaee73c39a228abdfeb195b"} Dec 03 00:01:16 crc kubenswrapper[4903]: I1203 00:01:16.404911 4903 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 00:01:17 crc kubenswrapper[4903]: I1203 00:01:17.614470 4903 scope.go:117] "RemoveContainer" containerID="9b0d8e1ad42f16d25d8ace883915db986228e905bf7242685a201d23108ad65b" Dec 03 00:01:17 crc kubenswrapper[4903]: E1203 00:01:17.615200 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:01:18 crc kubenswrapper[4903]: I1203 00:01:18.422482 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h94x7" event={"ID":"bd0ff361-72b4-418a-b605-dffc67174ebe","Type":"ContainerStarted","Data":"b1a8890ab7a8ca70511a73b84e0eacc37e630036f3084aa57150324656026907"} Dec 03 00:01:19 crc kubenswrapper[4903]: I1203 00:01:19.434604 4903 generic.go:334] "Generic (PLEG): container finished" podID="bd0ff361-72b4-418a-b605-dffc67174ebe" containerID="b1a8890ab7a8ca70511a73b84e0eacc37e630036f3084aa57150324656026907" exitCode=0 Dec 03 00:01:19 crc kubenswrapper[4903]: I1203 00:01:19.434729 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h94x7" event={"ID":"bd0ff361-72b4-418a-b605-dffc67174ebe","Type":"ContainerDied","Data":"b1a8890ab7a8ca70511a73b84e0eacc37e630036f3084aa57150324656026907"} Dec 03 00:01:20 crc kubenswrapper[4903]: I1203 00:01:20.446763 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h94x7" 
event={"ID":"bd0ff361-72b4-418a-b605-dffc67174ebe","Type":"ContainerStarted","Data":"6c4a5a8c5529a4979c4c4c641f8abe8c39c87148026c5076df8729d10b50b911"} Dec 03 00:01:20 crc kubenswrapper[4903]: I1203 00:01:20.470831 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-h94x7" podStartSLOduration=2.783027793 podStartE2EDuration="6.470808012s" podCreationTimestamp="2025-12-03 00:01:14 +0000 UTC" firstStartedPulling="2025-12-03 00:01:16.404591444 +0000 UTC m=+3815.113145737" lastFinishedPulling="2025-12-03 00:01:20.092371633 +0000 UTC m=+3818.800925956" observedRunningTime="2025-12-03 00:01:20.464758386 +0000 UTC m=+3819.173312709" watchObservedRunningTime="2025-12-03 00:01:20.470808012 +0000 UTC m=+3819.179362315" Dec 03 00:01:25 crc kubenswrapper[4903]: I1203 00:01:25.051157 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h94x7" Dec 03 00:01:25 crc kubenswrapper[4903]: I1203 00:01:25.051509 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h94x7" Dec 03 00:01:25 crc kubenswrapper[4903]: I1203 00:01:25.116700 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h94x7" Dec 03 00:01:25 crc kubenswrapper[4903]: I1203 00:01:25.555250 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h94x7" Dec 03 00:01:25 crc kubenswrapper[4903]: I1203 00:01:25.610378 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h94x7"] Dec 03 00:01:27 crc kubenswrapper[4903]: I1203 00:01:27.528390 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-h94x7" podUID="bd0ff361-72b4-418a-b605-dffc67174ebe" containerName="registry-server" containerID="cri-o://6c4a5a8c5529a4979c4c4c641f8abe8c39c87148026c5076df8729d10b50b911" gracePeriod=2 Dec 03 00:01:28 crc kubenswrapper[4903]: I1203 00:01:28.002778 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h94x7" Dec 03 00:01:28 crc kubenswrapper[4903]: I1203 00:01:28.153841 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55d5d\" (UniqueName: \"kubernetes.io/projected/bd0ff361-72b4-418a-b605-dffc67174ebe-kube-api-access-55d5d\") pod \"bd0ff361-72b4-418a-b605-dffc67174ebe\" (UID: \"bd0ff361-72b4-418a-b605-dffc67174ebe\") " Dec 03 00:01:28 crc kubenswrapper[4903]: I1203 00:01:28.153929 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd0ff361-72b4-418a-b605-dffc67174ebe-catalog-content\") pod \"bd0ff361-72b4-418a-b605-dffc67174ebe\" (UID: \"bd0ff361-72b4-418a-b605-dffc67174ebe\") " Dec 03 00:01:28 crc kubenswrapper[4903]: I1203 00:01:28.154234 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd0ff361-72b4-418a-b605-dffc67174ebe-utilities\") pod \"bd0ff361-72b4-418a-b605-dffc67174ebe\" (UID: \"bd0ff361-72b4-418a-b605-dffc67174ebe\") " Dec 03 00:01:28 crc kubenswrapper[4903]: I1203 00:01:28.168045 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd0ff361-72b4-418a-b605-dffc67174ebe-kube-api-access-55d5d" (OuterVolumeSpecName: "kube-api-access-55d5d") pod "bd0ff361-72b4-418a-b605-dffc67174ebe" (UID: "bd0ff361-72b4-418a-b605-dffc67174ebe"). InnerVolumeSpecName "kube-api-access-55d5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:01:28 crc kubenswrapper[4903]: I1203 00:01:28.172941 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd0ff361-72b4-418a-b605-dffc67174ebe-utilities" (OuterVolumeSpecName: "utilities") pod "bd0ff361-72b4-418a-b605-dffc67174ebe" (UID: "bd0ff361-72b4-418a-b605-dffc67174ebe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:01:28 crc kubenswrapper[4903]: I1203 00:01:28.236022 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd0ff361-72b4-418a-b605-dffc67174ebe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bd0ff361-72b4-418a-b605-dffc67174ebe" (UID: "bd0ff361-72b4-418a-b605-dffc67174ebe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:01:28 crc kubenswrapper[4903]: I1203 00:01:28.261100 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd0ff361-72b4-418a-b605-dffc67174ebe-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 00:01:28 crc kubenswrapper[4903]: I1203 00:01:28.261165 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55d5d\" (UniqueName: \"kubernetes.io/projected/bd0ff361-72b4-418a-b605-dffc67174ebe-kube-api-access-55d5d\") on node \"crc\" DevicePath \"\"" Dec 03 00:01:28 crc kubenswrapper[4903]: I1203 00:01:28.261178 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd0ff361-72b4-418a-b605-dffc67174ebe-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 00:01:28 crc kubenswrapper[4903]: I1203 00:01:28.537607 4903 generic.go:334] "Generic (PLEG): container finished" podID="bd0ff361-72b4-418a-b605-dffc67174ebe" containerID="6c4a5a8c5529a4979c4c4c641f8abe8c39c87148026c5076df8729d10b50b911" exitCode=0 Dec 03 00:01:28 crc kubenswrapper[4903]: I1203 00:01:28.537935 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h94x7" event={"ID":"bd0ff361-72b4-418a-b605-dffc67174ebe","Type":"ContainerDied","Data":"6c4a5a8c5529a4979c4c4c641f8abe8c39c87148026c5076df8729d10b50b911"} Dec 03 00:01:28 crc kubenswrapper[4903]: I1203 00:01:28.537968 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h94x7" event={"ID":"bd0ff361-72b4-418a-b605-dffc67174ebe","Type":"ContainerDied","Data":"de4cf5394bc05e99adba649aababb6c76f180106ffaee73c39a228abdfeb195b"} Dec 03 00:01:28 crc kubenswrapper[4903]: I1203 00:01:28.537987 4903 scope.go:117] "RemoveContainer" containerID="6c4a5a8c5529a4979c4c4c641f8abe8c39c87148026c5076df8729d10b50b911" Dec 03 00:01:28 crc kubenswrapper[4903]: I1203 00:01:28.538141 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h94x7" Dec 03 00:01:28 crc kubenswrapper[4903]: I1203 00:01:28.573758 4903 scope.go:117] "RemoveContainer" containerID="b1a8890ab7a8ca70511a73b84e0eacc37e630036f3084aa57150324656026907" Dec 03 00:01:28 crc kubenswrapper[4903]: I1203 00:01:28.586235 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h94x7"] Dec 03 00:01:28 crc kubenswrapper[4903]: I1203 00:01:28.597645 4903 scope.go:117] "RemoveContainer" containerID="f2d4fa09ef0d7f96131650e40e507b1883726d75c7962ced2e7a21725715a7f3" Dec 03 00:01:28 crc kubenswrapper[4903]: I1203 00:01:28.602320 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-h94x7"] Dec 03 00:01:28 crc kubenswrapper[4903]: I1203 00:01:28.612582 4903 scope.go:117] "RemoveContainer" containerID="9b0d8e1ad42f16d25d8ace883915db986228e905bf7242685a201d23108ad65b" Dec 03 00:01:28 crc kubenswrapper[4903]: E1203 00:01:28.613049 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:01:28 crc kubenswrapper[4903]: I1203 00:01:28.657030 4903 scope.go:117] "RemoveContainer" containerID="6c4a5a8c5529a4979c4c4c641f8abe8c39c87148026c5076df8729d10b50b911" Dec 03 00:01:28 crc kubenswrapper[4903]: E1203 00:01:28.657736 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c4a5a8c5529a4979c4c4c641f8abe8c39c87148026c5076df8729d10b50b911\": container with ID starting with 6c4a5a8c5529a4979c4c4c641f8abe8c39c87148026c5076df8729d10b50b911 not found: ID does not exist" containerID="6c4a5a8c5529a4979c4c4c641f8abe8c39c87148026c5076df8729d10b50b911" Dec 03 00:01:28 crc kubenswrapper[4903]: I1203 00:01:28.657817 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c4a5a8c5529a4979c4c4c641f8abe8c39c87148026c5076df8729d10b50b911"} err="failed to get container status \"6c4a5a8c5529a4979c4c4c641f8abe8c39c87148026c5076df8729d10b50b911\": rpc error: code = NotFound desc = could not find container \"6c4a5a8c5529a4979c4c4c641f8abe8c39c87148026c5076df8729d10b50b911\": container with ID starting with 6c4a5a8c5529a4979c4c4c641f8abe8c39c87148026c5076df8729d10b50b911 not found: ID does not exist" Dec 03 00:01:28 crc kubenswrapper[4903]: I1203 00:01:28.657887 4903 scope.go:117] "RemoveContainer" containerID="b1a8890ab7a8ca70511a73b84e0eacc37e630036f3084aa57150324656026907" Dec 03 00:01:28 crc kubenswrapper[4903]: E1203 00:01:28.658285 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1a8890ab7a8ca70511a73b84e0eacc37e630036f3084aa57150324656026907\": container with ID starting with b1a8890ab7a8ca70511a73b84e0eacc37e630036f3084aa57150324656026907 not found: ID does not exist" containerID="b1a8890ab7a8ca70511a73b84e0eacc37e630036f3084aa57150324656026907" Dec 03 00:01:28 crc kubenswrapper[4903]: I1203 00:01:28.658328 4903 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b1a8890ab7a8ca70511a73b84e0eacc37e630036f3084aa57150324656026907"} err="failed to get container status \"b1a8890ab7a8ca70511a73b84e0eacc37e630036f3084aa57150324656026907\": rpc error: code = NotFound desc = could not find container \"b1a8890ab7a8ca70511a73b84e0eacc37e630036f3084aa57150324656026907\": container with ID starting with b1a8890ab7a8ca70511a73b84e0eacc37e630036f3084aa57150324656026907 not found: ID does not exist" Dec 03 00:01:28 crc kubenswrapper[4903]: I1203 00:01:28.658353 4903 scope.go:117] "RemoveContainer" containerID="f2d4fa09ef0d7f96131650e40e507b1883726d75c7962ced2e7a21725715a7f3" Dec 03 00:01:28 crc kubenswrapper[4903]: E1203 00:01:28.658689 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2d4fa09ef0d7f96131650e40e507b1883726d75c7962ced2e7a21725715a7f3\": container with ID starting with f2d4fa09ef0d7f96131650e40e507b1883726d75c7962ced2e7a21725715a7f3 not found: ID does not exist" containerID="f2d4fa09ef0d7f96131650e40e507b1883726d75c7962ced2e7a21725715a7f3" Dec 03 00:01:28 crc kubenswrapper[4903]: I1203 00:01:28.658799 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2d4fa09ef0d7f96131650e40e507b1883726d75c7962ced2e7a21725715a7f3"} err="failed to get container status \"f2d4fa09ef0d7f96131650e40e507b1883726d75c7962ced2e7a21725715a7f3\": rpc error: code = NotFound desc = could not find container \"f2d4fa09ef0d7f96131650e40e507b1883726d75c7962ced2e7a21725715a7f3\": container with ID starting with f2d4fa09ef0d7f96131650e40e507b1883726d75c7962ced2e7a21725715a7f3 not found: ID does not exist" Dec 03 00:01:29 crc kubenswrapper[4903]: I1203 00:01:29.629033 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd0ff361-72b4-418a-b605-dffc67174ebe" path="/var/lib/kubelet/pods/bd0ff361-72b4-418a-b605-dffc67174ebe/volumes" Dec 03 00:01:43 crc kubenswrapper[4903]: I1203 00:01:43.612026 4903 scope.go:117] "RemoveContainer" containerID="9b0d8e1ad42f16d25d8ace883915db986228e905bf7242685a201d23108ad65b" Dec 03 00:01:43 crc kubenswrapper[4903]: E1203 00:01:43.613495 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:01:54 crc kubenswrapper[4903]: I1203 00:01:54.613540 4903 scope.go:117] "RemoveContainer" containerID="9b0d8e1ad42f16d25d8ace883915db986228e905bf7242685a201d23108ad65b" Dec 03 00:01:54 crc kubenswrapper[4903]: E1203 00:01:54.616192 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:02:06 crc kubenswrapper[4903]: I1203 00:02:06.613401 4903 scope.go:117] "RemoveContainer" containerID="9b0d8e1ad42f16d25d8ace883915db986228e905bf7242685a201d23108ad65b" Dec 03 00:02:06 crc kubenswrapper[4903]: E1203 00:02:06.614312 4903 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:02:17 crc kubenswrapper[4903]: I1203 00:02:17.614226 4903 scope.go:117] "RemoveContainer" containerID="9b0d8e1ad42f16d25d8ace883915db986228e905bf7242685a201d23108ad65b" Dec 03 00:02:17 crc kubenswrapper[4903]: E1203 00:02:17.616328 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:02:30 crc kubenswrapper[4903]: I1203 00:02:30.613447 4903 scope.go:117] "RemoveContainer" containerID="9b0d8e1ad42f16d25d8ace883915db986228e905bf7242685a201d23108ad65b" Dec 03 00:02:30 crc kubenswrapper[4903]: E1203 00:02:30.614441 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:02:41 crc kubenswrapper[4903]: I1203 00:02:41.633156 4903 scope.go:117] "RemoveContainer" containerID="9b0d8e1ad42f16d25d8ace883915db986228e905bf7242685a201d23108ad65b" Dec 03 00:02:41 crc kubenswrapper[4903]: E1203 00:02:41.638864 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:02:53 crc kubenswrapper[4903]: I1203 00:02:53.613207 4903 scope.go:117] "RemoveContainer" containerID="9b0d8e1ad42f16d25d8ace883915db986228e905bf7242685a201d23108ad65b" Dec 03 00:02:54 crc kubenswrapper[4903]: I1203 00:02:54.532519 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerStarted","Data":"70ae78185bce033151741dd6b3782e470c6fa11de56eade3dfb49ff2c5f74c20"} Dec 03 00:04:53 crc kubenswrapper[4903]: I1203 00:04:53.070178 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 00:04:53 crc kubenswrapper[4903]: I1203 00:04:53.070949 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 00:05:23 crc kubenswrapper[4903]: I1203 00:05:23.069779 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 00:05:23 crc kubenswrapper[4903]: I1203 00:05:23.070613 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 00:05:25 crc kubenswrapper[4903]: I1203 00:05:25.449222 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k4pwp"] Dec 03 00:05:25 crc kubenswrapper[4903]: E1203 00:05:25.450364 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd0ff361-72b4-418a-b605-dffc67174ebe" containerName="extract-content" Dec 03 00:05:25 crc kubenswrapper[4903]: I1203 00:05:25.450382 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd0ff361-72b4-418a-b605-dffc67174ebe" containerName="extract-content" Dec 03 00:05:25 crc kubenswrapper[4903]: E1203 00:05:25.450409 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd0ff361-72b4-418a-b605-dffc67174ebe" containerName="extract-utilities" Dec 03 00:05:25 crc kubenswrapper[4903]: I1203 00:05:25.450418 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd0ff361-72b4-418a-b605-dffc67174ebe" containerName="extract-utilities" Dec 03 00:05:25 crc kubenswrapper[4903]: E1203 00:05:25.450458 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd0ff361-72b4-418a-b605-dffc67174ebe" containerName="registry-server" Dec 03 00:05:25 crc kubenswrapper[4903]: I1203 00:05:25.450468 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd0ff361-72b4-418a-b605-dffc67174ebe" containerName="registry-server" Dec 03 00:05:25 crc kubenswrapper[4903]: I1203 00:05:25.450742 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd0ff361-72b4-418a-b605-dffc67174ebe" containerName="registry-server" Dec 03 00:05:25 crc kubenswrapper[4903]: I1203 00:05:25.452630 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k4pwp" Dec 03 00:05:25 crc kubenswrapper[4903]: I1203 00:05:25.463103 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k4pwp"] Dec 03 00:05:25 crc kubenswrapper[4903]: I1203 00:05:25.553999 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ae49558-22e0-4b43-b76f-e328a03bd6b7-catalog-content\") pod \"redhat-operators-k4pwp\" (UID: \"6ae49558-22e0-4b43-b76f-e328a03bd6b7\") " pod="openshift-marketplace/redhat-operators-k4pwp" Dec 03 00:05:25 crc kubenswrapper[4903]: I1203 00:05:25.554041 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ae49558-22e0-4b43-b76f-e328a03bd6b7-utilities\") pod \"redhat-operators-k4pwp\" (UID: \"6ae49558-22e0-4b43-b76f-e328a03bd6b7\") " pod="openshift-marketplace/redhat-operators-k4pwp" Dec 03 00:05:25 crc kubenswrapper[4903]: I1203 00:05:25.554215 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccvv8\" (UniqueName: \"kubernetes.io/projected/6ae49558-22e0-4b43-b76f-e328a03bd6b7-kube-api-access-ccvv8\") pod \"redhat-operators-k4pwp\" (UID: \"6ae49558-22e0-4b43-b76f-e328a03bd6b7\") " pod="openshift-marketplace/redhat-operators-k4pwp" Dec 03 00:05:25 crc kubenswrapper[4903]: I1203 00:05:25.658281 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ae49558-22e0-4b43-b76f-e328a03bd6b7-catalog-content\") pod \"redhat-operators-k4pwp\" (UID: \"6ae49558-22e0-4b43-b76f-e328a03bd6b7\") " pod="openshift-marketplace/redhat-operators-k4pwp" Dec 03 00:05:25 crc kubenswrapper[4903]: I1203 00:05:25.658351 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ae49558-22e0-4b43-b76f-e328a03bd6b7-utilities\") pod \"redhat-operators-k4pwp\" (UID: \"6ae49558-22e0-4b43-b76f-e328a03bd6b7\") " pod="openshift-marketplace/redhat-operators-k4pwp" Dec 03 00:05:25 crc kubenswrapper[4903]: I1203 00:05:25.658393 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccvv8\" (UniqueName: \"kubernetes.io/projected/6ae49558-22e0-4b43-b76f-e328a03bd6b7-kube-api-access-ccvv8\") pod \"redhat-operators-k4pwp\" (UID: \"6ae49558-22e0-4b43-b76f-e328a03bd6b7\") " pod="openshift-marketplace/redhat-operators-k4pwp" Dec 03 00:05:25 crc kubenswrapper[4903]: I1203 00:05:25.658950 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ae49558-22e0-4b43-b76f-e328a03bd6b7-catalog-content\") pod \"redhat-operators-k4pwp\" (UID: \"6ae49558-22e0-4b43-b76f-e328a03bd6b7\") " pod="openshift-marketplace/redhat-operators-k4pwp" Dec 03 00:05:25 crc kubenswrapper[4903]: I1203 00:05:25.659017 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ae49558-22e0-4b43-b76f-e328a03bd6b7-utilities\") pod \"redhat-operators-k4pwp\" (UID: \"6ae49558-22e0-4b43-b76f-e328a03bd6b7\") " pod="openshift-marketplace/redhat-operators-k4pwp" Dec 03 00:05:26 crc kubenswrapper[4903]: I1203 00:05:26.072819 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ccvv8\" (UniqueName: \"kubernetes.io/projected/6ae49558-22e0-4b43-b76f-e328a03bd6b7-kube-api-access-ccvv8\") pod \"redhat-operators-k4pwp\" (UID: \"6ae49558-22e0-4b43-b76f-e328a03bd6b7\") " pod="openshift-marketplace/redhat-operators-k4pwp" Dec 03 00:05:26 crc kubenswrapper[4903]: I1203 00:05:26.085447 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k4pwp" Dec 03 00:05:26 crc kubenswrapper[4903]: I1203 00:05:26.648878 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k4pwp"] Dec 03 00:05:27 crc kubenswrapper[4903]: I1203 00:05:27.078205 4903 generic.go:334] "Generic (PLEG): container finished" podID="6ae49558-22e0-4b43-b76f-e328a03bd6b7" containerID="6d275cca165cb129ebdb7813f9c5112fd806f9725de8ed8eca67cba45bd9869d" exitCode=0 Dec 03 00:05:27 crc kubenswrapper[4903]: I1203 00:05:27.078267 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4pwp" event={"ID":"6ae49558-22e0-4b43-b76f-e328a03bd6b7","Type":"ContainerDied","Data":"6d275cca165cb129ebdb7813f9c5112fd806f9725de8ed8eca67cba45bd9869d"} Dec 03 00:05:27 crc kubenswrapper[4903]: I1203 00:05:27.078545 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4pwp" event={"ID":"6ae49558-22e0-4b43-b76f-e328a03bd6b7","Type":"ContainerStarted","Data":"e0aaffb20f9b0572ce80f39c5a80d1462275c8b3aedffec723535426869bde1f"} Dec 03 00:05:28 crc kubenswrapper[4903]: I1203 00:05:28.033962 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pmtg9"] Dec 03 00:05:28 crc kubenswrapper[4903]: I1203 00:05:28.052868 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pmtg9" Dec 03 00:05:28 crc kubenswrapper[4903]: I1203 00:05:28.054515 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pmtg9"] Dec 03 00:05:28 crc kubenswrapper[4903]: I1203 00:05:28.110712 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af-utilities\") pod \"redhat-marketplace-pmtg9\" (UID: \"4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af\") " pod="openshift-marketplace/redhat-marketplace-pmtg9" Dec 03 00:05:28 crc kubenswrapper[4903]: I1203 00:05:28.111028 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af-catalog-content\") pod \"redhat-marketplace-pmtg9\" (UID: \"4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af\") " pod="openshift-marketplace/redhat-marketplace-pmtg9" Dec 03 00:05:28 crc kubenswrapper[4903]: I1203 00:05:28.111228 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq4kx\" (UniqueName: \"kubernetes.io/projected/4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af-kube-api-access-rq4kx\") pod \"redhat-marketplace-pmtg9\" (UID: \"4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af\") " pod="openshift-marketplace/redhat-marketplace-pmtg9" Dec 03 00:05:28 crc kubenswrapper[4903]: I1203 00:05:28.213564 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af-utilities\") pod \"redhat-marketplace-pmtg9\" (UID: \"4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af\") " pod="openshift-marketplace/redhat-marketplace-pmtg9" Dec 03 00:05:28 crc kubenswrapper[4903]: I1203 00:05:28.213633 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af-catalog-content\") pod \"redhat-marketplace-pmtg9\" (UID: \"4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af\") " pod="openshift-marketplace/redhat-marketplace-pmtg9" Dec 03 00:05:28 crc kubenswrapper[4903]: I1203 00:05:28.213714 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq4kx\" (UniqueName: \"kubernetes.io/projected/4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af-kube-api-access-rq4kx\") pod \"redhat-marketplace-pmtg9\" (UID: \"4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af\") " pod="openshift-marketplace/redhat-marketplace-pmtg9" Dec 03 00:05:28 crc kubenswrapper[4903]: I1203 00:05:28.214493 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af-utilities\") pod \"redhat-marketplace-pmtg9\" (UID: \"4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af\") " pod="openshift-marketplace/redhat-marketplace-pmtg9" Dec 03 00:05:28 crc kubenswrapper[4903]: I1203 00:05:28.214767 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af-catalog-content\") pod \"redhat-marketplace-pmtg9\" (UID: \"4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af\") " pod="openshift-marketplace/redhat-marketplace-pmtg9" Dec 03 00:05:28 crc kubenswrapper[4903]: I1203 00:05:28.279101 4903 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-rq4kx\" (UniqueName: \"kubernetes.io/projected/4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af-kube-api-access-rq4kx\") pod \"redhat-marketplace-pmtg9\" (UID: \"4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af\") " pod="openshift-marketplace/redhat-marketplace-pmtg9" Dec 03 00:05:28 crc kubenswrapper[4903]: I1203 00:05:28.381589 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pmtg9" Dec 03 00:05:28 crc kubenswrapper[4903]: I1203 00:05:28.900582 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pmtg9"] Dec 03 00:05:29 crc kubenswrapper[4903]: I1203 00:05:29.104085 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pmtg9" event={"ID":"4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af","Type":"ContainerStarted","Data":"852252ef75e965592dc4982789340992837464f12b13f0371bc212affe4f6989"} Dec 03 00:05:29 crc kubenswrapper[4903]: I1203 00:05:29.108247 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4pwp" event={"ID":"6ae49558-22e0-4b43-b76f-e328a03bd6b7","Type":"ContainerStarted","Data":"48529383ee81592a0599efd74e6b8161ff572d6561b324d83138f691f2324d07"} Dec 03 00:05:30 crc kubenswrapper[4903]: I1203 00:05:30.117882 4903 generic.go:334] "Generic (PLEG): container finished" podID="4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af" containerID="28d12fa306223bebf91f18ef500b272d215378531e4ce147779566c1f57e01f0" exitCode=0 Dec 03 00:05:30 crc kubenswrapper[4903]: I1203 00:05:30.117986 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pmtg9" event={"ID":"4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af","Type":"ContainerDied","Data":"28d12fa306223bebf91f18ef500b272d215378531e4ce147779566c1f57e01f0"} Dec 03 00:05:31 crc kubenswrapper[4903]: I1203 00:05:31.148892 4903 generic.go:334] "Generic (PLEG): container finished" podID="6ae49558-22e0-4b43-b76f-e328a03bd6b7" containerID="48529383ee81592a0599efd74e6b8161ff572d6561b324d83138f691f2324d07" exitCode=0 Dec 03 00:05:31 crc kubenswrapper[4903]: I1203 00:05:31.148970 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4pwp" event={"ID":"6ae49558-22e0-4b43-b76f-e328a03bd6b7","Type":"ContainerDied","Data":"48529383ee81592a0599efd74e6b8161ff572d6561b324d83138f691f2324d07"} Dec 03 00:05:32 crc kubenswrapper[4903]: I1203 00:05:32.171794 4903 generic.go:334] "Generic (PLEG): container finished" podID="4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af" containerID="f4d486bac17c66dc3ed10dc08f1efc869ede8a107c32fe4cc29f17c2acc9d62d" exitCode=0 Dec 03 00:05:32 crc kubenswrapper[4903]: I1203 00:05:32.171946 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pmtg9" event={"ID":"4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af","Type":"ContainerDied","Data":"f4d486bac17c66dc3ed10dc08f1efc869ede8a107c32fe4cc29f17c2acc9d62d"} Dec 03 00:05:33 crc kubenswrapper[4903]: I1203 00:05:33.183087 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pmtg9" event={"ID":"4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af","Type":"ContainerStarted","Data":"5d99eaef0275f772a7786632edf5e4a6f767f4b3b4a8d92509f453a69d349c01"} Dec 03 00:05:33 crc kubenswrapper[4903]: I1203 00:05:33.185719 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4pwp" 
event={"ID":"6ae49558-22e0-4b43-b76f-e328a03bd6b7","Type":"ContainerStarted","Data":"3dc80761d04a6e28d7a0577a70cde4db77b197fe7ca2851b4b4a1d880962cb6e"} Dec 03 00:05:33 crc kubenswrapper[4903]: I1203 00:05:33.207854 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pmtg9" podStartSLOduration=3.4490691030000002 podStartE2EDuration="6.207834976s" podCreationTimestamp="2025-12-03 00:05:27 +0000 UTC" firstStartedPulling="2025-12-03 00:05:30.120204331 +0000 UTC m=+4068.828758614" lastFinishedPulling="2025-12-03 00:05:32.878970204 +0000 UTC m=+4071.587524487" observedRunningTime="2025-12-03 00:05:33.204981977 +0000 UTC m=+4071.913536260" watchObservedRunningTime="2025-12-03 00:05:33.207834976 +0000 UTC m=+4071.916389259" Dec 03 00:05:33 crc kubenswrapper[4903]: I1203 00:05:33.248629 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k4pwp" podStartSLOduration=3.268591394 podStartE2EDuration="8.24860163s" podCreationTimestamp="2025-12-03 00:05:25 +0000 UTC" firstStartedPulling="2025-12-03 00:05:27.08163493 +0000 UTC m=+4065.790189233" lastFinishedPulling="2025-12-03 00:05:32.061645146 +0000 UTC m=+4070.770199469" observedRunningTime="2025-12-03 00:05:33.235216077 +0000 UTC m=+4071.943770370" watchObservedRunningTime="2025-12-03 00:05:33.24860163 +0000 UTC m=+4071.957155913" Dec 03 00:05:36 crc kubenswrapper[4903]: I1203 00:05:36.086147 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k4pwp" Dec 03 00:05:36 crc kubenswrapper[4903]: I1203 00:05:36.087618 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k4pwp" Dec 03 00:05:37 crc kubenswrapper[4903]: I1203 00:05:37.171826 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k4pwp" podUID="6ae49558-22e0-4b43-b76f-e328a03bd6b7" containerName="registry-server" probeResult="failure" output=< Dec 03 00:05:37 crc kubenswrapper[4903]: timeout: failed to connect service ":50051" within 1s Dec 03 00:05:37 crc kubenswrapper[4903]: > Dec 03 00:05:37 crc kubenswrapper[4903]: I1203 00:05:37.220033 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-spvhv"] Dec 03 00:05:37 crc kubenswrapper[4903]: I1203 00:05:37.222796 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-spvhv" Dec 03 00:05:37 crc kubenswrapper[4903]: I1203 00:05:37.230588 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-spvhv"] Dec 03 00:05:37 crc kubenswrapper[4903]: I1203 00:05:37.317861 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa42696a-099a-4d82-8e9f-91e0b96824c7-utilities\") pod \"certified-operators-spvhv\" (UID: \"aa42696a-099a-4d82-8e9f-91e0b96824c7\") " pod="openshift-marketplace/certified-operators-spvhv" Dec 03 00:05:37 crc kubenswrapper[4903]: I1203 00:05:37.318465 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa42696a-099a-4d82-8e9f-91e0b96824c7-catalog-content\") pod \"certified-operators-spvhv\" (UID: \"aa42696a-099a-4d82-8e9f-91e0b96824c7\") " pod="openshift-marketplace/certified-operators-spvhv" Dec 03 00:05:37 crc kubenswrapper[4903]: I1203 00:05:37.318510 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4mgt\" (UniqueName: \"kubernetes.io/projected/aa42696a-099a-4d82-8e9f-91e0b96824c7-kube-api-access-l4mgt\") pod \"certified-operators-spvhv\" (UID: \"aa42696a-099a-4d82-8e9f-91e0b96824c7\") " pod="openshift-marketplace/certified-operators-spvhv" Dec 03 00:05:37 crc kubenswrapper[4903]: I1203 00:05:37.420440 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa42696a-099a-4d82-8e9f-91e0b96824c7-utilities\") pod \"certified-operators-spvhv\" (UID: \"aa42696a-099a-4d82-8e9f-91e0b96824c7\") " pod="openshift-marketplace/certified-operators-spvhv" Dec 03 00:05:37 crc kubenswrapper[4903]: I1203 00:05:37.421005 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa42696a-099a-4d82-8e9f-91e0b96824c7-utilities\") pod \"certified-operators-spvhv\" (UID: \"aa42696a-099a-4d82-8e9f-91e0b96824c7\") " pod="openshift-marketplace/certified-operators-spvhv" Dec 03 00:05:37 crc kubenswrapper[4903]: I1203 00:05:37.421031 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa42696a-099a-4d82-8e9f-91e0b96824c7-catalog-content\") pod \"certified-operators-spvhv\" (UID: \"aa42696a-099a-4d82-8e9f-91e0b96824c7\") " pod="openshift-marketplace/certified-operators-spvhv" Dec 03 00:05:37 crc kubenswrapper[4903]: I1203 00:05:37.420641 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa42696a-099a-4d82-8e9f-91e0b96824c7-catalog-content\") pod \"certified-operators-spvhv\" (UID: \"aa42696a-099a-4d82-8e9f-91e0b96824c7\") " pod="openshift-marketplace/certified-operators-spvhv" Dec 03 00:05:37 crc kubenswrapper[4903]: I1203 00:05:37.421121 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4mgt\" (UniqueName: \"kubernetes.io/projected/aa42696a-099a-4d82-8e9f-91e0b96824c7-kube-api-access-l4mgt\") pod \"certified-operators-spvhv\" (UID: \"aa42696a-099a-4d82-8e9f-91e0b96824c7\") " pod="openshift-marketplace/certified-operators-spvhv" Dec 03 00:05:37 crc kubenswrapper[4903]: I1203 00:05:37.872422 4903 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-l4mgt\" (UniqueName: \"kubernetes.io/projected/aa42696a-099a-4d82-8e9f-91e0b96824c7-kube-api-access-l4mgt\") pod \"certified-operators-spvhv\" (UID: \"aa42696a-099a-4d82-8e9f-91e0b96824c7\") " pod="openshift-marketplace/certified-operators-spvhv" Dec 03 00:05:37 crc kubenswrapper[4903]: I1203 00:05:37.893174 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-spvhv" Dec 03 00:05:38 crc kubenswrapper[4903]: I1203 00:05:38.391282 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pmtg9" Dec 03 00:05:38 crc kubenswrapper[4903]: I1203 00:05:38.392831 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pmtg9" Dec 03 00:05:38 crc kubenswrapper[4903]: I1203 00:05:38.439870 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-spvhv"] Dec 03 00:05:38 crc kubenswrapper[4903]: W1203 00:05:38.446740 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa42696a_099a_4d82_8e9f_91e0b96824c7.slice/crio-9bb415ce914c66e6d79631b15e38336e664fbae780448040dd37c4b8a37ad979 WatchSource:0}: Error finding container 9bb415ce914c66e6d79631b15e38336e664fbae780448040dd37c4b8a37ad979: Status 404 returned error can't find the container with id 9bb415ce914c66e6d79631b15e38336e664fbae780448040dd37c4b8a37ad979 Dec 03 00:05:38 crc kubenswrapper[4903]: I1203 00:05:38.462262 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pmtg9" Dec 03 00:05:39 crc kubenswrapper[4903]: I1203 00:05:39.252975 4903 generic.go:334] "Generic (PLEG): container finished" podID="aa42696a-099a-4d82-8e9f-91e0b96824c7" containerID="4e492abb1edb3aad45795792e34a92678a45f88d7644c8259eed646c73c7ae53" exitCode=0 Dec 03 00:05:39 crc kubenswrapper[4903]: I1203 00:05:39.253037 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-spvhv" event={"ID":"aa42696a-099a-4d82-8e9f-91e0b96824c7","Type":"ContainerDied","Data":"4e492abb1edb3aad45795792e34a92678a45f88d7644c8259eed646c73c7ae53"} Dec 03 00:05:39 crc kubenswrapper[4903]: I1203 00:05:39.253364 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-spvhv" event={"ID":"aa42696a-099a-4d82-8e9f-91e0b96824c7","Type":"ContainerStarted","Data":"9bb415ce914c66e6d79631b15e38336e664fbae780448040dd37c4b8a37ad979"} Dec 03 00:05:39 crc kubenswrapper[4903]: I1203 00:05:39.309516 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pmtg9" Dec 03 00:05:40 crc kubenswrapper[4903]: I1203 00:05:40.264273 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-spvhv" event={"ID":"aa42696a-099a-4d82-8e9f-91e0b96824c7","Type":"ContainerStarted","Data":"a2eb9d1f00bb5fb3df15f6bdd00bb3d137c62f92293372a1d01640c3bc2b0a85"} Dec 03 00:05:41 crc kubenswrapper[4903]: I1203 00:05:41.279239 4903 generic.go:334] "Generic (PLEG): container finished" podID="aa42696a-099a-4d82-8e9f-91e0b96824c7" containerID="a2eb9d1f00bb5fb3df15f6bdd00bb3d137c62f92293372a1d01640c3bc2b0a85" exitCode=0 Dec 03 00:05:41 crc kubenswrapper[4903]: I1203 00:05:41.279348 4903 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-spvhv" event={"ID":"aa42696a-099a-4d82-8e9f-91e0b96824c7","Type":"ContainerDied","Data":"a2eb9d1f00bb5fb3df15f6bdd00bb3d137c62f92293372a1d01640c3bc2b0a85"} Dec 03 00:05:41 crc kubenswrapper[4903]: I1203 00:05:41.804071 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pmtg9"] Dec 03 00:05:42 crc kubenswrapper[4903]: I1203 00:05:42.296482 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-spvhv" event={"ID":"aa42696a-099a-4d82-8e9f-91e0b96824c7","Type":"ContainerStarted","Data":"29dcfd2824b5b6d94aa1b6e80c7b6818f71faea73288741b0d24ec2527ccf733"} Dec 03 00:05:42 crc kubenswrapper[4903]: I1203 00:05:42.296766 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pmtg9" podUID="4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af" containerName="registry-server" containerID="cri-o://5d99eaef0275f772a7786632edf5e4a6f767f4b3b4a8d92509f453a69d349c01" gracePeriod=2 Dec 03 00:05:42 crc kubenswrapper[4903]: I1203 00:05:42.324622 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-spvhv" podStartSLOduration=2.875487066 podStartE2EDuration="5.324605533s" podCreationTimestamp="2025-12-03 00:05:37 +0000 UTC" firstStartedPulling="2025-12-03 00:05:39.256285508 +0000 UTC m=+4077.964839791" lastFinishedPulling="2025-12-03 00:05:41.705403975 +0000 UTC m=+4080.413958258" observedRunningTime="2025-12-03 00:05:42.321469727 +0000 UTC m=+4081.030024030" watchObservedRunningTime="2025-12-03 00:05:42.324605533 +0000 UTC m=+4081.033159816" Dec 03 00:05:42 crc kubenswrapper[4903]: I1203 00:05:42.772597 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pmtg9" Dec 03 00:05:42 crc kubenswrapper[4903]: I1203 00:05:42.951405 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af-catalog-content\") pod \"4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af\" (UID: \"4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af\") " Dec 03 00:05:42 crc kubenswrapper[4903]: I1203 00:05:42.951563 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af-utilities\") pod \"4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af\" (UID: \"4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af\") " Dec 03 00:05:42 crc kubenswrapper[4903]: I1203 00:05:42.951761 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rq4kx\" (UniqueName: \"kubernetes.io/projected/4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af-kube-api-access-rq4kx\") pod \"4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af\" (UID: \"4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af\") " Dec 03 00:05:42 crc kubenswrapper[4903]: I1203 00:05:42.952358 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af-utilities" (OuterVolumeSpecName: "utilities") pod "4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af" (UID: "4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:05:42 crc kubenswrapper[4903]: I1203 00:05:42.957893 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af-kube-api-access-rq4kx" (OuterVolumeSpecName: "kube-api-access-rq4kx") pod "4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af" (UID: "4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af"). InnerVolumeSpecName "kube-api-access-rq4kx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:05:42 crc kubenswrapper[4903]: I1203 00:05:42.993762 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af" (UID: "4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:05:43 crc kubenswrapper[4903]: I1203 00:05:43.054170 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 00:05:43 crc kubenswrapper[4903]: I1203 00:05:43.054454 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 00:05:43 crc kubenswrapper[4903]: I1203 00:05:43.054467 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rq4kx\" (UniqueName: \"kubernetes.io/projected/4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af-kube-api-access-rq4kx\") on node \"crc\" DevicePath \"\"" Dec 03 00:05:43 crc kubenswrapper[4903]: I1203 00:05:43.311369 4903 generic.go:334] "Generic (PLEG): container finished" podID="4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af" containerID="5d99eaef0275f772a7786632edf5e4a6f767f4b3b4a8d92509f453a69d349c01" exitCode=0 Dec 03 00:05:43 crc kubenswrapper[4903]: I1203 00:05:43.311418 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pmtg9" event={"ID":"4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af","Type":"ContainerDied","Data":"5d99eaef0275f772a7786632edf5e4a6f767f4b3b4a8d92509f453a69d349c01"} Dec 03 00:05:43 crc kubenswrapper[4903]: I1203 00:05:43.311472 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pmtg9" event={"ID":"4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af","Type":"ContainerDied","Data":"852252ef75e965592dc4982789340992837464f12b13f0371bc212affe4f6989"} Dec 03 00:05:43 crc kubenswrapper[4903]: I1203 00:05:43.311490 4903 scope.go:117] "RemoveContainer" containerID="5d99eaef0275f772a7786632edf5e4a6f767f4b3b4a8d92509f453a69d349c01" Dec 03 00:05:43 crc kubenswrapper[4903]: I1203 00:05:43.311490 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pmtg9" Dec 03 00:05:43 crc kubenswrapper[4903]: I1203 00:05:43.363053 4903 scope.go:117] "RemoveContainer" containerID="f4d486bac17c66dc3ed10dc08f1efc869ede8a107c32fe4cc29f17c2acc9d62d" Dec 03 00:05:43 crc kubenswrapper[4903]: I1203 00:05:43.363209 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pmtg9"] Dec 03 00:05:43 crc kubenswrapper[4903]: I1203 00:05:43.372478 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pmtg9"] Dec 03 00:05:43 crc kubenswrapper[4903]: I1203 00:05:43.422826 4903 scope.go:117] "RemoveContainer" containerID="28d12fa306223bebf91f18ef500b272d215378531e4ce147779566c1f57e01f0" Dec 03 00:05:43 crc kubenswrapper[4903]: I1203 00:05:43.458397 4903 scope.go:117] "RemoveContainer" containerID="5d99eaef0275f772a7786632edf5e4a6f767f4b3b4a8d92509f453a69d349c01" Dec 03 00:05:43 crc kubenswrapper[4903]: E1203 00:05:43.462850 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d99eaef0275f772a7786632edf5e4a6f767f4b3b4a8d92509f453a69d349c01\": container with ID starting with 5d99eaef0275f772a7786632edf5e4a6f767f4b3b4a8d92509f453a69d349c01 not found: ID does not exist" containerID="5d99eaef0275f772a7786632edf5e4a6f767f4b3b4a8d92509f453a69d349c01" Dec 03 00:05:43 crc kubenswrapper[4903]: I1203 00:05:43.462901 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d99eaef0275f772a7786632edf5e4a6f767f4b3b4a8d92509f453a69d349c01"} err="failed to get container status \"5d99eaef0275f772a7786632edf5e4a6f767f4b3b4a8d92509f453a69d349c01\": rpc error: code = NotFound desc = could not find container \"5d99eaef0275f772a7786632edf5e4a6f767f4b3b4a8d92509f453a69d349c01\": container with ID starting with 5d99eaef0275f772a7786632edf5e4a6f767f4b3b4a8d92509f453a69d349c01 not found: ID does not exist" Dec 03 00:05:43 crc kubenswrapper[4903]: I1203 00:05:43.462930 4903 scope.go:117] "RemoveContainer" containerID="f4d486bac17c66dc3ed10dc08f1efc869ede8a107c32fe4cc29f17c2acc9d62d" Dec 03 00:05:43 crc kubenswrapper[4903]: E1203 00:05:43.463721 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4d486bac17c66dc3ed10dc08f1efc869ede8a107c32fe4cc29f17c2acc9d62d\": container with ID starting with f4d486bac17c66dc3ed10dc08f1efc869ede8a107c32fe4cc29f17c2acc9d62d not found: ID does not exist" containerID="f4d486bac17c66dc3ed10dc08f1efc869ede8a107c32fe4cc29f17c2acc9d62d" Dec 03 00:05:43 crc kubenswrapper[4903]: I1203 00:05:43.463746 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4d486bac17c66dc3ed10dc08f1efc869ede8a107c32fe4cc29f17c2acc9d62d"} err="failed to get container status \"f4d486bac17c66dc3ed10dc08f1efc869ede8a107c32fe4cc29f17c2acc9d62d\": rpc error: code = NotFound desc = could not find container \"f4d486bac17c66dc3ed10dc08f1efc869ede8a107c32fe4cc29f17c2acc9d62d\": container with ID starting with f4d486bac17c66dc3ed10dc08f1efc869ede8a107c32fe4cc29f17c2acc9d62d not found: ID does not exist" Dec 03 00:05:43 crc kubenswrapper[4903]: I1203 00:05:43.463759 4903 scope.go:117] "RemoveContainer" containerID="28d12fa306223bebf91f18ef500b272d215378531e4ce147779566c1f57e01f0" Dec 03 00:05:43 crc kubenswrapper[4903]: E1203 00:05:43.467772 4903 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"28d12fa306223bebf91f18ef500b272d215378531e4ce147779566c1f57e01f0\": container with ID starting with 28d12fa306223bebf91f18ef500b272d215378531e4ce147779566c1f57e01f0 not found: ID does not exist" containerID="28d12fa306223bebf91f18ef500b272d215378531e4ce147779566c1f57e01f0" Dec 03 00:05:43 crc kubenswrapper[4903]: I1203 00:05:43.467805 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28d12fa306223bebf91f18ef500b272d215378531e4ce147779566c1f57e01f0"} err="failed to get container status \"28d12fa306223bebf91f18ef500b272d215378531e4ce147779566c1f57e01f0\": rpc error: code = NotFound desc = could not find container \"28d12fa306223bebf91f18ef500b272d215378531e4ce147779566c1f57e01f0\": container with ID starting with 28d12fa306223bebf91f18ef500b272d215378531e4ce147779566c1f57e01f0 not found: ID does not exist" Dec 03 00:05:43 crc kubenswrapper[4903]: I1203 00:05:43.628341 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af" path="/var/lib/kubelet/pods/4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af/volumes" Dec 03 00:05:46 crc kubenswrapper[4903]: I1203 00:05:46.161487 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k4pwp" Dec 03 00:05:46 crc kubenswrapper[4903]: I1203 00:05:46.242112 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k4pwp" Dec 03 00:05:46 crc kubenswrapper[4903]: I1203 00:05:46.809182 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k4pwp"] Dec 03 00:05:47 crc kubenswrapper[4903]: I1203 00:05:47.359606 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-k4pwp" podUID="6ae49558-22e0-4b43-b76f-e328a03bd6b7" containerName="registry-server" containerID="cri-o://3dc80761d04a6e28d7a0577a70cde4db77b197fe7ca2851b4b4a1d880962cb6e" gracePeriod=2 Dec 03 00:05:47 crc kubenswrapper[4903]: I1203 00:05:47.815956 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k4pwp" Dec 03 00:05:47 crc kubenswrapper[4903]: I1203 00:05:47.893845 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-spvhv" Dec 03 00:05:47 crc kubenswrapper[4903]: I1203 00:05:47.893930 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-spvhv" Dec 03 00:05:47 crc kubenswrapper[4903]: I1203 00:05:47.960516 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-spvhv" Dec 03 00:05:47 crc kubenswrapper[4903]: I1203 00:05:47.967800 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccvv8\" (UniqueName: \"kubernetes.io/projected/6ae49558-22e0-4b43-b76f-e328a03bd6b7-kube-api-access-ccvv8\") pod \"6ae49558-22e0-4b43-b76f-e328a03bd6b7\" (UID: \"6ae49558-22e0-4b43-b76f-e328a03bd6b7\") " Dec 03 00:05:47 crc kubenswrapper[4903]: I1203 00:05:47.967967 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ae49558-22e0-4b43-b76f-e328a03bd6b7-catalog-content\") pod \"6ae49558-22e0-4b43-b76f-e328a03bd6b7\" (UID: \"6ae49558-22e0-4b43-b76f-e328a03bd6b7\") " Dec 03 00:05:47 crc kubenswrapper[4903]: I1203 00:05:47.968027 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ae49558-22e0-4b43-b76f-e328a03bd6b7-utilities\") pod \"6ae49558-22e0-4b43-b76f-e328a03bd6b7\" (UID: \"6ae49558-22e0-4b43-b76f-e328a03bd6b7\") " Dec 03 00:05:47 crc kubenswrapper[4903]: I1203 00:05:47.968609 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ae49558-22e0-4b43-b76f-e328a03bd6b7-utilities" (OuterVolumeSpecName: "utilities") pod "6ae49558-22e0-4b43-b76f-e328a03bd6b7" (UID: "6ae49558-22e0-4b43-b76f-e328a03bd6b7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:05:47 crc kubenswrapper[4903]: I1203 00:05:47.969312 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ae49558-22e0-4b43-b76f-e328a03bd6b7-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 00:05:47 crc kubenswrapper[4903]: I1203 00:05:47.979165 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ae49558-22e0-4b43-b76f-e328a03bd6b7-kube-api-access-ccvv8" (OuterVolumeSpecName: "kube-api-access-ccvv8") pod "6ae49558-22e0-4b43-b76f-e328a03bd6b7" (UID: "6ae49558-22e0-4b43-b76f-e328a03bd6b7"). InnerVolumeSpecName "kube-api-access-ccvv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:05:48 crc kubenswrapper[4903]: I1203 00:05:48.071556 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccvv8\" (UniqueName: \"kubernetes.io/projected/6ae49558-22e0-4b43-b76f-e328a03bd6b7-kube-api-access-ccvv8\") on node \"crc\" DevicePath \"\"" Dec 03 00:05:48 crc kubenswrapper[4903]: I1203 00:05:48.124584 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ae49558-22e0-4b43-b76f-e328a03bd6b7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6ae49558-22e0-4b43-b76f-e328a03bd6b7" (UID: "6ae49558-22e0-4b43-b76f-e328a03bd6b7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:05:48 crc kubenswrapper[4903]: I1203 00:05:48.174285 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ae49558-22e0-4b43-b76f-e328a03bd6b7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 00:05:48 crc kubenswrapper[4903]: I1203 00:05:48.375116 4903 generic.go:334] "Generic (PLEG): container finished" podID="6ae49558-22e0-4b43-b76f-e328a03bd6b7" containerID="3dc80761d04a6e28d7a0577a70cde4db77b197fe7ca2851b4b4a1d880962cb6e" exitCode=0 Dec 03 00:05:48 crc kubenswrapper[4903]: I1203 00:05:48.375232 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4pwp" event={"ID":"6ae49558-22e0-4b43-b76f-e328a03bd6b7","Type":"ContainerDied","Data":"3dc80761d04a6e28d7a0577a70cde4db77b197fe7ca2851b4b4a1d880962cb6e"} Dec 03 00:05:48 crc kubenswrapper[4903]: I1203 00:05:48.375292 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4pwp" event={"ID":"6ae49558-22e0-4b43-b76f-e328a03bd6b7","Type":"ContainerDied","Data":"e0aaffb20f9b0572ce80f39c5a80d1462275c8b3aedffec723535426869bde1f"} Dec 03 00:05:48 crc kubenswrapper[4903]: I1203 00:05:48.375205 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k4pwp" Dec 03 00:05:48 crc kubenswrapper[4903]: I1203 00:05:48.375317 4903 scope.go:117] "RemoveContainer" containerID="3dc80761d04a6e28d7a0577a70cde4db77b197fe7ca2851b4b4a1d880962cb6e" Dec 03 00:05:48 crc kubenswrapper[4903]: I1203 00:05:48.402094 4903 scope.go:117] "RemoveContainer" containerID="48529383ee81592a0599efd74e6b8161ff572d6561b324d83138f691f2324d07" Dec 03 00:05:48 crc kubenswrapper[4903]: I1203 00:05:48.424149 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k4pwp"] Dec 03 00:05:48 crc kubenswrapper[4903]: I1203 00:05:48.433718 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-k4pwp"] Dec 03 00:05:48 crc kubenswrapper[4903]: I1203 00:05:48.452683 4903 scope.go:117] "RemoveContainer" containerID="6d275cca165cb129ebdb7813f9c5112fd806f9725de8ed8eca67cba45bd9869d" Dec 03 00:05:48 crc kubenswrapper[4903]: I1203 00:05:48.456730 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-spvhv" Dec 03 00:05:48 crc kubenswrapper[4903]: I1203 00:05:48.479799 4903 scope.go:117] "RemoveContainer" containerID="3dc80761d04a6e28d7a0577a70cde4db77b197fe7ca2851b4b4a1d880962cb6e" Dec 03 00:05:48 crc kubenswrapper[4903]: E1203 00:05:48.480301 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dc80761d04a6e28d7a0577a70cde4db77b197fe7ca2851b4b4a1d880962cb6e\": container with ID starting with 3dc80761d04a6e28d7a0577a70cde4db77b197fe7ca2851b4b4a1d880962cb6e not found: ID does not exist" containerID="3dc80761d04a6e28d7a0577a70cde4db77b197fe7ca2851b4b4a1d880962cb6e" Dec 03 00:05:48 crc kubenswrapper[4903]: I1203 00:05:48.480340 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dc80761d04a6e28d7a0577a70cde4db77b197fe7ca2851b4b4a1d880962cb6e"} err="failed to get container status \"3dc80761d04a6e28d7a0577a70cde4db77b197fe7ca2851b4b4a1d880962cb6e\": rpc error: code = NotFound desc = could not find container 
\"3dc80761d04a6e28d7a0577a70cde4db77b197fe7ca2851b4b4a1d880962cb6e\": container with ID starting with 3dc80761d04a6e28d7a0577a70cde4db77b197fe7ca2851b4b4a1d880962cb6e not found: ID does not exist" Dec 03 00:05:48 crc kubenswrapper[4903]: I1203 00:05:48.480363 4903 scope.go:117] "RemoveContainer" containerID="48529383ee81592a0599efd74e6b8161ff572d6561b324d83138f691f2324d07" Dec 03 00:05:48 crc kubenswrapper[4903]: E1203 00:05:48.480718 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48529383ee81592a0599efd74e6b8161ff572d6561b324d83138f691f2324d07\": container with ID starting with 48529383ee81592a0599efd74e6b8161ff572d6561b324d83138f691f2324d07 not found: ID does not exist" containerID="48529383ee81592a0599efd74e6b8161ff572d6561b324d83138f691f2324d07" Dec 03 00:05:48 crc kubenswrapper[4903]: I1203 00:05:48.480745 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48529383ee81592a0599efd74e6b8161ff572d6561b324d83138f691f2324d07"} err="failed to get container status \"48529383ee81592a0599efd74e6b8161ff572d6561b324d83138f691f2324d07\": rpc error: code = NotFound desc = could not find container \"48529383ee81592a0599efd74e6b8161ff572d6561b324d83138f691f2324d07\": container with ID starting with 48529383ee81592a0599efd74e6b8161ff572d6561b324d83138f691f2324d07 not found: ID does not exist" Dec 03 00:05:48 crc kubenswrapper[4903]: I1203 00:05:48.480762 4903 scope.go:117] "RemoveContainer" containerID="6d275cca165cb129ebdb7813f9c5112fd806f9725de8ed8eca67cba45bd9869d" Dec 03 00:05:48 crc kubenswrapper[4903]: E1203 00:05:48.488455 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d275cca165cb129ebdb7813f9c5112fd806f9725de8ed8eca67cba45bd9869d\": container with ID starting with 6d275cca165cb129ebdb7813f9c5112fd806f9725de8ed8eca67cba45bd9869d not found: ID does not exist" containerID="6d275cca165cb129ebdb7813f9c5112fd806f9725de8ed8eca67cba45bd9869d" Dec 03 00:05:48 crc kubenswrapper[4903]: I1203 00:05:48.488545 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d275cca165cb129ebdb7813f9c5112fd806f9725de8ed8eca67cba45bd9869d"} err="failed to get container status \"6d275cca165cb129ebdb7813f9c5112fd806f9725de8ed8eca67cba45bd9869d\": rpc error: code = NotFound desc = could not find container \"6d275cca165cb129ebdb7813f9c5112fd806f9725de8ed8eca67cba45bd9869d\": container with ID starting with 6d275cca165cb129ebdb7813f9c5112fd806f9725de8ed8eca67cba45bd9869d not found: ID does not exist" Dec 03 00:05:49 crc kubenswrapper[4903]: I1203 00:05:49.632219 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ae49558-22e0-4b43-b76f-e328a03bd6b7" path="/var/lib/kubelet/pods/6ae49558-22e0-4b43-b76f-e328a03bd6b7/volumes" Dec 03 00:05:50 crc kubenswrapper[4903]: I1203 00:05:50.210079 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-spvhv"] Dec 03 00:05:50 crc kubenswrapper[4903]: I1203 00:05:50.396884 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-spvhv" podUID="aa42696a-099a-4d82-8e9f-91e0b96824c7" containerName="registry-server" containerID="cri-o://29dcfd2824b5b6d94aa1b6e80c7b6818f71faea73288741b0d24ec2527ccf733" gracePeriod=2 Dec 03 00:05:50 crc kubenswrapper[4903]: I1203 00:05:50.879196 4903 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-spvhv" Dec 03 00:05:51 crc kubenswrapper[4903]: I1203 00:05:51.033357 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4mgt\" (UniqueName: \"kubernetes.io/projected/aa42696a-099a-4d82-8e9f-91e0b96824c7-kube-api-access-l4mgt\") pod \"aa42696a-099a-4d82-8e9f-91e0b96824c7\" (UID: \"aa42696a-099a-4d82-8e9f-91e0b96824c7\") " Dec 03 00:05:51 crc kubenswrapper[4903]: I1203 00:05:51.033472 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa42696a-099a-4d82-8e9f-91e0b96824c7-catalog-content\") pod \"aa42696a-099a-4d82-8e9f-91e0b96824c7\" (UID: \"aa42696a-099a-4d82-8e9f-91e0b96824c7\") " Dec 03 00:05:51 crc kubenswrapper[4903]: I1203 00:05:51.033525 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa42696a-099a-4d82-8e9f-91e0b96824c7-utilities\") pod \"aa42696a-099a-4d82-8e9f-91e0b96824c7\" (UID: \"aa42696a-099a-4d82-8e9f-91e0b96824c7\") " Dec 03 00:05:51 crc kubenswrapper[4903]: I1203 00:05:51.034474 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa42696a-099a-4d82-8e9f-91e0b96824c7-utilities" (OuterVolumeSpecName: "utilities") pod "aa42696a-099a-4d82-8e9f-91e0b96824c7" (UID: "aa42696a-099a-4d82-8e9f-91e0b96824c7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:05:51 crc kubenswrapper[4903]: I1203 00:05:51.043593 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa42696a-099a-4d82-8e9f-91e0b96824c7-kube-api-access-l4mgt" (OuterVolumeSpecName: "kube-api-access-l4mgt") pod "aa42696a-099a-4d82-8e9f-91e0b96824c7" (UID: "aa42696a-099a-4d82-8e9f-91e0b96824c7"). InnerVolumeSpecName "kube-api-access-l4mgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:05:51 crc kubenswrapper[4903]: I1203 00:05:51.086134 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa42696a-099a-4d82-8e9f-91e0b96824c7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aa42696a-099a-4d82-8e9f-91e0b96824c7" (UID: "aa42696a-099a-4d82-8e9f-91e0b96824c7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:05:51 crc kubenswrapper[4903]: I1203 00:05:51.136474 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa42696a-099a-4d82-8e9f-91e0b96824c7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 00:05:51 crc kubenswrapper[4903]: I1203 00:05:51.136505 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa42696a-099a-4d82-8e9f-91e0b96824c7-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 00:05:51 crc kubenswrapper[4903]: I1203 00:05:51.136517 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4mgt\" (UniqueName: \"kubernetes.io/projected/aa42696a-099a-4d82-8e9f-91e0b96824c7-kube-api-access-l4mgt\") on node \"crc\" DevicePath \"\"" Dec 03 00:05:51 crc kubenswrapper[4903]: I1203 00:05:51.409835 4903 generic.go:334] "Generic (PLEG): container finished" podID="aa42696a-099a-4d82-8e9f-91e0b96824c7" containerID="29dcfd2824b5b6d94aa1b6e80c7b6818f71faea73288741b0d24ec2527ccf733" exitCode=0 Dec 03 00:05:51 crc kubenswrapper[4903]: I1203 00:05:51.410026 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-spvhv" event={"ID":"aa42696a-099a-4d82-8e9f-91e0b96824c7","Type":"ContainerDied","Data":"29dcfd2824b5b6d94aa1b6e80c7b6818f71faea73288741b0d24ec2527ccf733"} Dec 03 00:05:51 crc kubenswrapper[4903]: I1203 00:05:51.410075 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-spvhv" event={"ID":"aa42696a-099a-4d82-8e9f-91e0b96824c7","Type":"ContainerDied","Data":"9bb415ce914c66e6d79631b15e38336e664fbae780448040dd37c4b8a37ad979"} Dec 03 00:05:51 crc kubenswrapper[4903]: I1203 00:05:51.410097 4903 scope.go:117] "RemoveContainer" containerID="29dcfd2824b5b6d94aa1b6e80c7b6818f71faea73288741b0d24ec2527ccf733" Dec 03 00:05:51 crc kubenswrapper[4903]: I1203 00:05:51.410114 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-spvhv" Dec 03 00:05:51 crc kubenswrapper[4903]: I1203 00:05:51.434860 4903 scope.go:117] "RemoveContainer" containerID="a2eb9d1f00bb5fb3df15f6bdd00bb3d137c62f92293372a1d01640c3bc2b0a85" Dec 03 00:05:51 crc kubenswrapper[4903]: I1203 00:05:51.451771 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-spvhv"] Dec 03 00:05:51 crc kubenswrapper[4903]: I1203 00:05:51.462349 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-spvhv"] Dec 03 00:05:51 crc kubenswrapper[4903]: I1203 00:05:51.625101 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa42696a-099a-4d82-8e9f-91e0b96824c7" path="/var/lib/kubelet/pods/aa42696a-099a-4d82-8e9f-91e0b96824c7/volumes" Dec 03 00:05:51 crc kubenswrapper[4903]: I1203 00:05:51.891748 4903 scope.go:117] "RemoveContainer" containerID="4e492abb1edb3aad45795792e34a92678a45f88d7644c8259eed646c73c7ae53" Dec 03 00:05:51 crc kubenswrapper[4903]: I1203 00:05:51.960783 4903 scope.go:117] "RemoveContainer" containerID="29dcfd2824b5b6d94aa1b6e80c7b6818f71faea73288741b0d24ec2527ccf733" Dec 03 00:05:51 crc kubenswrapper[4903]: E1203 00:05:51.961227 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29dcfd2824b5b6d94aa1b6e80c7b6818f71faea73288741b0d24ec2527ccf733\": container with ID starting with 29dcfd2824b5b6d94aa1b6e80c7b6818f71faea73288741b0d24ec2527ccf733 not found: ID does not exist" containerID="29dcfd2824b5b6d94aa1b6e80c7b6818f71faea73288741b0d24ec2527ccf733" Dec 03 00:05:51 crc kubenswrapper[4903]: I1203 00:05:51.961270 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29dcfd2824b5b6d94aa1b6e80c7b6818f71faea73288741b0d24ec2527ccf733"} err="failed to get container status \"29dcfd2824b5b6d94aa1b6e80c7b6818f71faea73288741b0d24ec2527ccf733\": rpc error: code = NotFound desc = could not find container \"29dcfd2824b5b6d94aa1b6e80c7b6818f71faea73288741b0d24ec2527ccf733\": container with ID starting with 29dcfd2824b5b6d94aa1b6e80c7b6818f71faea73288741b0d24ec2527ccf733 not found: ID does not exist" Dec 03 00:05:51 crc kubenswrapper[4903]: I1203 00:05:51.961302 4903 scope.go:117] "RemoveContainer" containerID="a2eb9d1f00bb5fb3df15f6bdd00bb3d137c62f92293372a1d01640c3bc2b0a85" Dec 03 00:05:51 crc kubenswrapper[4903]: E1203 00:05:51.961579 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2eb9d1f00bb5fb3df15f6bdd00bb3d137c62f92293372a1d01640c3bc2b0a85\": container with ID starting with a2eb9d1f00bb5fb3df15f6bdd00bb3d137c62f92293372a1d01640c3bc2b0a85 not found: ID does not exist" containerID="a2eb9d1f00bb5fb3df15f6bdd00bb3d137c62f92293372a1d01640c3bc2b0a85" Dec 03 00:05:51 crc kubenswrapper[4903]: I1203 00:05:51.961601 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2eb9d1f00bb5fb3df15f6bdd00bb3d137c62f92293372a1d01640c3bc2b0a85"} err="failed to get container status \"a2eb9d1f00bb5fb3df15f6bdd00bb3d137c62f92293372a1d01640c3bc2b0a85\": rpc error: code = NotFound desc = could not find container \"a2eb9d1f00bb5fb3df15f6bdd00bb3d137c62f92293372a1d01640c3bc2b0a85\": container with ID starting with a2eb9d1f00bb5fb3df15f6bdd00bb3d137c62f92293372a1d01640c3bc2b0a85 not found: ID does not exist" Dec 03 00:05:51 crc kubenswrapper[4903]: I1203 
00:05:51.961618 4903 scope.go:117] "RemoveContainer" containerID="4e492abb1edb3aad45795792e34a92678a45f88d7644c8259eed646c73c7ae53" Dec 03 00:05:51 crc kubenswrapper[4903]: E1203 00:05:51.962123 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e492abb1edb3aad45795792e34a92678a45f88d7644c8259eed646c73c7ae53\": container with ID starting with 4e492abb1edb3aad45795792e34a92678a45f88d7644c8259eed646c73c7ae53 not found: ID does not exist" containerID="4e492abb1edb3aad45795792e34a92678a45f88d7644c8259eed646c73c7ae53" Dec 03 00:05:51 crc kubenswrapper[4903]: I1203 00:05:51.962161 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e492abb1edb3aad45795792e34a92678a45f88d7644c8259eed646c73c7ae53"} err="failed to get container status \"4e492abb1edb3aad45795792e34a92678a45f88d7644c8259eed646c73c7ae53\": rpc error: code = NotFound desc = could not find container \"4e492abb1edb3aad45795792e34a92678a45f88d7644c8259eed646c73c7ae53\": container with ID starting with 4e492abb1edb3aad45795792e34a92678a45f88d7644c8259eed646c73c7ae53 not found: ID does not exist" Dec 03 00:05:53 crc kubenswrapper[4903]: I1203 00:05:53.070718 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 00:05:53 crc kubenswrapper[4903]: I1203 00:05:53.071147 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 00:05:53 crc kubenswrapper[4903]: I1203 00:05:53.071210 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" Dec 03 00:05:53 crc kubenswrapper[4903]: I1203 00:05:53.072434 4903 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"70ae78185bce033151741dd6b3782e470c6fa11de56eade3dfb49ff2c5f74c20"} pod="openshift-machine-config-operator/machine-config-daemon-snl4q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 00:05:53 crc kubenswrapper[4903]: I1203 00:05:53.072548 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" containerID="cri-o://70ae78185bce033151741dd6b3782e470c6fa11de56eade3dfb49ff2c5f74c20" gracePeriod=600 Dec 03 00:05:53 crc kubenswrapper[4903]: I1203 00:05:53.436413 4903 generic.go:334] "Generic (PLEG): container finished" podID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerID="70ae78185bce033151741dd6b3782e470c6fa11de56eade3dfb49ff2c5f74c20" exitCode=0 Dec 03 00:05:53 crc kubenswrapper[4903]: I1203 00:05:53.436568 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" 
event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerDied","Data":"70ae78185bce033151741dd6b3782e470c6fa11de56eade3dfb49ff2c5f74c20"} Dec 03 00:05:53 crc kubenswrapper[4903]: I1203 00:05:53.437091 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerStarted","Data":"eb8a49ce128936523d2bf8bc466a202d1982e1b912db39f549d042e19aa4e176"} Dec 03 00:05:53 crc kubenswrapper[4903]: I1203 00:05:53.437256 4903 scope.go:117] "RemoveContainer" containerID="9b0d8e1ad42f16d25d8ace883915db986228e905bf7242685a201d23108ad65b" Dec 03 00:07:53 crc kubenswrapper[4903]: I1203 00:07:53.070193 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 00:07:53 crc kubenswrapper[4903]: I1203 00:07:53.070840 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 00:08:23 crc kubenswrapper[4903]: I1203 00:08:23.069744 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 00:08:23 crc kubenswrapper[4903]: I1203 00:08:23.070333 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 00:08:53 crc kubenswrapper[4903]: I1203 00:08:53.070143 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 00:08:53 crc kubenswrapper[4903]: I1203 00:08:53.071718 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 00:08:53 crc kubenswrapper[4903]: I1203 00:08:53.071855 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" Dec 03 00:08:53 crc kubenswrapper[4903]: I1203 00:08:53.072832 4903 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"eb8a49ce128936523d2bf8bc466a202d1982e1b912db39f549d042e19aa4e176"} pod="openshift-machine-config-operator/machine-config-daemon-snl4q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 
00:08:53 crc kubenswrapper[4903]: I1203 00:08:53.072982 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" containerID="cri-o://eb8a49ce128936523d2bf8bc466a202d1982e1b912db39f549d042e19aa4e176" gracePeriod=600 Dec 03 00:08:53 crc kubenswrapper[4903]: E1203 00:08:53.219878 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:08:53 crc kubenswrapper[4903]: I1203 00:08:53.325396 4903 generic.go:334] "Generic (PLEG): container finished" podID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerID="eb8a49ce128936523d2bf8bc466a202d1982e1b912db39f549d042e19aa4e176" exitCode=0 Dec 03 00:08:53 crc kubenswrapper[4903]: I1203 00:08:53.325494 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerDied","Data":"eb8a49ce128936523d2bf8bc466a202d1982e1b912db39f549d042e19aa4e176"} Dec 03 00:08:53 crc kubenswrapper[4903]: I1203 00:08:53.326102 4903 scope.go:117] "RemoveContainer" containerID="70ae78185bce033151741dd6b3782e470c6fa11de56eade3dfb49ff2c5f74c20" Dec 03 00:08:53 crc kubenswrapper[4903]: I1203 00:08:53.326861 4903 scope.go:117] "RemoveContainer" containerID="eb8a49ce128936523d2bf8bc466a202d1982e1b912db39f549d042e19aa4e176" Dec 03 00:08:53 crc kubenswrapper[4903]: E1203 00:08:53.327173 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:09:07 crc kubenswrapper[4903]: I1203 00:09:07.612760 4903 scope.go:117] "RemoveContainer" containerID="eb8a49ce128936523d2bf8bc466a202d1982e1b912db39f549d042e19aa4e176" Dec 03 00:09:07 crc kubenswrapper[4903]: E1203 00:09:07.613460 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:09:22 crc kubenswrapper[4903]: I1203 00:09:22.613047 4903 scope.go:117] "RemoveContainer" containerID="eb8a49ce128936523d2bf8bc466a202d1982e1b912db39f549d042e19aa4e176" Dec 03 00:09:22 crc kubenswrapper[4903]: E1203 00:09:22.615234 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:09:37 crc kubenswrapper[4903]: I1203 00:09:37.613161 4903 scope.go:117] "RemoveContainer" containerID="eb8a49ce128936523d2bf8bc466a202d1982e1b912db39f549d042e19aa4e176" Dec 03 00:09:37 crc kubenswrapper[4903]: E1203 00:09:37.613936 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:09:50 crc kubenswrapper[4903]: I1203 00:09:50.612345 4903 scope.go:117] "RemoveContainer" containerID="eb8a49ce128936523d2bf8bc466a202d1982e1b912db39f549d042e19aa4e176" Dec 03 00:09:50 crc kubenswrapper[4903]: E1203 00:09:50.613184 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:10:05 crc kubenswrapper[4903]: I1203 00:10:05.612431 4903 scope.go:117] "RemoveContainer" containerID="eb8a49ce128936523d2bf8bc466a202d1982e1b912db39f549d042e19aa4e176" Dec 03 00:10:05 crc kubenswrapper[4903]: E1203 00:10:05.613338 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:10:19 crc kubenswrapper[4903]: I1203 00:10:19.613777 4903 scope.go:117] "RemoveContainer" containerID="eb8a49ce128936523d2bf8bc466a202d1982e1b912db39f549d042e19aa4e176" Dec 03 00:10:19 crc kubenswrapper[4903]: E1203 00:10:19.615064 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:10:32 crc kubenswrapper[4903]: I1203 00:10:32.613037 4903 scope.go:117] "RemoveContainer" containerID="eb8a49ce128936523d2bf8bc466a202d1982e1b912db39f549d042e19aa4e176" Dec 03 00:10:32 crc kubenswrapper[4903]: E1203 00:10:32.613857 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:10:44 crc kubenswrapper[4903]: I1203 00:10:44.612763 4903 
scope.go:117] "RemoveContainer" containerID="eb8a49ce128936523d2bf8bc466a202d1982e1b912db39f549d042e19aa4e176" Dec 03 00:10:44 crc kubenswrapper[4903]: E1203 00:10:44.614985 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:10:56 crc kubenswrapper[4903]: I1203 00:10:56.612976 4903 scope.go:117] "RemoveContainer" containerID="eb8a49ce128936523d2bf8bc466a202d1982e1b912db39f549d042e19aa4e176" Dec 03 00:10:56 crc kubenswrapper[4903]: E1203 00:10:56.613836 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:11:10 crc kubenswrapper[4903]: I1203 00:11:10.627967 4903 scope.go:117] "RemoveContainer" containerID="eb8a49ce128936523d2bf8bc466a202d1982e1b912db39f549d042e19aa4e176" Dec 03 00:11:10 crc kubenswrapper[4903]: E1203 00:11:10.628715 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:11:22 crc kubenswrapper[4903]: I1203 00:11:22.773494 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="a3fa7901-a49c-433f-942c-a875c9ecd2ab" containerName="galera" probeResult="failure" output="command timed out" Dec 03 00:11:22 crc kubenswrapper[4903]: I1203 00:11:22.774559 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="a3fa7901-a49c-433f-942c-a875c9ecd2ab" containerName="galera" probeResult="failure" output="command timed out" Dec 03 00:11:23 crc kubenswrapper[4903]: I1203 00:11:23.613228 4903 scope.go:117] "RemoveContainer" containerID="eb8a49ce128936523d2bf8bc466a202d1982e1b912db39f549d042e19aa4e176" Dec 03 00:11:23 crc kubenswrapper[4903]: E1203 00:11:23.614069 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:11:37 crc kubenswrapper[4903]: I1203 00:11:37.612805 4903 scope.go:117] "RemoveContainer" containerID="eb8a49ce128936523d2bf8bc466a202d1982e1b912db39f549d042e19aa4e176" Dec 03 00:11:37 crc kubenswrapper[4903]: E1203 00:11:37.613497 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:11:50 crc kubenswrapper[4903]: I1203 00:11:50.613456 4903 scope.go:117] "RemoveContainer" containerID="eb8a49ce128936523d2bf8bc466a202d1982e1b912db39f549d042e19aa4e176" Dec 03 00:11:50 crc kubenswrapper[4903]: E1203 00:11:50.614245 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:12:04 crc kubenswrapper[4903]: I1203 00:12:04.612027 4903 scope.go:117] "RemoveContainer" containerID="eb8a49ce128936523d2bf8bc466a202d1982e1b912db39f549d042e19aa4e176" Dec 03 00:12:04 crc kubenswrapper[4903]: E1203 00:12:04.612838 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:12:13 crc kubenswrapper[4903]: I1203 00:12:13.470804 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-t5z55"] Dec 03 00:12:13 crc kubenswrapper[4903]: E1203 00:12:13.472944 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ae49558-22e0-4b43-b76f-e328a03bd6b7" containerName="extract-utilities" Dec 03 00:12:13 crc kubenswrapper[4903]: I1203 00:12:13.473054 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ae49558-22e0-4b43-b76f-e328a03bd6b7" containerName="extract-utilities" Dec 03 00:12:13 crc kubenswrapper[4903]: E1203 00:12:13.473137 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa42696a-099a-4d82-8e9f-91e0b96824c7" containerName="registry-server" Dec 03 00:12:13 crc kubenswrapper[4903]: I1203 00:12:13.473211 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa42696a-099a-4d82-8e9f-91e0b96824c7" containerName="registry-server" Dec 03 00:12:13 crc kubenswrapper[4903]: E1203 00:12:13.473307 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af" containerName="extract-utilities" Dec 03 00:12:13 crc kubenswrapper[4903]: I1203 00:12:13.473377 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af" containerName="extract-utilities" Dec 03 00:12:13 crc kubenswrapper[4903]: E1203 00:12:13.473460 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ae49558-22e0-4b43-b76f-e328a03bd6b7" containerName="registry-server" Dec 03 00:12:13 crc kubenswrapper[4903]: I1203 00:12:13.473536 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ae49558-22e0-4b43-b76f-e328a03bd6b7" containerName="registry-server" Dec 03 00:12:13 crc kubenswrapper[4903]: E1203 00:12:13.473688 4903 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="aa42696a-099a-4d82-8e9f-91e0b96824c7" containerName="extract-utilities" Dec 03 00:12:13 crc kubenswrapper[4903]: I1203 00:12:13.473720 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa42696a-099a-4d82-8e9f-91e0b96824c7" containerName="extract-utilities" Dec 03 00:12:13 crc kubenswrapper[4903]: E1203 00:12:13.473734 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa42696a-099a-4d82-8e9f-91e0b96824c7" containerName="extract-content" Dec 03 00:12:13 crc kubenswrapper[4903]: I1203 00:12:13.473742 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa42696a-099a-4d82-8e9f-91e0b96824c7" containerName="extract-content" Dec 03 00:12:13 crc kubenswrapper[4903]: E1203 00:12:13.473767 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ae49558-22e0-4b43-b76f-e328a03bd6b7" containerName="extract-content" Dec 03 00:12:13 crc kubenswrapper[4903]: I1203 00:12:13.473774 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ae49558-22e0-4b43-b76f-e328a03bd6b7" containerName="extract-content" Dec 03 00:12:13 crc kubenswrapper[4903]: E1203 00:12:13.473785 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af" containerName="registry-server" Dec 03 00:12:13 crc kubenswrapper[4903]: I1203 00:12:13.473790 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af" containerName="registry-server" Dec 03 00:12:13 crc kubenswrapper[4903]: E1203 00:12:13.473824 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af" containerName="extract-content" Dec 03 00:12:13 crc kubenswrapper[4903]: I1203 00:12:13.473832 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af" containerName="extract-content" Dec 03 00:12:13 crc kubenswrapper[4903]: I1203 00:12:13.474175 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa42696a-099a-4d82-8e9f-91e0b96824c7" containerName="registry-server" Dec 03 00:12:13 crc kubenswrapper[4903]: I1203 00:12:13.474208 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b1bb8da-ffd4-4f85-b55a-d2853f9ee4af" containerName="registry-server" Dec 03 00:12:13 crc kubenswrapper[4903]: I1203 00:12:13.474219 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ae49558-22e0-4b43-b76f-e328a03bd6b7" containerName="registry-server" Dec 03 00:12:13 crc kubenswrapper[4903]: I1203 00:12:13.475725 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t5z55" Dec 03 00:12:13 crc kubenswrapper[4903]: I1203 00:12:13.480791 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t5z55"] Dec 03 00:12:13 crc kubenswrapper[4903]: I1203 00:12:13.619075 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36d281cd-cd71-4f69-8a08-77e06dccc5b5-catalog-content\") pod \"community-operators-t5z55\" (UID: \"36d281cd-cd71-4f69-8a08-77e06dccc5b5\") " pod="openshift-marketplace/community-operators-t5z55" Dec 03 00:12:13 crc kubenswrapper[4903]: I1203 00:12:13.619197 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knnj5\" (UniqueName: \"kubernetes.io/projected/36d281cd-cd71-4f69-8a08-77e06dccc5b5-kube-api-access-knnj5\") pod \"community-operators-t5z55\" (UID: \"36d281cd-cd71-4f69-8a08-77e06dccc5b5\") " pod="openshift-marketplace/community-operators-t5z55" Dec 03 00:12:13 crc kubenswrapper[4903]: I1203 00:12:13.619410 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36d281cd-cd71-4f69-8a08-77e06dccc5b5-utilities\") pod \"community-operators-t5z55\" (UID: \"36d281cd-cd71-4f69-8a08-77e06dccc5b5\") " pod="openshift-marketplace/community-operators-t5z55" Dec 03 00:12:13 crc kubenswrapper[4903]: I1203 00:12:13.720972 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36d281cd-cd71-4f69-8a08-77e06dccc5b5-utilities\") pod \"community-operators-t5z55\" (UID: \"36d281cd-cd71-4f69-8a08-77e06dccc5b5\") " pod="openshift-marketplace/community-operators-t5z55" Dec 03 00:12:13 crc kubenswrapper[4903]: I1203 00:12:13.721100 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36d281cd-cd71-4f69-8a08-77e06dccc5b5-catalog-content\") pod \"community-operators-t5z55\" (UID: \"36d281cd-cd71-4f69-8a08-77e06dccc5b5\") " pod="openshift-marketplace/community-operators-t5z55" Dec 03 00:12:13 crc kubenswrapper[4903]: I1203 00:12:13.721173 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knnj5\" (UniqueName: \"kubernetes.io/projected/36d281cd-cd71-4f69-8a08-77e06dccc5b5-kube-api-access-knnj5\") pod \"community-operators-t5z55\" (UID: \"36d281cd-cd71-4f69-8a08-77e06dccc5b5\") " pod="openshift-marketplace/community-operators-t5z55" Dec 03 00:12:13 crc kubenswrapper[4903]: I1203 00:12:13.721522 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36d281cd-cd71-4f69-8a08-77e06dccc5b5-utilities\") pod \"community-operators-t5z55\" (UID: \"36d281cd-cd71-4f69-8a08-77e06dccc5b5\") " pod="openshift-marketplace/community-operators-t5z55" Dec 03 00:12:13 crc kubenswrapper[4903]: I1203 00:12:13.721570 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36d281cd-cd71-4f69-8a08-77e06dccc5b5-catalog-content\") pod \"community-operators-t5z55\" (UID: \"36d281cd-cd71-4f69-8a08-77e06dccc5b5\") " pod="openshift-marketplace/community-operators-t5z55" Dec 03 00:12:13 crc kubenswrapper[4903]: I1203 00:12:13.742622 4903 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-knnj5\" (UniqueName: \"kubernetes.io/projected/36d281cd-cd71-4f69-8a08-77e06dccc5b5-kube-api-access-knnj5\") pod \"community-operators-t5z55\" (UID: \"36d281cd-cd71-4f69-8a08-77e06dccc5b5\") " pod="openshift-marketplace/community-operators-t5z55" Dec 03 00:12:13 crc kubenswrapper[4903]: I1203 00:12:13.808113 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t5z55" Dec 03 00:12:14 crc kubenswrapper[4903]: I1203 00:12:14.307430 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t5z55"] Dec 03 00:12:14 crc kubenswrapper[4903]: I1203 00:12:14.315831 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t5z55" event={"ID":"36d281cd-cd71-4f69-8a08-77e06dccc5b5","Type":"ContainerStarted","Data":"77c9d82c258281cd4e4eb42c40a06f4c2f7579b99f0b8b7e990e25019a019a64"} Dec 03 00:12:15 crc kubenswrapper[4903]: I1203 00:12:15.324165 4903 generic.go:334] "Generic (PLEG): container finished" podID="36d281cd-cd71-4f69-8a08-77e06dccc5b5" containerID="289dc634d8c675aafb2c3f668aa6f5e996fe1aab3735cd4d647b775878cdfee5" exitCode=0 Dec 03 00:12:15 crc kubenswrapper[4903]: I1203 00:12:15.324258 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t5z55" event={"ID":"36d281cd-cd71-4f69-8a08-77e06dccc5b5","Type":"ContainerDied","Data":"289dc634d8c675aafb2c3f668aa6f5e996fe1aab3735cd4d647b775878cdfee5"} Dec 03 00:12:15 crc kubenswrapper[4903]: I1203 00:12:15.325885 4903 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 00:12:15 crc kubenswrapper[4903]: I1203 00:12:15.612985 4903 scope.go:117] "RemoveContainer" containerID="eb8a49ce128936523d2bf8bc466a202d1982e1b912db39f549d042e19aa4e176" Dec 03 00:12:15 crc kubenswrapper[4903]: E1203 00:12:15.613244 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:12:16 crc kubenswrapper[4903]: I1203 00:12:16.335452 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t5z55" event={"ID":"36d281cd-cd71-4f69-8a08-77e06dccc5b5","Type":"ContainerStarted","Data":"80b40f4dc77b1671130c4d82173a6203e99bef37a503eeac963035fefc399141"} Dec 03 00:12:17 crc kubenswrapper[4903]: I1203 00:12:17.347425 4903 generic.go:334] "Generic (PLEG): container finished" podID="36d281cd-cd71-4f69-8a08-77e06dccc5b5" containerID="80b40f4dc77b1671130c4d82173a6203e99bef37a503eeac963035fefc399141" exitCode=0 Dec 03 00:12:17 crc kubenswrapper[4903]: I1203 00:12:17.347626 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t5z55" event={"ID":"36d281cd-cd71-4f69-8a08-77e06dccc5b5","Type":"ContainerDied","Data":"80b40f4dc77b1671130c4d82173a6203e99bef37a503eeac963035fefc399141"} Dec 03 00:12:18 crc kubenswrapper[4903]: I1203 00:12:18.359095 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t5z55" 
event={"ID":"36d281cd-cd71-4f69-8a08-77e06dccc5b5","Type":"ContainerStarted","Data":"dcdf66dcf095413be0531aec17eb9976914e7cb87aebfb4e27ec2e18f7b33f5d"} Dec 03 00:12:18 crc kubenswrapper[4903]: I1203 00:12:18.388614 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-t5z55" podStartSLOduration=2.782569513 podStartE2EDuration="5.388595481s" podCreationTimestamp="2025-12-03 00:12:13 +0000 UTC" firstStartedPulling="2025-12-03 00:12:15.325694706 +0000 UTC m=+4474.034248979" lastFinishedPulling="2025-12-03 00:12:17.931720664 +0000 UTC m=+4476.640274947" observedRunningTime="2025-12-03 00:12:18.38068414 +0000 UTC m=+4477.089238443" watchObservedRunningTime="2025-12-03 00:12:18.388595481 +0000 UTC m=+4477.097149764" Dec 03 00:12:23 crc kubenswrapper[4903]: I1203 00:12:23.809040 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-t5z55" Dec 03 00:12:23 crc kubenswrapper[4903]: I1203 00:12:23.809580 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-t5z55" Dec 03 00:12:23 crc kubenswrapper[4903]: I1203 00:12:23.857448 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-t5z55" Dec 03 00:12:24 crc kubenswrapper[4903]: I1203 00:12:24.491418 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-t5z55" Dec 03 00:12:24 crc kubenswrapper[4903]: I1203 00:12:24.540472 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t5z55"] Dec 03 00:12:26 crc kubenswrapper[4903]: I1203 00:12:26.455339 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-t5z55" podUID="36d281cd-cd71-4f69-8a08-77e06dccc5b5" containerName="registry-server" containerID="cri-o://dcdf66dcf095413be0531aec17eb9976914e7cb87aebfb4e27ec2e18f7b33f5d" gracePeriod=2 Dec 03 00:12:27 crc kubenswrapper[4903]: I1203 00:12:27.455924 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t5z55" Dec 03 00:12:27 crc kubenswrapper[4903]: I1203 00:12:27.464197 4903 generic.go:334] "Generic (PLEG): container finished" podID="36d281cd-cd71-4f69-8a08-77e06dccc5b5" containerID="dcdf66dcf095413be0531aec17eb9976914e7cb87aebfb4e27ec2e18f7b33f5d" exitCode=0 Dec 03 00:12:27 crc kubenswrapper[4903]: I1203 00:12:27.464251 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t5z55" Dec 03 00:12:27 crc kubenswrapper[4903]: I1203 00:12:27.464268 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t5z55" event={"ID":"36d281cd-cd71-4f69-8a08-77e06dccc5b5","Type":"ContainerDied","Data":"dcdf66dcf095413be0531aec17eb9976914e7cb87aebfb4e27ec2e18f7b33f5d"} Dec 03 00:12:27 crc kubenswrapper[4903]: I1203 00:12:27.464669 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t5z55" event={"ID":"36d281cd-cd71-4f69-8a08-77e06dccc5b5","Type":"ContainerDied","Data":"77c9d82c258281cd4e4eb42c40a06f4c2f7579b99f0b8b7e990e25019a019a64"} Dec 03 00:12:27 crc kubenswrapper[4903]: I1203 00:12:27.464704 4903 scope.go:117] "RemoveContainer" containerID="dcdf66dcf095413be0531aec17eb9976914e7cb87aebfb4e27ec2e18f7b33f5d" Dec 03 00:12:27 crc kubenswrapper[4903]: I1203 00:12:27.492945 4903 scope.go:117] "RemoveContainer" containerID="80b40f4dc77b1671130c4d82173a6203e99bef37a503eeac963035fefc399141" Dec 03 00:12:27 crc kubenswrapper[4903]: I1203 00:12:27.519085 4903 scope.go:117] "RemoveContainer" containerID="289dc634d8c675aafb2c3f668aa6f5e996fe1aab3735cd4d647b775878cdfee5" Dec 03 00:12:27 crc kubenswrapper[4903]: I1203 00:12:27.568563 4903 scope.go:117] "RemoveContainer" containerID="dcdf66dcf095413be0531aec17eb9976914e7cb87aebfb4e27ec2e18f7b33f5d" Dec 03 00:12:27 crc kubenswrapper[4903]: E1203 00:12:27.569038 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcdf66dcf095413be0531aec17eb9976914e7cb87aebfb4e27ec2e18f7b33f5d\": container with ID starting with dcdf66dcf095413be0531aec17eb9976914e7cb87aebfb4e27ec2e18f7b33f5d not found: ID does not exist" containerID="dcdf66dcf095413be0531aec17eb9976914e7cb87aebfb4e27ec2e18f7b33f5d" Dec 03 00:12:27 crc kubenswrapper[4903]: I1203 00:12:27.569119 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcdf66dcf095413be0531aec17eb9976914e7cb87aebfb4e27ec2e18f7b33f5d"} err="failed to get container status \"dcdf66dcf095413be0531aec17eb9976914e7cb87aebfb4e27ec2e18f7b33f5d\": rpc error: code = NotFound desc = could not find container \"dcdf66dcf095413be0531aec17eb9976914e7cb87aebfb4e27ec2e18f7b33f5d\": container with ID starting with dcdf66dcf095413be0531aec17eb9976914e7cb87aebfb4e27ec2e18f7b33f5d not found: ID does not exist" Dec 03 00:12:27 crc kubenswrapper[4903]: I1203 00:12:27.569153 4903 scope.go:117] "RemoveContainer" containerID="80b40f4dc77b1671130c4d82173a6203e99bef37a503eeac963035fefc399141" Dec 03 00:12:27 crc kubenswrapper[4903]: E1203 00:12:27.569644 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80b40f4dc77b1671130c4d82173a6203e99bef37a503eeac963035fefc399141\": container with ID starting with 80b40f4dc77b1671130c4d82173a6203e99bef37a503eeac963035fefc399141 not found: ID does not exist" containerID="80b40f4dc77b1671130c4d82173a6203e99bef37a503eeac963035fefc399141" Dec 03 00:12:27 crc kubenswrapper[4903]: I1203 00:12:27.569689 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80b40f4dc77b1671130c4d82173a6203e99bef37a503eeac963035fefc399141"} err="failed to get container status \"80b40f4dc77b1671130c4d82173a6203e99bef37a503eeac963035fefc399141\": rpc error: code = NotFound desc = could not find container 
\"80b40f4dc77b1671130c4d82173a6203e99bef37a503eeac963035fefc399141\": container with ID starting with 80b40f4dc77b1671130c4d82173a6203e99bef37a503eeac963035fefc399141 not found: ID does not exist" Dec 03 00:12:27 crc kubenswrapper[4903]: I1203 00:12:27.569707 4903 scope.go:117] "RemoveContainer" containerID="289dc634d8c675aafb2c3f668aa6f5e996fe1aab3735cd4d647b775878cdfee5" Dec 03 00:12:27 crc kubenswrapper[4903]: E1203 00:12:27.571114 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"289dc634d8c675aafb2c3f668aa6f5e996fe1aab3735cd4d647b775878cdfee5\": container with ID starting with 289dc634d8c675aafb2c3f668aa6f5e996fe1aab3735cd4d647b775878cdfee5 not found: ID does not exist" containerID="289dc634d8c675aafb2c3f668aa6f5e996fe1aab3735cd4d647b775878cdfee5" Dec 03 00:12:27 crc kubenswrapper[4903]: I1203 00:12:27.571170 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"289dc634d8c675aafb2c3f668aa6f5e996fe1aab3735cd4d647b775878cdfee5"} err="failed to get container status \"289dc634d8c675aafb2c3f668aa6f5e996fe1aab3735cd4d647b775878cdfee5\": rpc error: code = NotFound desc = could not find container \"289dc634d8c675aafb2c3f668aa6f5e996fe1aab3735cd4d647b775878cdfee5\": container with ID starting with 289dc634d8c675aafb2c3f668aa6f5e996fe1aab3735cd4d647b775878cdfee5 not found: ID does not exist" Dec 03 00:12:27 crc kubenswrapper[4903]: I1203 00:12:27.608531 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36d281cd-cd71-4f69-8a08-77e06dccc5b5-catalog-content\") pod \"36d281cd-cd71-4f69-8a08-77e06dccc5b5\" (UID: \"36d281cd-cd71-4f69-8a08-77e06dccc5b5\") " Dec 03 00:12:27 crc kubenswrapper[4903]: I1203 00:12:27.608609 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knnj5\" (UniqueName: \"kubernetes.io/projected/36d281cd-cd71-4f69-8a08-77e06dccc5b5-kube-api-access-knnj5\") pod \"36d281cd-cd71-4f69-8a08-77e06dccc5b5\" (UID: \"36d281cd-cd71-4f69-8a08-77e06dccc5b5\") " Dec 03 00:12:27 crc kubenswrapper[4903]: I1203 00:12:27.608794 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36d281cd-cd71-4f69-8a08-77e06dccc5b5-utilities\") pod \"36d281cd-cd71-4f69-8a08-77e06dccc5b5\" (UID: \"36d281cd-cd71-4f69-8a08-77e06dccc5b5\") " Dec 03 00:12:27 crc kubenswrapper[4903]: I1203 00:12:27.611123 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36d281cd-cd71-4f69-8a08-77e06dccc5b5-utilities" (OuterVolumeSpecName: "utilities") pod "36d281cd-cd71-4f69-8a08-77e06dccc5b5" (UID: "36d281cd-cd71-4f69-8a08-77e06dccc5b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:12:27 crc kubenswrapper[4903]: I1203 00:12:27.614205 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36d281cd-cd71-4f69-8a08-77e06dccc5b5-kube-api-access-knnj5" (OuterVolumeSpecName: "kube-api-access-knnj5") pod "36d281cd-cd71-4f69-8a08-77e06dccc5b5" (UID: "36d281cd-cd71-4f69-8a08-77e06dccc5b5"). InnerVolumeSpecName "kube-api-access-knnj5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:12:27 crc kubenswrapper[4903]: I1203 00:12:27.660260 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36d281cd-cd71-4f69-8a08-77e06dccc5b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36d281cd-cd71-4f69-8a08-77e06dccc5b5" (UID: "36d281cd-cd71-4f69-8a08-77e06dccc5b5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:12:27 crc kubenswrapper[4903]: I1203 00:12:27.712018 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36d281cd-cd71-4f69-8a08-77e06dccc5b5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 00:12:27 crc kubenswrapper[4903]: I1203 00:12:27.712165 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knnj5\" (UniqueName: \"kubernetes.io/projected/36d281cd-cd71-4f69-8a08-77e06dccc5b5-kube-api-access-knnj5\") on node \"crc\" DevicePath \"\"" Dec 03 00:12:27 crc kubenswrapper[4903]: I1203 00:12:27.712181 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36d281cd-cd71-4f69-8a08-77e06dccc5b5-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 00:12:28 crc kubenswrapper[4903]: I1203 00:12:28.477293 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t5z55"] Dec 03 00:12:28 crc kubenswrapper[4903]: I1203 00:12:28.485486 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-t5z55"] Dec 03 00:12:28 crc kubenswrapper[4903]: I1203 00:12:28.612452 4903 scope.go:117] "RemoveContainer" containerID="eb8a49ce128936523d2bf8bc466a202d1982e1b912db39f549d042e19aa4e176" Dec 03 00:12:28 crc kubenswrapper[4903]: E1203 00:12:28.612966 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:12:29 crc kubenswrapper[4903]: I1203 00:12:29.628146 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36d281cd-cd71-4f69-8a08-77e06dccc5b5" path="/var/lib/kubelet/pods/36d281cd-cd71-4f69-8a08-77e06dccc5b5/volumes" Dec 03 00:12:42 crc kubenswrapper[4903]: I1203 00:12:42.612404 4903 scope.go:117] "RemoveContainer" containerID="eb8a49ce128936523d2bf8bc466a202d1982e1b912db39f549d042e19aa4e176" Dec 03 00:12:42 crc kubenswrapper[4903]: E1203 00:12:42.613379 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:12:53 crc kubenswrapper[4903]: I1203 00:12:53.612330 4903 scope.go:117] "RemoveContainer" containerID="eb8a49ce128936523d2bf8bc466a202d1982e1b912db39f549d042e19aa4e176" Dec 03 00:12:53 crc kubenswrapper[4903]: E1203 00:12:53.613120 4903 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:13:07 crc kubenswrapper[4903]: I1203 00:13:07.613259 4903 scope.go:117] "RemoveContainer" containerID="eb8a49ce128936523d2bf8bc466a202d1982e1b912db39f549d042e19aa4e176" Dec 03 00:13:07 crc kubenswrapper[4903]: E1203 00:13:07.613961 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:13:22 crc kubenswrapper[4903]: I1203 00:13:22.613297 4903 scope.go:117] "RemoveContainer" containerID="eb8a49ce128936523d2bf8bc466a202d1982e1b912db39f549d042e19aa4e176" Dec 03 00:13:22 crc kubenswrapper[4903]: E1203 00:13:22.614417 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:13:34 crc kubenswrapper[4903]: I1203 00:13:34.612496 4903 scope.go:117] "RemoveContainer" containerID="eb8a49ce128936523d2bf8bc466a202d1982e1b912db39f549d042e19aa4e176" Dec 03 00:13:34 crc kubenswrapper[4903]: E1203 00:13:34.613191 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:13:45 crc kubenswrapper[4903]: I1203 00:13:45.612394 4903 scope.go:117] "RemoveContainer" containerID="eb8a49ce128936523d2bf8bc466a202d1982e1b912db39f549d042e19aa4e176" Dec 03 00:13:45 crc kubenswrapper[4903]: E1203 00:13:45.613221 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:14:00 crc kubenswrapper[4903]: I1203 00:14:00.612467 4903 scope.go:117] "RemoveContainer" containerID="eb8a49ce128936523d2bf8bc466a202d1982e1b912db39f549d042e19aa4e176" Dec 03 00:14:01 crc kubenswrapper[4903]: I1203 00:14:01.505163 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" 
event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerStarted","Data":"3465c99a15cf5800e2351b26d635260db68b3b6687c4024f094da729f8cd3bac"} Dec 03 00:15:00 crc kubenswrapper[4903]: I1203 00:15:00.173585 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412015-gr9cz"] Dec 03 00:15:00 crc kubenswrapper[4903]: E1203 00:15:00.174847 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36d281cd-cd71-4f69-8a08-77e06dccc5b5" containerName="registry-server" Dec 03 00:15:00 crc kubenswrapper[4903]: I1203 00:15:00.174867 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d281cd-cd71-4f69-8a08-77e06dccc5b5" containerName="registry-server" Dec 03 00:15:00 crc kubenswrapper[4903]: E1203 00:15:00.174888 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36d281cd-cd71-4f69-8a08-77e06dccc5b5" containerName="extract-utilities" Dec 03 00:15:00 crc kubenswrapper[4903]: I1203 00:15:00.174896 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d281cd-cd71-4f69-8a08-77e06dccc5b5" containerName="extract-utilities" Dec 03 00:15:00 crc kubenswrapper[4903]: E1203 00:15:00.174929 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36d281cd-cd71-4f69-8a08-77e06dccc5b5" containerName="extract-content" Dec 03 00:15:00 crc kubenswrapper[4903]: I1203 00:15:00.174937 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d281cd-cd71-4f69-8a08-77e06dccc5b5" containerName="extract-content" Dec 03 00:15:00 crc kubenswrapper[4903]: I1203 00:15:00.175231 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="36d281cd-cd71-4f69-8a08-77e06dccc5b5" containerName="registry-server" Dec 03 00:15:00 crc kubenswrapper[4903]: I1203 00:15:00.176112 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412015-gr9cz" Dec 03 00:15:00 crc kubenswrapper[4903]: I1203 00:15:00.178722 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 00:15:00 crc kubenswrapper[4903]: I1203 00:15:00.179331 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 00:15:00 crc kubenswrapper[4903]: I1203 00:15:00.193411 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412015-gr9cz"] Dec 03 00:15:00 crc kubenswrapper[4903]: I1203 00:15:00.284397 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dcd58fc4-0984-4efc-b904-8d7318c0f662-config-volume\") pod \"collect-profiles-29412015-gr9cz\" (UID: \"dcd58fc4-0984-4efc-b904-8d7318c0f662\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412015-gr9cz" Dec 03 00:15:00 crc kubenswrapper[4903]: I1203 00:15:00.284486 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5htx9\" (UniqueName: \"kubernetes.io/projected/dcd58fc4-0984-4efc-b904-8d7318c0f662-kube-api-access-5htx9\") pod \"collect-profiles-29412015-gr9cz\" (UID: \"dcd58fc4-0984-4efc-b904-8d7318c0f662\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412015-gr9cz" Dec 03 00:15:00 crc kubenswrapper[4903]: I1203 00:15:00.284642 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dcd58fc4-0984-4efc-b904-8d7318c0f662-secret-volume\") pod \"collect-profiles-29412015-gr9cz\" (UID: \"dcd58fc4-0984-4efc-b904-8d7318c0f662\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412015-gr9cz" Dec 03 00:15:00 crc kubenswrapper[4903]: I1203 00:15:00.386851 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dcd58fc4-0984-4efc-b904-8d7318c0f662-config-volume\") pod \"collect-profiles-29412015-gr9cz\" (UID: \"dcd58fc4-0984-4efc-b904-8d7318c0f662\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412015-gr9cz" Dec 03 00:15:00 crc kubenswrapper[4903]: I1203 00:15:00.386936 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5htx9\" (UniqueName: \"kubernetes.io/projected/dcd58fc4-0984-4efc-b904-8d7318c0f662-kube-api-access-5htx9\") pod \"collect-profiles-29412015-gr9cz\" (UID: \"dcd58fc4-0984-4efc-b904-8d7318c0f662\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412015-gr9cz" Dec 03 00:15:00 crc kubenswrapper[4903]: I1203 00:15:00.387122 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dcd58fc4-0984-4efc-b904-8d7318c0f662-secret-volume\") pod \"collect-profiles-29412015-gr9cz\" (UID: \"dcd58fc4-0984-4efc-b904-8d7318c0f662\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412015-gr9cz" Dec 03 00:15:00 crc kubenswrapper[4903]: I1203 00:15:00.388171 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dcd58fc4-0984-4efc-b904-8d7318c0f662-config-volume\") pod 
\"collect-profiles-29412015-gr9cz\" (UID: \"dcd58fc4-0984-4efc-b904-8d7318c0f662\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412015-gr9cz" Dec 03 00:15:00 crc kubenswrapper[4903]: I1203 00:15:00.869445 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dcd58fc4-0984-4efc-b904-8d7318c0f662-secret-volume\") pod \"collect-profiles-29412015-gr9cz\" (UID: \"dcd58fc4-0984-4efc-b904-8d7318c0f662\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412015-gr9cz" Dec 03 00:15:00 crc kubenswrapper[4903]: I1203 00:15:00.873378 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5htx9\" (UniqueName: \"kubernetes.io/projected/dcd58fc4-0984-4efc-b904-8d7318c0f662-kube-api-access-5htx9\") pod \"collect-profiles-29412015-gr9cz\" (UID: \"dcd58fc4-0984-4efc-b904-8d7318c0f662\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412015-gr9cz" Dec 03 00:15:01 crc kubenswrapper[4903]: I1203 00:15:01.103235 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412015-gr9cz" Dec 03 00:15:01 crc kubenswrapper[4903]: I1203 00:15:01.586197 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412015-gr9cz"] Dec 03 00:15:02 crc kubenswrapper[4903]: I1203 00:15:02.161171 4903 generic.go:334] "Generic (PLEG): container finished" podID="dcd58fc4-0984-4efc-b904-8d7318c0f662" containerID="7d748d4b86463929fcb30f1bfbfb9fabf8365988620fbbc592b0cc4de0666836" exitCode=0 Dec 03 00:15:02 crc kubenswrapper[4903]: I1203 00:15:02.161230 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412015-gr9cz" event={"ID":"dcd58fc4-0984-4efc-b904-8d7318c0f662","Type":"ContainerDied","Data":"7d748d4b86463929fcb30f1bfbfb9fabf8365988620fbbc592b0cc4de0666836"} Dec 03 00:15:02 crc kubenswrapper[4903]: I1203 00:15:02.161810 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412015-gr9cz" event={"ID":"dcd58fc4-0984-4efc-b904-8d7318c0f662","Type":"ContainerStarted","Data":"e873cd362dc325e003dde18c2c0e5f85f5d57f50eb3e4cebfe9ef90b38e059a0"} Dec 03 00:15:03 crc kubenswrapper[4903]: I1203 00:15:03.512941 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412015-gr9cz" Dec 03 00:15:03 crc kubenswrapper[4903]: I1203 00:15:03.650985 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dcd58fc4-0984-4efc-b904-8d7318c0f662-secret-volume\") pod \"dcd58fc4-0984-4efc-b904-8d7318c0f662\" (UID: \"dcd58fc4-0984-4efc-b904-8d7318c0f662\") " Dec 03 00:15:03 crc kubenswrapper[4903]: I1203 00:15:03.651313 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5htx9\" (UniqueName: \"kubernetes.io/projected/dcd58fc4-0984-4efc-b904-8d7318c0f662-kube-api-access-5htx9\") pod \"dcd58fc4-0984-4efc-b904-8d7318c0f662\" (UID: \"dcd58fc4-0984-4efc-b904-8d7318c0f662\") " Dec 03 00:15:03 crc kubenswrapper[4903]: I1203 00:15:03.651407 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dcd58fc4-0984-4efc-b904-8d7318c0f662-config-volume\") pod \"dcd58fc4-0984-4efc-b904-8d7318c0f662\" (UID: \"dcd58fc4-0984-4efc-b904-8d7318c0f662\") " Dec 03 00:15:03 crc kubenswrapper[4903]: I1203 00:15:03.652032 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcd58fc4-0984-4efc-b904-8d7318c0f662-config-volume" (OuterVolumeSpecName: "config-volume") pod "dcd58fc4-0984-4efc-b904-8d7318c0f662" (UID: "dcd58fc4-0984-4efc-b904-8d7318c0f662"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 00:15:03 crc kubenswrapper[4903]: I1203 00:15:03.652147 4903 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dcd58fc4-0984-4efc-b904-8d7318c0f662-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 00:15:03 crc kubenswrapper[4903]: I1203 00:15:03.658916 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcd58fc4-0984-4efc-b904-8d7318c0f662-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "dcd58fc4-0984-4efc-b904-8d7318c0f662" (UID: "dcd58fc4-0984-4efc-b904-8d7318c0f662"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:15:03 crc kubenswrapper[4903]: I1203 00:15:03.660263 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcd58fc4-0984-4efc-b904-8d7318c0f662-kube-api-access-5htx9" (OuterVolumeSpecName: "kube-api-access-5htx9") pod "dcd58fc4-0984-4efc-b904-8d7318c0f662" (UID: "dcd58fc4-0984-4efc-b904-8d7318c0f662"). InnerVolumeSpecName "kube-api-access-5htx9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:15:03 crc kubenswrapper[4903]: I1203 00:15:03.754607 4903 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dcd58fc4-0984-4efc-b904-8d7318c0f662-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 00:15:03 crc kubenswrapper[4903]: I1203 00:15:03.754639 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5htx9\" (UniqueName: \"kubernetes.io/projected/dcd58fc4-0984-4efc-b904-8d7318c0f662-kube-api-access-5htx9\") on node \"crc\" DevicePath \"\"" Dec 03 00:15:04 crc kubenswrapper[4903]: I1203 00:15:04.183724 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412015-gr9cz" event={"ID":"dcd58fc4-0984-4efc-b904-8d7318c0f662","Type":"ContainerDied","Data":"e873cd362dc325e003dde18c2c0e5f85f5d57f50eb3e4cebfe9ef90b38e059a0"} Dec 03 00:15:04 crc kubenswrapper[4903]: I1203 00:15:04.183782 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e873cd362dc325e003dde18c2c0e5f85f5d57f50eb3e4cebfe9ef90b38e059a0" Dec 03 00:15:04 crc kubenswrapper[4903]: I1203 00:15:04.183846 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412015-gr9cz" Dec 03 00:15:04 crc kubenswrapper[4903]: I1203 00:15:04.593410 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411970-crlbb"] Dec 03 00:15:04 crc kubenswrapper[4903]: I1203 00:15:04.603991 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411970-crlbb"] Dec 03 00:15:05 crc kubenswrapper[4903]: I1203 00:15:05.627603 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb6ab4e-3770-47d1-8796-6a58ea453293" path="/var/lib/kubelet/pods/3cb6ab4e-3770-47d1-8796-6a58ea453293/volumes" Dec 03 00:15:05 crc kubenswrapper[4903]: I1203 00:15:05.825004 4903 scope.go:117] "RemoveContainer" containerID="631ec00f5442f7c8ac19f024c1902e27f55e9ddc055a4c772966fd86d18c3f1f" Dec 03 00:15:52 crc kubenswrapper[4903]: I1203 00:15:52.093890 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ls5hb"] Dec 03 00:15:52 crc kubenswrapper[4903]: E1203 00:15:52.095133 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcd58fc4-0984-4efc-b904-8d7318c0f662" containerName="collect-profiles" Dec 03 00:15:52 crc kubenswrapper[4903]: I1203 00:15:52.095154 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcd58fc4-0984-4efc-b904-8d7318c0f662" containerName="collect-profiles" Dec 03 00:15:52 crc kubenswrapper[4903]: I1203 00:15:52.095411 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcd58fc4-0984-4efc-b904-8d7318c0f662" containerName="collect-profiles" Dec 03 00:15:52 crc kubenswrapper[4903]: I1203 00:15:52.097794 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ls5hb" Dec 03 00:15:52 crc kubenswrapper[4903]: I1203 00:15:52.107791 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ls5hb"] Dec 03 00:15:52 crc kubenswrapper[4903]: I1203 00:15:52.195986 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpmrj\" (UniqueName: \"kubernetes.io/projected/85ae109a-7ec8-429f-9114-8acaf129c82d-kube-api-access-wpmrj\") pod \"redhat-marketplace-ls5hb\" (UID: \"85ae109a-7ec8-429f-9114-8acaf129c82d\") " pod="openshift-marketplace/redhat-marketplace-ls5hb" Dec 03 00:15:52 crc kubenswrapper[4903]: I1203 00:15:52.196160 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85ae109a-7ec8-429f-9114-8acaf129c82d-utilities\") pod \"redhat-marketplace-ls5hb\" (UID: \"85ae109a-7ec8-429f-9114-8acaf129c82d\") " pod="openshift-marketplace/redhat-marketplace-ls5hb" Dec 03 00:15:52 crc kubenswrapper[4903]: I1203 00:15:52.196286 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85ae109a-7ec8-429f-9114-8acaf129c82d-catalog-content\") pod \"redhat-marketplace-ls5hb\" (UID: \"85ae109a-7ec8-429f-9114-8acaf129c82d\") " pod="openshift-marketplace/redhat-marketplace-ls5hb" Dec 03 00:15:52 crc kubenswrapper[4903]: I1203 00:15:52.299046 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpmrj\" (UniqueName: \"kubernetes.io/projected/85ae109a-7ec8-429f-9114-8acaf129c82d-kube-api-access-wpmrj\") pod \"redhat-marketplace-ls5hb\" (UID: \"85ae109a-7ec8-429f-9114-8acaf129c82d\") " pod="openshift-marketplace/redhat-marketplace-ls5hb" Dec 03 00:15:52 crc kubenswrapper[4903]: I1203 00:15:52.299338 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85ae109a-7ec8-429f-9114-8acaf129c82d-utilities\") pod \"redhat-marketplace-ls5hb\" (UID: \"85ae109a-7ec8-429f-9114-8acaf129c82d\") " pod="openshift-marketplace/redhat-marketplace-ls5hb" Dec 03 00:15:52 crc kubenswrapper[4903]: I1203 00:15:52.299507 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85ae109a-7ec8-429f-9114-8acaf129c82d-catalog-content\") pod \"redhat-marketplace-ls5hb\" (UID: \"85ae109a-7ec8-429f-9114-8acaf129c82d\") " pod="openshift-marketplace/redhat-marketplace-ls5hb" Dec 03 00:15:52 crc kubenswrapper[4903]: I1203 00:15:52.299951 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85ae109a-7ec8-429f-9114-8acaf129c82d-utilities\") pod \"redhat-marketplace-ls5hb\" (UID: \"85ae109a-7ec8-429f-9114-8acaf129c82d\") " pod="openshift-marketplace/redhat-marketplace-ls5hb" Dec 03 00:15:52 crc kubenswrapper[4903]: I1203 00:15:52.300126 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85ae109a-7ec8-429f-9114-8acaf129c82d-catalog-content\") pod \"redhat-marketplace-ls5hb\" (UID: \"85ae109a-7ec8-429f-9114-8acaf129c82d\") " pod="openshift-marketplace/redhat-marketplace-ls5hb" Dec 03 00:15:52 crc kubenswrapper[4903]: I1203 00:15:52.319851 4903 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-wpmrj\" (UniqueName: \"kubernetes.io/projected/85ae109a-7ec8-429f-9114-8acaf129c82d-kube-api-access-wpmrj\") pod \"redhat-marketplace-ls5hb\" (UID: \"85ae109a-7ec8-429f-9114-8acaf129c82d\") " pod="openshift-marketplace/redhat-marketplace-ls5hb" Dec 03 00:15:52 crc kubenswrapper[4903]: I1203 00:15:52.433273 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ls5hb" Dec 03 00:15:52 crc kubenswrapper[4903]: I1203 00:15:52.932971 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ls5hb"] Dec 03 00:15:53 crc kubenswrapper[4903]: I1203 00:15:53.724215 4903 generic.go:334] "Generic (PLEG): container finished" podID="85ae109a-7ec8-429f-9114-8acaf129c82d" containerID="c0e49615f000b86f2a4af4c49b77dee3a433507ad01a37f31a5bd0a3e3145fc0" exitCode=0 Dec 03 00:15:53 crc kubenswrapper[4903]: I1203 00:15:53.724302 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ls5hb" event={"ID":"85ae109a-7ec8-429f-9114-8acaf129c82d","Type":"ContainerDied","Data":"c0e49615f000b86f2a4af4c49b77dee3a433507ad01a37f31a5bd0a3e3145fc0"} Dec 03 00:15:53 crc kubenswrapper[4903]: I1203 00:15:53.724482 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ls5hb" event={"ID":"85ae109a-7ec8-429f-9114-8acaf129c82d","Type":"ContainerStarted","Data":"e3c965cfa59b16608373de9d91b170b6e1111f68e9d80c5906ebc7748961c551"} Dec 03 00:15:54 crc kubenswrapper[4903]: I1203 00:15:54.740596 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ls5hb" event={"ID":"85ae109a-7ec8-429f-9114-8acaf129c82d","Type":"ContainerStarted","Data":"1a0dbb9e9d605dbfc29cbb17e07beab8d497f1e3b19f9d556b91616b6c8b8120"} Dec 03 00:15:55 crc kubenswrapper[4903]: I1203 00:15:55.754037 4903 generic.go:334] "Generic (PLEG): container finished" podID="85ae109a-7ec8-429f-9114-8acaf129c82d" containerID="1a0dbb9e9d605dbfc29cbb17e07beab8d497f1e3b19f9d556b91616b6c8b8120" exitCode=0 Dec 03 00:15:55 crc kubenswrapper[4903]: I1203 00:15:55.754178 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ls5hb" event={"ID":"85ae109a-7ec8-429f-9114-8acaf129c82d","Type":"ContainerDied","Data":"1a0dbb9e9d605dbfc29cbb17e07beab8d497f1e3b19f9d556b91616b6c8b8120"} Dec 03 00:15:56 crc kubenswrapper[4903]: I1203 00:15:56.770613 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ls5hb" event={"ID":"85ae109a-7ec8-429f-9114-8acaf129c82d","Type":"ContainerStarted","Data":"730a719d6c3a5df9e47ec8d9050e0d98b49d46042791abf7acfa642bc02618df"} Dec 03 00:15:56 crc kubenswrapper[4903]: I1203 00:15:56.803316 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ls5hb" podStartSLOduration=2.391918121 podStartE2EDuration="4.803293303s" podCreationTimestamp="2025-12-03 00:15:52 +0000 UTC" firstStartedPulling="2025-12-03 00:15:53.726935039 +0000 UTC m=+4692.435489322" lastFinishedPulling="2025-12-03 00:15:56.138310211 +0000 UTC m=+4694.846864504" observedRunningTime="2025-12-03 00:15:56.789980101 +0000 UTC m=+4695.498534404" watchObservedRunningTime="2025-12-03 00:15:56.803293303 +0000 UTC m=+4695.511847586" Dec 03 00:16:02 crc kubenswrapper[4903]: I1203 00:16:02.433735 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-ls5hb" Dec 03 00:16:02 crc kubenswrapper[4903]: I1203 00:16:02.434349 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ls5hb" Dec 03 00:16:02 crc kubenswrapper[4903]: I1203 00:16:02.487704 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ls5hb" Dec 03 00:16:02 crc kubenswrapper[4903]: I1203 00:16:02.896806 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ls5hb" Dec 03 00:16:02 crc kubenswrapper[4903]: I1203 00:16:02.947515 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ls5hb"] Dec 03 00:16:04 crc kubenswrapper[4903]: I1203 00:16:04.856951 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ls5hb" podUID="85ae109a-7ec8-429f-9114-8acaf129c82d" containerName="registry-server" containerID="cri-o://730a719d6c3a5df9e47ec8d9050e0d98b49d46042791abf7acfa642bc02618df" gracePeriod=2 Dec 03 00:16:05 crc kubenswrapper[4903]: I1203 00:16:05.355943 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ls5hb" Dec 03 00:16:05 crc kubenswrapper[4903]: I1203 00:16:05.472065 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85ae109a-7ec8-429f-9114-8acaf129c82d-utilities\") pod \"85ae109a-7ec8-429f-9114-8acaf129c82d\" (UID: \"85ae109a-7ec8-429f-9114-8acaf129c82d\") " Dec 03 00:16:05 crc kubenswrapper[4903]: I1203 00:16:05.472480 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85ae109a-7ec8-429f-9114-8acaf129c82d-catalog-content\") pod \"85ae109a-7ec8-429f-9114-8acaf129c82d\" (UID: \"85ae109a-7ec8-429f-9114-8acaf129c82d\") " Dec 03 00:16:05 crc kubenswrapper[4903]: I1203 00:16:05.472516 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpmrj\" (UniqueName: \"kubernetes.io/projected/85ae109a-7ec8-429f-9114-8acaf129c82d-kube-api-access-wpmrj\") pod \"85ae109a-7ec8-429f-9114-8acaf129c82d\" (UID: \"85ae109a-7ec8-429f-9114-8acaf129c82d\") " Dec 03 00:16:05 crc kubenswrapper[4903]: I1203 00:16:05.473334 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85ae109a-7ec8-429f-9114-8acaf129c82d-utilities" (OuterVolumeSpecName: "utilities") pod "85ae109a-7ec8-429f-9114-8acaf129c82d" (UID: "85ae109a-7ec8-429f-9114-8acaf129c82d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:16:05 crc kubenswrapper[4903]: I1203 00:16:05.479348 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85ae109a-7ec8-429f-9114-8acaf129c82d-kube-api-access-wpmrj" (OuterVolumeSpecName: "kube-api-access-wpmrj") pod "85ae109a-7ec8-429f-9114-8acaf129c82d" (UID: "85ae109a-7ec8-429f-9114-8acaf129c82d"). InnerVolumeSpecName "kube-api-access-wpmrj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:16:05 crc kubenswrapper[4903]: I1203 00:16:05.490550 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85ae109a-7ec8-429f-9114-8acaf129c82d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "85ae109a-7ec8-429f-9114-8acaf129c82d" (UID: "85ae109a-7ec8-429f-9114-8acaf129c82d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:16:05 crc kubenswrapper[4903]: I1203 00:16:05.574547 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85ae109a-7ec8-429f-9114-8acaf129c82d-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 00:16:05 crc kubenswrapper[4903]: I1203 00:16:05.574803 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85ae109a-7ec8-429f-9114-8acaf129c82d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 00:16:05 crc kubenswrapper[4903]: I1203 00:16:05.574875 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpmrj\" (UniqueName: \"kubernetes.io/projected/85ae109a-7ec8-429f-9114-8acaf129c82d-kube-api-access-wpmrj\") on node \"crc\" DevicePath \"\"" Dec 03 00:16:05 crc kubenswrapper[4903]: I1203 00:16:05.868468 4903 generic.go:334] "Generic (PLEG): container finished" podID="85ae109a-7ec8-429f-9114-8acaf129c82d" containerID="730a719d6c3a5df9e47ec8d9050e0d98b49d46042791abf7acfa642bc02618df" exitCode=0 Dec 03 00:16:05 crc kubenswrapper[4903]: I1203 00:16:05.868523 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ls5hb" event={"ID":"85ae109a-7ec8-429f-9114-8acaf129c82d","Type":"ContainerDied","Data":"730a719d6c3a5df9e47ec8d9050e0d98b49d46042791abf7acfa642bc02618df"} Dec 03 00:16:05 crc kubenswrapper[4903]: I1203 00:16:05.868553 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ls5hb" event={"ID":"85ae109a-7ec8-429f-9114-8acaf129c82d","Type":"ContainerDied","Data":"e3c965cfa59b16608373de9d91b170b6e1111f68e9d80c5906ebc7748961c551"} Dec 03 00:16:05 crc kubenswrapper[4903]: I1203 00:16:05.868557 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ls5hb" Dec 03 00:16:05 crc kubenswrapper[4903]: I1203 00:16:05.868573 4903 scope.go:117] "RemoveContainer" containerID="730a719d6c3a5df9e47ec8d9050e0d98b49d46042791abf7acfa642bc02618df" Dec 03 00:16:05 crc kubenswrapper[4903]: I1203 00:16:05.894612 4903 scope.go:117] "RemoveContainer" containerID="1a0dbb9e9d605dbfc29cbb17e07beab8d497f1e3b19f9d556b91616b6c8b8120" Dec 03 00:16:05 crc kubenswrapper[4903]: I1203 00:16:05.903075 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ls5hb"] Dec 03 00:16:05 crc kubenswrapper[4903]: I1203 00:16:05.919362 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ls5hb"] Dec 03 00:16:05 crc kubenswrapper[4903]: I1203 00:16:05.923753 4903 scope.go:117] "RemoveContainer" containerID="c0e49615f000b86f2a4af4c49b77dee3a433507ad01a37f31a5bd0a3e3145fc0" Dec 03 00:16:05 crc kubenswrapper[4903]: I1203 00:16:05.985695 4903 scope.go:117] "RemoveContainer" containerID="730a719d6c3a5df9e47ec8d9050e0d98b49d46042791abf7acfa642bc02618df" Dec 03 00:16:05 crc kubenswrapper[4903]: E1203 00:16:05.986213 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"730a719d6c3a5df9e47ec8d9050e0d98b49d46042791abf7acfa642bc02618df\": container with ID starting with 730a719d6c3a5df9e47ec8d9050e0d98b49d46042791abf7acfa642bc02618df not found: ID does not exist" containerID="730a719d6c3a5df9e47ec8d9050e0d98b49d46042791abf7acfa642bc02618df" Dec 03 00:16:05 crc kubenswrapper[4903]: I1203 00:16:05.986243 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"730a719d6c3a5df9e47ec8d9050e0d98b49d46042791abf7acfa642bc02618df"} err="failed to get container status \"730a719d6c3a5df9e47ec8d9050e0d98b49d46042791abf7acfa642bc02618df\": rpc error: code = NotFound desc = could not find container \"730a719d6c3a5df9e47ec8d9050e0d98b49d46042791abf7acfa642bc02618df\": container with ID starting with 730a719d6c3a5df9e47ec8d9050e0d98b49d46042791abf7acfa642bc02618df not found: ID does not exist" Dec 03 00:16:05 crc kubenswrapper[4903]: I1203 00:16:05.986267 4903 scope.go:117] "RemoveContainer" containerID="1a0dbb9e9d605dbfc29cbb17e07beab8d497f1e3b19f9d556b91616b6c8b8120" Dec 03 00:16:05 crc kubenswrapper[4903]: E1203 00:16:05.986706 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a0dbb9e9d605dbfc29cbb17e07beab8d497f1e3b19f9d556b91616b6c8b8120\": container with ID starting with 1a0dbb9e9d605dbfc29cbb17e07beab8d497f1e3b19f9d556b91616b6c8b8120 not found: ID does not exist" containerID="1a0dbb9e9d605dbfc29cbb17e07beab8d497f1e3b19f9d556b91616b6c8b8120" Dec 03 00:16:05 crc kubenswrapper[4903]: I1203 00:16:05.986759 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a0dbb9e9d605dbfc29cbb17e07beab8d497f1e3b19f9d556b91616b6c8b8120"} err="failed to get container status \"1a0dbb9e9d605dbfc29cbb17e07beab8d497f1e3b19f9d556b91616b6c8b8120\": rpc error: code = NotFound desc = could not find container \"1a0dbb9e9d605dbfc29cbb17e07beab8d497f1e3b19f9d556b91616b6c8b8120\": container with ID starting with 1a0dbb9e9d605dbfc29cbb17e07beab8d497f1e3b19f9d556b91616b6c8b8120 not found: ID does not exist" Dec 03 00:16:05 crc kubenswrapper[4903]: I1203 00:16:05.986792 4903 scope.go:117] "RemoveContainer" 
containerID="c0e49615f000b86f2a4af4c49b77dee3a433507ad01a37f31a5bd0a3e3145fc0" Dec 03 00:16:05 crc kubenswrapper[4903]: E1203 00:16:05.987078 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0e49615f000b86f2a4af4c49b77dee3a433507ad01a37f31a5bd0a3e3145fc0\": container with ID starting with c0e49615f000b86f2a4af4c49b77dee3a433507ad01a37f31a5bd0a3e3145fc0 not found: ID does not exist" containerID="c0e49615f000b86f2a4af4c49b77dee3a433507ad01a37f31a5bd0a3e3145fc0" Dec 03 00:16:05 crc kubenswrapper[4903]: I1203 00:16:05.987105 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0e49615f000b86f2a4af4c49b77dee3a433507ad01a37f31a5bd0a3e3145fc0"} err="failed to get container status \"c0e49615f000b86f2a4af4c49b77dee3a433507ad01a37f31a5bd0a3e3145fc0\": rpc error: code = NotFound desc = could not find container \"c0e49615f000b86f2a4af4c49b77dee3a433507ad01a37f31a5bd0a3e3145fc0\": container with ID starting with c0e49615f000b86f2a4af4c49b77dee3a433507ad01a37f31a5bd0a3e3145fc0 not found: ID does not exist" Dec 03 00:16:07 crc kubenswrapper[4903]: I1203 00:16:07.623158 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85ae109a-7ec8-429f-9114-8acaf129c82d" path="/var/lib/kubelet/pods/85ae109a-7ec8-429f-9114-8acaf129c82d/volumes" Dec 03 00:16:20 crc kubenswrapper[4903]: I1203 00:16:20.580323 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7k6p8"] Dec 03 00:16:20 crc kubenswrapper[4903]: E1203 00:16:20.582965 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85ae109a-7ec8-429f-9114-8acaf129c82d" containerName="extract-content" Dec 03 00:16:20 crc kubenswrapper[4903]: I1203 00:16:20.583000 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="85ae109a-7ec8-429f-9114-8acaf129c82d" containerName="extract-content" Dec 03 00:16:20 crc kubenswrapper[4903]: E1203 00:16:20.583033 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85ae109a-7ec8-429f-9114-8acaf129c82d" containerName="extract-utilities" Dec 03 00:16:20 crc kubenswrapper[4903]: I1203 00:16:20.583041 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="85ae109a-7ec8-429f-9114-8acaf129c82d" containerName="extract-utilities" Dec 03 00:16:20 crc kubenswrapper[4903]: E1203 00:16:20.583058 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85ae109a-7ec8-429f-9114-8acaf129c82d" containerName="registry-server" Dec 03 00:16:20 crc kubenswrapper[4903]: I1203 00:16:20.583066 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="85ae109a-7ec8-429f-9114-8acaf129c82d" containerName="registry-server" Dec 03 00:16:20 crc kubenswrapper[4903]: I1203 00:16:20.583351 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="85ae109a-7ec8-429f-9114-8acaf129c82d" containerName="registry-server" Dec 03 00:16:20 crc kubenswrapper[4903]: I1203 00:16:20.585424 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7k6p8" Dec 03 00:16:20 crc kubenswrapper[4903]: I1203 00:16:20.593708 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7k6p8"] Dec 03 00:16:20 crc kubenswrapper[4903]: I1203 00:16:20.698740 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svzvv\" (UniqueName: \"kubernetes.io/projected/2b38f7a5-ca3c-4b8e-8799-4e898b1bda74-kube-api-access-svzvv\") pod \"redhat-operators-7k6p8\" (UID: \"2b38f7a5-ca3c-4b8e-8799-4e898b1bda74\") " pod="openshift-marketplace/redhat-operators-7k6p8" Dec 03 00:16:20 crc kubenswrapper[4903]: I1203 00:16:20.698913 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b38f7a5-ca3c-4b8e-8799-4e898b1bda74-utilities\") pod \"redhat-operators-7k6p8\" (UID: \"2b38f7a5-ca3c-4b8e-8799-4e898b1bda74\") " pod="openshift-marketplace/redhat-operators-7k6p8" Dec 03 00:16:20 crc kubenswrapper[4903]: I1203 00:16:20.699377 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b38f7a5-ca3c-4b8e-8799-4e898b1bda74-catalog-content\") pod \"redhat-operators-7k6p8\" (UID: \"2b38f7a5-ca3c-4b8e-8799-4e898b1bda74\") " pod="openshift-marketplace/redhat-operators-7k6p8" Dec 03 00:16:20 crc kubenswrapper[4903]: I1203 00:16:20.801919 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b38f7a5-ca3c-4b8e-8799-4e898b1bda74-utilities\") pod \"redhat-operators-7k6p8\" (UID: \"2b38f7a5-ca3c-4b8e-8799-4e898b1bda74\") " pod="openshift-marketplace/redhat-operators-7k6p8" Dec 03 00:16:20 crc kubenswrapper[4903]: I1203 00:16:20.802178 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b38f7a5-ca3c-4b8e-8799-4e898b1bda74-catalog-content\") pod \"redhat-operators-7k6p8\" (UID: \"2b38f7a5-ca3c-4b8e-8799-4e898b1bda74\") " pod="openshift-marketplace/redhat-operators-7k6p8" Dec 03 00:16:20 crc kubenswrapper[4903]: I1203 00:16:20.802236 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svzvv\" (UniqueName: \"kubernetes.io/projected/2b38f7a5-ca3c-4b8e-8799-4e898b1bda74-kube-api-access-svzvv\") pod \"redhat-operators-7k6p8\" (UID: \"2b38f7a5-ca3c-4b8e-8799-4e898b1bda74\") " pod="openshift-marketplace/redhat-operators-7k6p8" Dec 03 00:16:20 crc kubenswrapper[4903]: I1203 00:16:20.802453 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b38f7a5-ca3c-4b8e-8799-4e898b1bda74-utilities\") pod \"redhat-operators-7k6p8\" (UID: \"2b38f7a5-ca3c-4b8e-8799-4e898b1bda74\") " pod="openshift-marketplace/redhat-operators-7k6p8" Dec 03 00:16:20 crc kubenswrapper[4903]: I1203 00:16:20.802933 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b38f7a5-ca3c-4b8e-8799-4e898b1bda74-catalog-content\") pod \"redhat-operators-7k6p8\" (UID: \"2b38f7a5-ca3c-4b8e-8799-4e898b1bda74\") " pod="openshift-marketplace/redhat-operators-7k6p8" Dec 03 00:16:20 crc kubenswrapper[4903]: I1203 00:16:20.827391 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-svzvv\" (UniqueName: \"kubernetes.io/projected/2b38f7a5-ca3c-4b8e-8799-4e898b1bda74-kube-api-access-svzvv\") pod \"redhat-operators-7k6p8\" (UID: \"2b38f7a5-ca3c-4b8e-8799-4e898b1bda74\") " pod="openshift-marketplace/redhat-operators-7k6p8" Dec 03 00:16:20 crc kubenswrapper[4903]: I1203 00:16:20.922978 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7k6p8" Dec 03 00:16:21 crc kubenswrapper[4903]: I1203 00:16:21.598936 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7k6p8"] Dec 03 00:16:22 crc kubenswrapper[4903]: I1203 00:16:22.058603 4903 generic.go:334] "Generic (PLEG): container finished" podID="2b38f7a5-ca3c-4b8e-8799-4e898b1bda74" containerID="7e110828c8bd549923075dc3968a30e9ad4151ecce54cc41ef85098cbc8295e9" exitCode=0 Dec 03 00:16:22 crc kubenswrapper[4903]: I1203 00:16:22.058690 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7k6p8" event={"ID":"2b38f7a5-ca3c-4b8e-8799-4e898b1bda74","Type":"ContainerDied","Data":"7e110828c8bd549923075dc3968a30e9ad4151ecce54cc41ef85098cbc8295e9"} Dec 03 00:16:22 crc kubenswrapper[4903]: I1203 00:16:22.058895 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7k6p8" event={"ID":"2b38f7a5-ca3c-4b8e-8799-4e898b1bda74","Type":"ContainerStarted","Data":"0952b97b6844d07f75fa8d5645dcc5e918374233e6a619eb44152f414b231b3b"} Dec 03 00:16:23 crc kubenswrapper[4903]: I1203 00:16:23.069486 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 00:16:23 crc kubenswrapper[4903]: I1203 00:16:23.071815 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 00:16:23 crc kubenswrapper[4903]: I1203 00:16:23.085576 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7k6p8" event={"ID":"2b38f7a5-ca3c-4b8e-8799-4e898b1bda74","Type":"ContainerStarted","Data":"76978ed916f8eebf6bda96906621ac8a564920583c568bfb25978eae29bcc2ae"} Dec 03 00:16:27 crc kubenswrapper[4903]: I1203 00:16:27.134232 4903 generic.go:334] "Generic (PLEG): container finished" podID="2b38f7a5-ca3c-4b8e-8799-4e898b1bda74" containerID="76978ed916f8eebf6bda96906621ac8a564920583c568bfb25978eae29bcc2ae" exitCode=0 Dec 03 00:16:27 crc kubenswrapper[4903]: I1203 00:16:27.134280 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7k6p8" event={"ID":"2b38f7a5-ca3c-4b8e-8799-4e898b1bda74","Type":"ContainerDied","Data":"76978ed916f8eebf6bda96906621ac8a564920583c568bfb25978eae29bcc2ae"} Dec 03 00:16:29 crc kubenswrapper[4903]: I1203 00:16:29.155822 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7k6p8" event={"ID":"2b38f7a5-ca3c-4b8e-8799-4e898b1bda74","Type":"ContainerStarted","Data":"d034e2905f202abad122344c0d742843f772febd390ba19982efc3cb19a01f15"} Dec 03 00:16:29 crc kubenswrapper[4903]: I1203 00:16:29.178806 4903 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7k6p8" podStartSLOduration=2.644689488 podStartE2EDuration="9.178788283s" podCreationTimestamp="2025-12-03 00:16:20 +0000 UTC" firstStartedPulling="2025-12-03 00:16:22.061170845 +0000 UTC m=+4720.769725138" lastFinishedPulling="2025-12-03 00:16:28.59526963 +0000 UTC m=+4727.303823933" observedRunningTime="2025-12-03 00:16:29.176370175 +0000 UTC m=+4727.884924488" watchObservedRunningTime="2025-12-03 00:16:29.178788283 +0000 UTC m=+4727.887342566" Dec 03 00:16:30 crc kubenswrapper[4903]: I1203 00:16:30.923964 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7k6p8" Dec 03 00:16:30 crc kubenswrapper[4903]: I1203 00:16:30.924291 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7k6p8" Dec 03 00:16:31 crc kubenswrapper[4903]: I1203 00:16:31.067300 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7vfrg"] Dec 03 00:16:31 crc kubenswrapper[4903]: I1203 00:16:31.070503 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7vfrg" Dec 03 00:16:31 crc kubenswrapper[4903]: I1203 00:16:31.084816 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7vfrg"] Dec 03 00:16:31 crc kubenswrapper[4903]: I1203 00:16:31.235232 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwnpj\" (UniqueName: \"kubernetes.io/projected/64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14-kube-api-access-cwnpj\") pod \"certified-operators-7vfrg\" (UID: \"64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14\") " pod="openshift-marketplace/certified-operators-7vfrg" Dec 03 00:16:31 crc kubenswrapper[4903]: I1203 00:16:31.235364 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14-catalog-content\") pod \"certified-operators-7vfrg\" (UID: \"64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14\") " pod="openshift-marketplace/certified-operators-7vfrg" Dec 03 00:16:31 crc kubenswrapper[4903]: I1203 00:16:31.235508 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14-utilities\") pod \"certified-operators-7vfrg\" (UID: \"64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14\") " pod="openshift-marketplace/certified-operators-7vfrg" Dec 03 00:16:31 crc kubenswrapper[4903]: I1203 00:16:31.337900 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwnpj\" (UniqueName: \"kubernetes.io/projected/64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14-kube-api-access-cwnpj\") pod \"certified-operators-7vfrg\" (UID: \"64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14\") " pod="openshift-marketplace/certified-operators-7vfrg" Dec 03 00:16:31 crc kubenswrapper[4903]: I1203 00:16:31.338102 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14-catalog-content\") pod \"certified-operators-7vfrg\" (UID: \"64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14\") " pod="openshift-marketplace/certified-operators-7vfrg" Dec 03 00:16:31 crc 
kubenswrapper[4903]: I1203 00:16:31.338151 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14-utilities\") pod \"certified-operators-7vfrg\" (UID: \"64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14\") " pod="openshift-marketplace/certified-operators-7vfrg" Dec 03 00:16:31 crc kubenswrapper[4903]: I1203 00:16:31.338674 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14-utilities\") pod \"certified-operators-7vfrg\" (UID: \"64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14\") " pod="openshift-marketplace/certified-operators-7vfrg" Dec 03 00:16:31 crc kubenswrapper[4903]: I1203 00:16:31.338782 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14-catalog-content\") pod \"certified-operators-7vfrg\" (UID: \"64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14\") " pod="openshift-marketplace/certified-operators-7vfrg" Dec 03 00:16:31 crc kubenswrapper[4903]: I1203 00:16:31.364672 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwnpj\" (UniqueName: \"kubernetes.io/projected/64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14-kube-api-access-cwnpj\") pod \"certified-operators-7vfrg\" (UID: \"64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14\") " pod="openshift-marketplace/certified-operators-7vfrg" Dec 03 00:16:31 crc kubenswrapper[4903]: I1203 00:16:31.395551 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7vfrg" Dec 03 00:16:31 crc kubenswrapper[4903]: I1203 00:16:31.977331 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7k6p8" podUID="2b38f7a5-ca3c-4b8e-8799-4e898b1bda74" containerName="registry-server" probeResult="failure" output=< Dec 03 00:16:31 crc kubenswrapper[4903]: timeout: failed to connect service ":50051" within 1s Dec 03 00:16:31 crc kubenswrapper[4903]: > Dec 03 00:16:32 crc kubenswrapper[4903]: I1203 00:16:32.008709 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7vfrg"] Dec 03 00:16:32 crc kubenswrapper[4903]: W1203 00:16:32.019271 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64b24a8f_a0b1_4ff5_9bf9_6e5ab436db14.slice/crio-868d87ad5efcaf823ab492373e2e33f32b88fd06c008f5e535e134d04f8d9db7 WatchSource:0}: Error finding container 868d87ad5efcaf823ab492373e2e33f32b88fd06c008f5e535e134d04f8d9db7: Status 404 returned error can't find the container with id 868d87ad5efcaf823ab492373e2e33f32b88fd06c008f5e535e134d04f8d9db7 Dec 03 00:16:32 crc kubenswrapper[4903]: I1203 00:16:32.185129 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vfrg" event={"ID":"64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14","Type":"ContainerStarted","Data":"868d87ad5efcaf823ab492373e2e33f32b88fd06c008f5e535e134d04f8d9db7"} Dec 03 00:16:33 crc kubenswrapper[4903]: I1203 00:16:33.194726 4903 generic.go:334] "Generic (PLEG): container finished" podID="64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14" containerID="74ed232161947e4aec6e7c3ca48f99f74be55b53b71ae9de3be337e9168523cd" exitCode=0 Dec 03 00:16:33 crc kubenswrapper[4903]: I1203 00:16:33.194833 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-7vfrg" event={"ID":"64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14","Type":"ContainerDied","Data":"74ed232161947e4aec6e7c3ca48f99f74be55b53b71ae9de3be337e9168523cd"} Dec 03 00:16:34 crc kubenswrapper[4903]: I1203 00:16:34.207507 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vfrg" event={"ID":"64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14","Type":"ContainerStarted","Data":"1d9ad5848fde10c3ebf3baeb5a26bcf2a25b19f51bcea2911915c4717d24816e"} Dec 03 00:16:35 crc kubenswrapper[4903]: I1203 00:16:35.217308 4903 generic.go:334] "Generic (PLEG): container finished" podID="64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14" containerID="1d9ad5848fde10c3ebf3baeb5a26bcf2a25b19f51bcea2911915c4717d24816e" exitCode=0 Dec 03 00:16:35 crc kubenswrapper[4903]: I1203 00:16:35.217411 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vfrg" event={"ID":"64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14","Type":"ContainerDied","Data":"1d9ad5848fde10c3ebf3baeb5a26bcf2a25b19f51bcea2911915c4717d24816e"} Dec 03 00:16:36 crc kubenswrapper[4903]: I1203 00:16:36.231308 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vfrg" event={"ID":"64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14","Type":"ContainerStarted","Data":"b6027c457e00bc004fcf9c143a88d5ba9fab1dadea4ca0c143fbfef8bd3d5b9c"} Dec 03 00:16:36 crc kubenswrapper[4903]: I1203 00:16:36.263963 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7vfrg" podStartSLOduration=2.722917232 podStartE2EDuration="5.263937258s" podCreationTimestamp="2025-12-03 00:16:31 +0000 UTC" firstStartedPulling="2025-12-03 00:16:33.197823191 +0000 UTC m=+4731.906377504" lastFinishedPulling="2025-12-03 00:16:35.738843247 +0000 UTC m=+4734.447397530" observedRunningTime="2025-12-03 00:16:36.252094192 +0000 UTC m=+4734.960648485" watchObservedRunningTime="2025-12-03 00:16:36.263937258 +0000 UTC m=+4734.972491551" Dec 03 00:16:41 crc kubenswrapper[4903]: I1203 00:16:41.395833 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7vfrg" Dec 03 00:16:41 crc kubenswrapper[4903]: I1203 00:16:41.396368 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7vfrg" Dec 03 00:16:41 crc kubenswrapper[4903]: I1203 00:16:41.460075 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7vfrg" Dec 03 00:16:41 crc kubenswrapper[4903]: I1203 00:16:41.981593 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7k6p8" podUID="2b38f7a5-ca3c-4b8e-8799-4e898b1bda74" containerName="registry-server" probeResult="failure" output=< Dec 03 00:16:41 crc kubenswrapper[4903]: timeout: failed to connect service ":50051" within 1s Dec 03 00:16:41 crc kubenswrapper[4903]: > Dec 03 00:16:42 crc kubenswrapper[4903]: I1203 00:16:42.366070 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7vfrg" Dec 03 00:16:42 crc kubenswrapper[4903]: I1203 00:16:42.415345 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7vfrg"] Dec 03 00:16:44 crc kubenswrapper[4903]: I1203 00:16:44.333035 4903 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-7vfrg" podUID="64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14" containerName="registry-server" containerID="cri-o://b6027c457e00bc004fcf9c143a88d5ba9fab1dadea4ca0c143fbfef8bd3d5b9c" gracePeriod=2 Dec 03 00:16:45 crc kubenswrapper[4903]: I1203 00:16:45.228188 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7vfrg" Dec 03 00:16:45 crc kubenswrapper[4903]: I1203 00:16:45.343849 4903 generic.go:334] "Generic (PLEG): container finished" podID="64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14" containerID="b6027c457e00bc004fcf9c143a88d5ba9fab1dadea4ca0c143fbfef8bd3d5b9c" exitCode=0 Dec 03 00:16:45 crc kubenswrapper[4903]: I1203 00:16:45.343889 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7vfrg" Dec 03 00:16:45 crc kubenswrapper[4903]: I1203 00:16:45.343894 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vfrg" event={"ID":"64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14","Type":"ContainerDied","Data":"b6027c457e00bc004fcf9c143a88d5ba9fab1dadea4ca0c143fbfef8bd3d5b9c"} Dec 03 00:16:45 crc kubenswrapper[4903]: I1203 00:16:45.343923 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vfrg" event={"ID":"64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14","Type":"ContainerDied","Data":"868d87ad5efcaf823ab492373e2e33f32b88fd06c008f5e535e134d04f8d9db7"} Dec 03 00:16:45 crc kubenswrapper[4903]: I1203 00:16:45.343941 4903 scope.go:117] "RemoveContainer" containerID="b6027c457e00bc004fcf9c143a88d5ba9fab1dadea4ca0c143fbfef8bd3d5b9c" Dec 03 00:16:45 crc kubenswrapper[4903]: I1203 00:16:45.370701 4903 scope.go:117] "RemoveContainer" containerID="1d9ad5848fde10c3ebf3baeb5a26bcf2a25b19f51bcea2911915c4717d24816e" Dec 03 00:16:45 crc kubenswrapper[4903]: I1203 00:16:45.379142 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwnpj\" (UniqueName: \"kubernetes.io/projected/64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14-kube-api-access-cwnpj\") pod \"64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14\" (UID: \"64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14\") " Dec 03 00:16:45 crc kubenswrapper[4903]: I1203 00:16:45.379355 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14-catalog-content\") pod \"64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14\" (UID: \"64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14\") " Dec 03 00:16:45 crc kubenswrapper[4903]: I1203 00:16:45.379518 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14-utilities\") pod \"64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14\" (UID: \"64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14\") " Dec 03 00:16:45 crc kubenswrapper[4903]: I1203 00:16:45.383884 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14-utilities" (OuterVolumeSpecName: "utilities") pod "64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14" (UID: "64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:16:45 crc kubenswrapper[4903]: I1203 00:16:45.390059 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14-kube-api-access-cwnpj" (OuterVolumeSpecName: "kube-api-access-cwnpj") pod "64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14" (UID: "64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14"). InnerVolumeSpecName "kube-api-access-cwnpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:16:45 crc kubenswrapper[4903]: I1203 00:16:45.408231 4903 scope.go:117] "RemoveContainer" containerID="74ed232161947e4aec6e7c3ca48f99f74be55b53b71ae9de3be337e9168523cd" Dec 03 00:16:45 crc kubenswrapper[4903]: I1203 00:16:45.424369 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14" (UID: "64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:16:45 crc kubenswrapper[4903]: I1203 00:16:45.481552 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 00:16:45 crc kubenswrapper[4903]: I1203 00:16:45.481577 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 00:16:45 crc kubenswrapper[4903]: I1203 00:16:45.481588 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwnpj\" (UniqueName: \"kubernetes.io/projected/64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14-kube-api-access-cwnpj\") on node \"crc\" DevicePath \"\"" Dec 03 00:16:45 crc kubenswrapper[4903]: I1203 00:16:45.530170 4903 scope.go:117] "RemoveContainer" containerID="b6027c457e00bc004fcf9c143a88d5ba9fab1dadea4ca0c143fbfef8bd3d5b9c" Dec 03 00:16:45 crc kubenswrapper[4903]: E1203 00:16:45.530614 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6027c457e00bc004fcf9c143a88d5ba9fab1dadea4ca0c143fbfef8bd3d5b9c\": container with ID starting with b6027c457e00bc004fcf9c143a88d5ba9fab1dadea4ca0c143fbfef8bd3d5b9c not found: ID does not exist" containerID="b6027c457e00bc004fcf9c143a88d5ba9fab1dadea4ca0c143fbfef8bd3d5b9c" Dec 03 00:16:45 crc kubenswrapper[4903]: I1203 00:16:45.530720 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6027c457e00bc004fcf9c143a88d5ba9fab1dadea4ca0c143fbfef8bd3d5b9c"} err="failed to get container status \"b6027c457e00bc004fcf9c143a88d5ba9fab1dadea4ca0c143fbfef8bd3d5b9c\": rpc error: code = NotFound desc = could not find container \"b6027c457e00bc004fcf9c143a88d5ba9fab1dadea4ca0c143fbfef8bd3d5b9c\": container with ID starting with b6027c457e00bc004fcf9c143a88d5ba9fab1dadea4ca0c143fbfef8bd3d5b9c not found: ID does not exist" Dec 03 00:16:45 crc kubenswrapper[4903]: I1203 00:16:45.530747 4903 scope.go:117] "RemoveContainer" containerID="1d9ad5848fde10c3ebf3baeb5a26bcf2a25b19f51bcea2911915c4717d24816e" Dec 03 00:16:45 crc kubenswrapper[4903]: E1203 00:16:45.531101 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"1d9ad5848fde10c3ebf3baeb5a26bcf2a25b19f51bcea2911915c4717d24816e\": container with ID starting with 1d9ad5848fde10c3ebf3baeb5a26bcf2a25b19f51bcea2911915c4717d24816e not found: ID does not exist" containerID="1d9ad5848fde10c3ebf3baeb5a26bcf2a25b19f51bcea2911915c4717d24816e" Dec 03 00:16:45 crc kubenswrapper[4903]: I1203 00:16:45.531157 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d9ad5848fde10c3ebf3baeb5a26bcf2a25b19f51bcea2911915c4717d24816e"} err="failed to get container status \"1d9ad5848fde10c3ebf3baeb5a26bcf2a25b19f51bcea2911915c4717d24816e\": rpc error: code = NotFound desc = could not find container \"1d9ad5848fde10c3ebf3baeb5a26bcf2a25b19f51bcea2911915c4717d24816e\": container with ID starting with 1d9ad5848fde10c3ebf3baeb5a26bcf2a25b19f51bcea2911915c4717d24816e not found: ID does not exist" Dec 03 00:16:45 crc kubenswrapper[4903]: I1203 00:16:45.531192 4903 scope.go:117] "RemoveContainer" containerID="74ed232161947e4aec6e7c3ca48f99f74be55b53b71ae9de3be337e9168523cd" Dec 03 00:16:45 crc kubenswrapper[4903]: E1203 00:16:45.531542 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74ed232161947e4aec6e7c3ca48f99f74be55b53b71ae9de3be337e9168523cd\": container with ID starting with 74ed232161947e4aec6e7c3ca48f99f74be55b53b71ae9de3be337e9168523cd not found: ID does not exist" containerID="74ed232161947e4aec6e7c3ca48f99f74be55b53b71ae9de3be337e9168523cd" Dec 03 00:16:45 crc kubenswrapper[4903]: I1203 00:16:45.531588 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74ed232161947e4aec6e7c3ca48f99f74be55b53b71ae9de3be337e9168523cd"} err="failed to get container status \"74ed232161947e4aec6e7c3ca48f99f74be55b53b71ae9de3be337e9168523cd\": rpc error: code = NotFound desc = could not find container \"74ed232161947e4aec6e7c3ca48f99f74be55b53b71ae9de3be337e9168523cd\": container with ID starting with 74ed232161947e4aec6e7c3ca48f99f74be55b53b71ae9de3be337e9168523cd not found: ID does not exist" Dec 03 00:16:45 crc kubenswrapper[4903]: I1203 00:16:45.675916 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7vfrg"] Dec 03 00:16:45 crc kubenswrapper[4903]: I1203 00:16:45.682773 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7vfrg"] Dec 03 00:16:47 crc kubenswrapper[4903]: I1203 00:16:47.627036 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14" path="/var/lib/kubelet/pods/64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14/volumes" Dec 03 00:16:50 crc kubenswrapper[4903]: I1203 00:16:50.981982 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7k6p8" Dec 03 00:16:51 crc kubenswrapper[4903]: I1203 00:16:51.042036 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7k6p8" Dec 03 00:16:51 crc kubenswrapper[4903]: I1203 00:16:51.778314 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7k6p8"] Dec 03 00:16:52 crc kubenswrapper[4903]: I1203 00:16:52.417067 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7k6p8" podUID="2b38f7a5-ca3c-4b8e-8799-4e898b1bda74" containerName="registry-server" 
containerID="cri-o://d034e2905f202abad122344c0d742843f772febd390ba19982efc3cb19a01f15" gracePeriod=2 Dec 03 00:16:52 crc kubenswrapper[4903]: I1203 00:16:52.910738 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7k6p8" Dec 03 00:16:53 crc kubenswrapper[4903]: I1203 00:16:53.037522 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b38f7a5-ca3c-4b8e-8799-4e898b1bda74-catalog-content\") pod \"2b38f7a5-ca3c-4b8e-8799-4e898b1bda74\" (UID: \"2b38f7a5-ca3c-4b8e-8799-4e898b1bda74\") " Dec 03 00:16:53 crc kubenswrapper[4903]: I1203 00:16:53.037731 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svzvv\" (UniqueName: \"kubernetes.io/projected/2b38f7a5-ca3c-4b8e-8799-4e898b1bda74-kube-api-access-svzvv\") pod \"2b38f7a5-ca3c-4b8e-8799-4e898b1bda74\" (UID: \"2b38f7a5-ca3c-4b8e-8799-4e898b1bda74\") " Dec 03 00:16:53 crc kubenswrapper[4903]: I1203 00:16:53.037803 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b38f7a5-ca3c-4b8e-8799-4e898b1bda74-utilities\") pod \"2b38f7a5-ca3c-4b8e-8799-4e898b1bda74\" (UID: \"2b38f7a5-ca3c-4b8e-8799-4e898b1bda74\") " Dec 03 00:16:53 crc kubenswrapper[4903]: I1203 00:16:53.038899 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b38f7a5-ca3c-4b8e-8799-4e898b1bda74-utilities" (OuterVolumeSpecName: "utilities") pod "2b38f7a5-ca3c-4b8e-8799-4e898b1bda74" (UID: "2b38f7a5-ca3c-4b8e-8799-4e898b1bda74"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:16:53 crc kubenswrapper[4903]: I1203 00:16:53.048108 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b38f7a5-ca3c-4b8e-8799-4e898b1bda74-kube-api-access-svzvv" (OuterVolumeSpecName: "kube-api-access-svzvv") pod "2b38f7a5-ca3c-4b8e-8799-4e898b1bda74" (UID: "2b38f7a5-ca3c-4b8e-8799-4e898b1bda74"). InnerVolumeSpecName "kube-api-access-svzvv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:16:53 crc kubenswrapper[4903]: I1203 00:16:53.070460 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 00:16:53 crc kubenswrapper[4903]: I1203 00:16:53.070521 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 00:16:53 crc kubenswrapper[4903]: I1203 00:16:53.140308 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svzvv\" (UniqueName: \"kubernetes.io/projected/2b38f7a5-ca3c-4b8e-8799-4e898b1bda74-kube-api-access-svzvv\") on node \"crc\" DevicePath \"\"" Dec 03 00:16:53 crc kubenswrapper[4903]: I1203 00:16:53.140594 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b38f7a5-ca3c-4b8e-8799-4e898b1bda74-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 00:16:53 crc kubenswrapper[4903]: I1203 00:16:53.148348 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b38f7a5-ca3c-4b8e-8799-4e898b1bda74-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b38f7a5-ca3c-4b8e-8799-4e898b1bda74" (UID: "2b38f7a5-ca3c-4b8e-8799-4e898b1bda74"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:16:53 crc kubenswrapper[4903]: I1203 00:16:53.241809 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b38f7a5-ca3c-4b8e-8799-4e898b1bda74-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 00:16:53 crc kubenswrapper[4903]: I1203 00:16:53.430358 4903 generic.go:334] "Generic (PLEG): container finished" podID="2b38f7a5-ca3c-4b8e-8799-4e898b1bda74" containerID="d034e2905f202abad122344c0d742843f772febd390ba19982efc3cb19a01f15" exitCode=0 Dec 03 00:16:53 crc kubenswrapper[4903]: I1203 00:16:53.430502 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7k6p8" Dec 03 00:16:53 crc kubenswrapper[4903]: I1203 00:16:53.431491 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7k6p8" event={"ID":"2b38f7a5-ca3c-4b8e-8799-4e898b1bda74","Type":"ContainerDied","Data":"d034e2905f202abad122344c0d742843f772febd390ba19982efc3cb19a01f15"} Dec 03 00:16:53 crc kubenswrapper[4903]: I1203 00:16:53.431574 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7k6p8" event={"ID":"2b38f7a5-ca3c-4b8e-8799-4e898b1bda74","Type":"ContainerDied","Data":"0952b97b6844d07f75fa8d5645dcc5e918374233e6a619eb44152f414b231b3b"} Dec 03 00:16:53 crc kubenswrapper[4903]: I1203 00:16:53.431666 4903 scope.go:117] "RemoveContainer" containerID="d034e2905f202abad122344c0d742843f772febd390ba19982efc3cb19a01f15" Dec 03 00:16:53 crc kubenswrapper[4903]: I1203 00:16:53.454680 4903 scope.go:117] "RemoveContainer" containerID="76978ed916f8eebf6bda96906621ac8a564920583c568bfb25978eae29bcc2ae" Dec 03 00:16:53 crc kubenswrapper[4903]: I1203 00:16:53.480963 4903 scope.go:117] "RemoveContainer" containerID="7e110828c8bd549923075dc3968a30e9ad4151ecce54cc41ef85098cbc8295e9" Dec 03 00:16:53 crc kubenswrapper[4903]: I1203 00:16:53.498999 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7k6p8"] Dec 03 00:16:53 crc kubenswrapper[4903]: I1203 00:16:53.512929 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7k6p8"] Dec 03 00:16:53 crc kubenswrapper[4903]: I1203 00:16:53.526354 4903 scope.go:117] "RemoveContainer" containerID="d034e2905f202abad122344c0d742843f772febd390ba19982efc3cb19a01f15" Dec 03 00:16:53 crc kubenswrapper[4903]: E1203 00:16:53.526873 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d034e2905f202abad122344c0d742843f772febd390ba19982efc3cb19a01f15\": container with ID starting with d034e2905f202abad122344c0d742843f772febd390ba19982efc3cb19a01f15 not found: ID does not exist" containerID="d034e2905f202abad122344c0d742843f772febd390ba19982efc3cb19a01f15" Dec 03 00:16:53 crc kubenswrapper[4903]: I1203 00:16:53.526932 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d034e2905f202abad122344c0d742843f772febd390ba19982efc3cb19a01f15"} err="failed to get container status \"d034e2905f202abad122344c0d742843f772febd390ba19982efc3cb19a01f15\": rpc error: code = NotFound desc = could not find container \"d034e2905f202abad122344c0d742843f772febd390ba19982efc3cb19a01f15\": container with ID starting with d034e2905f202abad122344c0d742843f772febd390ba19982efc3cb19a01f15 not found: ID does not exist" Dec 03 00:16:53 crc kubenswrapper[4903]: I1203 00:16:53.526972 4903 scope.go:117] "RemoveContainer" containerID="76978ed916f8eebf6bda96906621ac8a564920583c568bfb25978eae29bcc2ae" Dec 03 00:16:53 crc kubenswrapper[4903]: E1203 00:16:53.527428 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76978ed916f8eebf6bda96906621ac8a564920583c568bfb25978eae29bcc2ae\": container with ID starting with 76978ed916f8eebf6bda96906621ac8a564920583c568bfb25978eae29bcc2ae not found: ID does not exist" containerID="76978ed916f8eebf6bda96906621ac8a564920583c568bfb25978eae29bcc2ae" Dec 03 00:16:53 crc kubenswrapper[4903]: I1203 00:16:53.527547 4903 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76978ed916f8eebf6bda96906621ac8a564920583c568bfb25978eae29bcc2ae"} err="failed to get container status \"76978ed916f8eebf6bda96906621ac8a564920583c568bfb25978eae29bcc2ae\": rpc error: code = NotFound desc = could not find container \"76978ed916f8eebf6bda96906621ac8a564920583c568bfb25978eae29bcc2ae\": container with ID starting with 76978ed916f8eebf6bda96906621ac8a564920583c568bfb25978eae29bcc2ae not found: ID does not exist" Dec 03 00:16:53 crc kubenswrapper[4903]: I1203 00:16:53.527633 4903 scope.go:117] "RemoveContainer" containerID="7e110828c8bd549923075dc3968a30e9ad4151ecce54cc41ef85098cbc8295e9" Dec 03 00:16:53 crc kubenswrapper[4903]: E1203 00:16:53.527966 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e110828c8bd549923075dc3968a30e9ad4151ecce54cc41ef85098cbc8295e9\": container with ID starting with 7e110828c8bd549923075dc3968a30e9ad4151ecce54cc41ef85098cbc8295e9 not found: ID does not exist" containerID="7e110828c8bd549923075dc3968a30e9ad4151ecce54cc41ef85098cbc8295e9" Dec 03 00:16:53 crc kubenswrapper[4903]: I1203 00:16:53.528003 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e110828c8bd549923075dc3968a30e9ad4151ecce54cc41ef85098cbc8295e9"} err="failed to get container status \"7e110828c8bd549923075dc3968a30e9ad4151ecce54cc41ef85098cbc8295e9\": rpc error: code = NotFound desc = could not find container \"7e110828c8bd549923075dc3968a30e9ad4151ecce54cc41ef85098cbc8295e9\": container with ID starting with 7e110828c8bd549923075dc3968a30e9ad4151ecce54cc41ef85098cbc8295e9 not found: ID does not exist" Dec 03 00:16:53 crc kubenswrapper[4903]: I1203 00:16:53.628260 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b38f7a5-ca3c-4b8e-8799-4e898b1bda74" path="/var/lib/kubelet/pods/2b38f7a5-ca3c-4b8e-8799-4e898b1bda74/volumes" Dec 03 00:17:23 crc kubenswrapper[4903]: I1203 00:17:23.070139 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 00:17:23 crc kubenswrapper[4903]: I1203 00:17:23.070756 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 00:17:23 crc kubenswrapper[4903]: I1203 00:17:23.070802 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" Dec 03 00:17:23 crc kubenswrapper[4903]: I1203 00:17:23.071622 4903 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3465c99a15cf5800e2351b26d635260db68b3b6687c4024f094da729f8cd3bac"} pod="openshift-machine-config-operator/machine-config-daemon-snl4q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 00:17:23 crc kubenswrapper[4903]: I1203 00:17:23.071700 4903 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" containerID="cri-o://3465c99a15cf5800e2351b26d635260db68b3b6687c4024f094da729f8cd3bac" gracePeriod=600 Dec 03 00:17:23 crc kubenswrapper[4903]: I1203 00:17:23.743616 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerDied","Data":"3465c99a15cf5800e2351b26d635260db68b3b6687c4024f094da729f8cd3bac"} Dec 03 00:17:23 crc kubenswrapper[4903]: I1203 00:17:23.744402 4903 scope.go:117] "RemoveContainer" containerID="eb8a49ce128936523d2bf8bc466a202d1982e1b912db39f549d042e19aa4e176" Dec 03 00:17:23 crc kubenswrapper[4903]: I1203 00:17:23.746855 4903 generic.go:334] "Generic (PLEG): container finished" podID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerID="3465c99a15cf5800e2351b26d635260db68b3b6687c4024f094da729f8cd3bac" exitCode=0 Dec 03 00:17:23 crc kubenswrapper[4903]: I1203 00:17:23.746964 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerStarted","Data":"b25c5d8e6359eb4e3f2b0dcbefac7abf53f1327660a93cde6eb36f3c6d18c635"} Dec 03 00:19:23 crc kubenswrapper[4903]: I1203 00:19:23.069633 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 00:19:23 crc kubenswrapper[4903]: I1203 00:19:23.070274 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 00:19:53 crc kubenswrapper[4903]: I1203 00:19:53.070300 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 00:19:53 crc kubenswrapper[4903]: I1203 00:19:53.071256 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 00:20:23 crc kubenswrapper[4903]: I1203 00:20:23.069680 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 00:20:23 crc kubenswrapper[4903]: I1203 00:20:23.070275 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" Dec 03 00:20:23 crc kubenswrapper[4903]: I1203 00:20:23.070322 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" Dec 03 00:20:23 crc kubenswrapper[4903]: I1203 00:20:23.071193 4903 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b25c5d8e6359eb4e3f2b0dcbefac7abf53f1327660a93cde6eb36f3c6d18c635"} pod="openshift-machine-config-operator/machine-config-daemon-snl4q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 00:20:23 crc kubenswrapper[4903]: I1203 00:20:23.071260 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" containerID="cri-o://b25c5d8e6359eb4e3f2b0dcbefac7abf53f1327660a93cde6eb36f3c6d18c635" gracePeriod=600 Dec 03 00:20:23 crc kubenswrapper[4903]: E1203 00:20:23.447202 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:20:23 crc kubenswrapper[4903]: I1203 00:20:23.514496 4903 generic.go:334] "Generic (PLEG): container finished" podID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerID="b25c5d8e6359eb4e3f2b0dcbefac7abf53f1327660a93cde6eb36f3c6d18c635" exitCode=0 Dec 03 00:20:23 crc kubenswrapper[4903]: I1203 00:20:23.514524 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerDied","Data":"b25c5d8e6359eb4e3f2b0dcbefac7abf53f1327660a93cde6eb36f3c6d18c635"} Dec 03 00:20:23 crc kubenswrapper[4903]: I1203 00:20:23.514594 4903 scope.go:117] "RemoveContainer" containerID="3465c99a15cf5800e2351b26d635260db68b3b6687c4024f094da729f8cd3bac" Dec 03 00:20:23 crc kubenswrapper[4903]: I1203 00:20:23.515801 4903 scope.go:117] "RemoveContainer" containerID="b25c5d8e6359eb4e3f2b0dcbefac7abf53f1327660a93cde6eb36f3c6d18c635" Dec 03 00:20:23 crc kubenswrapper[4903]: E1203 00:20:23.516467 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:20:34 crc kubenswrapper[4903]: I1203 00:20:34.612820 4903 scope.go:117] "RemoveContainer" containerID="b25c5d8e6359eb4e3f2b0dcbefac7abf53f1327660a93cde6eb36f3c6d18c635" Dec 03 00:20:34 crc kubenswrapper[4903]: E1203 00:20:34.614752 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:20:47 crc kubenswrapper[4903]: I1203 00:20:47.613252 4903 scope.go:117] "RemoveContainer" containerID="b25c5d8e6359eb4e3f2b0dcbefac7abf53f1327660a93cde6eb36f3c6d18c635" Dec 03 00:20:47 crc kubenswrapper[4903]: E1203 00:20:47.614227 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:20:58 crc kubenswrapper[4903]: I1203 00:20:58.612433 4903 scope.go:117] "RemoveContainer" containerID="b25c5d8e6359eb4e3f2b0dcbefac7abf53f1327660a93cde6eb36f3c6d18c635" Dec 03 00:20:58 crc kubenswrapper[4903]: E1203 00:20:58.613274 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:21:09 crc kubenswrapper[4903]: I1203 00:21:09.613400 4903 scope.go:117] "RemoveContainer" containerID="b25c5d8e6359eb4e3f2b0dcbefac7abf53f1327660a93cde6eb36f3c6d18c635" Dec 03 00:21:09 crc kubenswrapper[4903]: E1203 00:21:09.614212 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:21:15 crc kubenswrapper[4903]: E1203 00:21:15.402170 4903 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.39:35780->38.102.83.39:43931: write tcp 38.102.83.39:35780->38.102.83.39:43931: write: broken pipe Dec 03 00:21:23 crc kubenswrapper[4903]: I1203 00:21:23.612749 4903 scope.go:117] "RemoveContainer" containerID="b25c5d8e6359eb4e3f2b0dcbefac7abf53f1327660a93cde6eb36f3c6d18c635" Dec 03 00:21:23 crc kubenswrapper[4903]: E1203 00:21:23.613958 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:21:38 crc kubenswrapper[4903]: I1203 00:21:38.614308 4903 scope.go:117] "RemoveContainer" containerID="b25c5d8e6359eb4e3f2b0dcbefac7abf53f1327660a93cde6eb36f3c6d18c635" Dec 03 00:21:38 crc kubenswrapper[4903]: E1203 00:21:38.615357 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:21:52 crc kubenswrapper[4903]: I1203 00:21:52.613169 4903 scope.go:117] "RemoveContainer" containerID="b25c5d8e6359eb4e3f2b0dcbefac7abf53f1327660a93cde6eb36f3c6d18c635" Dec 03 00:21:52 crc kubenswrapper[4903]: E1203 00:21:52.613877 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:22:03 crc kubenswrapper[4903]: I1203 00:22:03.612442 4903 scope.go:117] "RemoveContainer" containerID="b25c5d8e6359eb4e3f2b0dcbefac7abf53f1327660a93cde6eb36f3c6d18c635" Dec 03 00:22:03 crc kubenswrapper[4903]: E1203 00:22:03.613205 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:22:18 crc kubenswrapper[4903]: I1203 00:22:18.612679 4903 scope.go:117] "RemoveContainer" containerID="b25c5d8e6359eb4e3f2b0dcbefac7abf53f1327660a93cde6eb36f3c6d18c635" Dec 03 00:22:18 crc kubenswrapper[4903]: E1203 00:22:18.613399 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:22:32 crc kubenswrapper[4903]: I1203 00:22:32.612215 4903 scope.go:117] "RemoveContainer" containerID="b25c5d8e6359eb4e3f2b0dcbefac7abf53f1327660a93cde6eb36f3c6d18c635" Dec 03 00:22:32 crc kubenswrapper[4903]: E1203 00:22:32.612740 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:22:40 crc kubenswrapper[4903]: I1203 00:22:40.769618 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="6eaac3fd-8033-42cd-90c3-5dfac716ae66" containerName="galera" probeResult="failure" output="command timed out" Dec 03 00:22:40 crc kubenswrapper[4903]: I1203 00:22:40.774330 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="6eaac3fd-8033-42cd-90c3-5dfac716ae66" containerName="galera" probeResult="failure" output="command timed out" Dec 03 00:22:47 crc 
kubenswrapper[4903]: I1203 00:22:47.613140 4903 scope.go:117] "RemoveContainer" containerID="b25c5d8e6359eb4e3f2b0dcbefac7abf53f1327660a93cde6eb36f3c6d18c635" Dec 03 00:22:47 crc kubenswrapper[4903]: E1203 00:22:47.613920 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:23:02 crc kubenswrapper[4903]: I1203 00:23:02.612260 4903 scope.go:117] "RemoveContainer" containerID="b25c5d8e6359eb4e3f2b0dcbefac7abf53f1327660a93cde6eb36f3c6d18c635" Dec 03 00:23:02 crc kubenswrapper[4903]: E1203 00:23:02.613050 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:23:13 crc kubenswrapper[4903]: I1203 00:23:13.612835 4903 scope.go:117] "RemoveContainer" containerID="b25c5d8e6359eb4e3f2b0dcbefac7abf53f1327660a93cde6eb36f3c6d18c635" Dec 03 00:23:13 crc kubenswrapper[4903]: E1203 00:23:13.613671 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:23:24 crc kubenswrapper[4903]: I1203 00:23:24.611962 4903 scope.go:117] "RemoveContainer" containerID="b25c5d8e6359eb4e3f2b0dcbefac7abf53f1327660a93cde6eb36f3c6d18c635" Dec 03 00:23:24 crc kubenswrapper[4903]: E1203 00:23:24.612727 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:23:36 crc kubenswrapper[4903]: I1203 00:23:36.613804 4903 scope.go:117] "RemoveContainer" containerID="b25c5d8e6359eb4e3f2b0dcbefac7abf53f1327660a93cde6eb36f3c6d18c635" Dec 03 00:23:36 crc kubenswrapper[4903]: E1203 00:23:36.614513 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:23:48 crc kubenswrapper[4903]: I1203 00:23:48.612947 4903 scope.go:117] "RemoveContainer" containerID="b25c5d8e6359eb4e3f2b0dcbefac7abf53f1327660a93cde6eb36f3c6d18c635" Dec 03 00:23:48 crc 
kubenswrapper[4903]: E1203 00:23:48.614309 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:24:03 crc kubenswrapper[4903]: I1203 00:24:03.613422 4903 scope.go:117] "RemoveContainer" containerID="b25c5d8e6359eb4e3f2b0dcbefac7abf53f1327660a93cde6eb36f3c6d18c635" Dec 03 00:24:03 crc kubenswrapper[4903]: E1203 00:24:03.614447 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:24:17 crc kubenswrapper[4903]: I1203 00:24:17.613529 4903 scope.go:117] "RemoveContainer" containerID="b25c5d8e6359eb4e3f2b0dcbefac7abf53f1327660a93cde6eb36f3c6d18c635" Dec 03 00:24:17 crc kubenswrapper[4903]: E1203 00:24:17.615073 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:24:29 crc kubenswrapper[4903]: I1203 00:24:29.613436 4903 scope.go:117] "RemoveContainer" containerID="b25c5d8e6359eb4e3f2b0dcbefac7abf53f1327660a93cde6eb36f3c6d18c635" Dec 03 00:24:29 crc kubenswrapper[4903]: E1203 00:24:29.614793 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:24:41 crc kubenswrapper[4903]: I1203 00:24:41.635374 4903 scope.go:117] "RemoveContainer" containerID="b25c5d8e6359eb4e3f2b0dcbefac7abf53f1327660a93cde6eb36f3c6d18c635" Dec 03 00:24:41 crc kubenswrapper[4903]: E1203 00:24:41.636740 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:24:56 crc kubenswrapper[4903]: I1203 00:24:56.612810 4903 scope.go:117] "RemoveContainer" containerID="b25c5d8e6359eb4e3f2b0dcbefac7abf53f1327660a93cde6eb36f3c6d18c635" Dec 03 00:24:56 crc kubenswrapper[4903]: E1203 00:24:56.613629 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:25:10 crc kubenswrapper[4903]: I1203 00:25:10.612686 4903 scope.go:117] "RemoveContainer" containerID="b25c5d8e6359eb4e3f2b0dcbefac7abf53f1327660a93cde6eb36f3c6d18c635" Dec 03 00:25:10 crc kubenswrapper[4903]: E1203 00:25:10.613402 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:25:24 crc kubenswrapper[4903]: I1203 00:25:24.613028 4903 scope.go:117] "RemoveContainer" containerID="b25c5d8e6359eb4e3f2b0dcbefac7abf53f1327660a93cde6eb36f3c6d18c635" Dec 03 00:25:25 crc kubenswrapper[4903]: I1203 00:25:25.799915 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerStarted","Data":"77d928de1c8f3b8e9d9e9ec7d1938486764dc793f3dd69fd2d6bd21ef010f43f"} Dec 03 00:25:33 crc kubenswrapper[4903]: E1203 00:25:33.064643 4903 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.452s" Dec 03 00:26:21 crc kubenswrapper[4903]: I1203 00:26:21.502826 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mswhz"] Dec 03 00:26:21 crc kubenswrapper[4903]: E1203 00:26:21.503958 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14" containerName="registry-server" Dec 03 00:26:21 crc kubenswrapper[4903]: I1203 00:26:21.503973 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14" containerName="registry-server" Dec 03 00:26:21 crc kubenswrapper[4903]: E1203 00:26:21.503988 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14" containerName="extract-content" Dec 03 00:26:21 crc kubenswrapper[4903]: I1203 00:26:21.503993 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14" containerName="extract-content" Dec 03 00:26:21 crc kubenswrapper[4903]: E1203 00:26:21.504019 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b38f7a5-ca3c-4b8e-8799-4e898b1bda74" containerName="extract-utilities" Dec 03 00:26:21 crc kubenswrapper[4903]: I1203 00:26:21.504025 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b38f7a5-ca3c-4b8e-8799-4e898b1bda74" containerName="extract-utilities" Dec 03 00:26:21 crc kubenswrapper[4903]: E1203 00:26:21.504054 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b38f7a5-ca3c-4b8e-8799-4e898b1bda74" containerName="extract-content" Dec 03 00:26:21 crc kubenswrapper[4903]: I1203 00:26:21.504060 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b38f7a5-ca3c-4b8e-8799-4e898b1bda74" containerName="extract-content" Dec 03 00:26:21 crc kubenswrapper[4903]: E1203 00:26:21.504071 4903 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2b38f7a5-ca3c-4b8e-8799-4e898b1bda74" containerName="registry-server" Dec 03 00:26:21 crc kubenswrapper[4903]: I1203 00:26:21.504077 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b38f7a5-ca3c-4b8e-8799-4e898b1bda74" containerName="registry-server" Dec 03 00:26:21 crc kubenswrapper[4903]: E1203 00:26:21.504089 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14" containerName="extract-utilities" Dec 03 00:26:21 crc kubenswrapper[4903]: I1203 00:26:21.504094 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14" containerName="extract-utilities" Dec 03 00:26:21 crc kubenswrapper[4903]: I1203 00:26:21.504276 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="64b24a8f-a0b1-4ff5-9bf9-6e5ab436db14" containerName="registry-server" Dec 03 00:26:21 crc kubenswrapper[4903]: I1203 00:26:21.504287 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b38f7a5-ca3c-4b8e-8799-4e898b1bda74" containerName="registry-server" Dec 03 00:26:21 crc kubenswrapper[4903]: I1203 00:26:21.506323 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mswhz" Dec 03 00:26:21 crc kubenswrapper[4903]: I1203 00:26:21.531467 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mswhz"] Dec 03 00:26:21 crc kubenswrapper[4903]: I1203 00:26:21.655938 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73054101-e00e-42ed-aedf-d73d3c84a798-utilities\") pod \"redhat-operators-mswhz\" (UID: \"73054101-e00e-42ed-aedf-d73d3c84a798\") " pod="openshift-marketplace/redhat-operators-mswhz" Dec 03 00:26:21 crc kubenswrapper[4903]: I1203 00:26:21.656254 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73054101-e00e-42ed-aedf-d73d3c84a798-catalog-content\") pod \"redhat-operators-mswhz\" (UID: \"73054101-e00e-42ed-aedf-d73d3c84a798\") " pod="openshift-marketplace/redhat-operators-mswhz" Dec 03 00:26:21 crc kubenswrapper[4903]: I1203 00:26:21.656320 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz6h7\" (UniqueName: \"kubernetes.io/projected/73054101-e00e-42ed-aedf-d73d3c84a798-kube-api-access-mz6h7\") pod \"redhat-operators-mswhz\" (UID: \"73054101-e00e-42ed-aedf-d73d3c84a798\") " pod="openshift-marketplace/redhat-operators-mswhz" Dec 03 00:26:21 crc kubenswrapper[4903]: I1203 00:26:21.757716 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73054101-e00e-42ed-aedf-d73d3c84a798-catalog-content\") pod \"redhat-operators-mswhz\" (UID: \"73054101-e00e-42ed-aedf-d73d3c84a798\") " pod="openshift-marketplace/redhat-operators-mswhz" Dec 03 00:26:21 crc kubenswrapper[4903]: I1203 00:26:21.757831 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz6h7\" (UniqueName: \"kubernetes.io/projected/73054101-e00e-42ed-aedf-d73d3c84a798-kube-api-access-mz6h7\") pod \"redhat-operators-mswhz\" (UID: \"73054101-e00e-42ed-aedf-d73d3c84a798\") " pod="openshift-marketplace/redhat-operators-mswhz" Dec 03 00:26:21 crc kubenswrapper[4903]: I1203 00:26:21.757942 4903 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73054101-e00e-42ed-aedf-d73d3c84a798-utilities\") pod \"redhat-operators-mswhz\" (UID: \"73054101-e00e-42ed-aedf-d73d3c84a798\") " pod="openshift-marketplace/redhat-operators-mswhz" Dec 03 00:26:21 crc kubenswrapper[4903]: I1203 00:26:21.758231 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73054101-e00e-42ed-aedf-d73d3c84a798-catalog-content\") pod \"redhat-operators-mswhz\" (UID: \"73054101-e00e-42ed-aedf-d73d3c84a798\") " pod="openshift-marketplace/redhat-operators-mswhz" Dec 03 00:26:21 crc kubenswrapper[4903]: I1203 00:26:21.758495 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73054101-e00e-42ed-aedf-d73d3c84a798-utilities\") pod \"redhat-operators-mswhz\" (UID: \"73054101-e00e-42ed-aedf-d73d3c84a798\") " pod="openshift-marketplace/redhat-operators-mswhz" Dec 03 00:26:22 crc kubenswrapper[4903]: I1203 00:26:22.169268 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz6h7\" (UniqueName: \"kubernetes.io/projected/73054101-e00e-42ed-aedf-d73d3c84a798-kube-api-access-mz6h7\") pod \"redhat-operators-mswhz\" (UID: \"73054101-e00e-42ed-aedf-d73d3c84a798\") " pod="openshift-marketplace/redhat-operators-mswhz" Dec 03 00:26:22 crc kubenswrapper[4903]: I1203 00:26:22.186441 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mswhz" Dec 03 00:26:22 crc kubenswrapper[4903]: I1203 00:26:22.765156 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mswhz"] Dec 03 00:26:23 crc kubenswrapper[4903]: I1203 00:26:23.607389 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mswhz" event={"ID":"73054101-e00e-42ed-aedf-d73d3c84a798","Type":"ContainerStarted","Data":"2f561295d3dbea8a5fa686fedfb935ec6c50642271796e3521202c105b281de3"} Dec 03 00:26:23 crc kubenswrapper[4903]: I1203 00:26:23.607816 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mswhz" event={"ID":"73054101-e00e-42ed-aedf-d73d3c84a798","Type":"ContainerStarted","Data":"f231d7d42b225d6f6abe7a8ac8b42de97d93dbdb2426dbe8a050aa8b7995f181"} Dec 03 00:26:24 crc kubenswrapper[4903]: I1203 00:26:24.618600 4903 generic.go:334] "Generic (PLEG): container finished" podID="73054101-e00e-42ed-aedf-d73d3c84a798" containerID="2f561295d3dbea8a5fa686fedfb935ec6c50642271796e3521202c105b281de3" exitCode=0 Dec 03 00:26:24 crc kubenswrapper[4903]: I1203 00:26:24.618876 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mswhz" event={"ID":"73054101-e00e-42ed-aedf-d73d3c84a798","Type":"ContainerDied","Data":"2f561295d3dbea8a5fa686fedfb935ec6c50642271796e3521202c105b281de3"} Dec 03 00:26:24 crc kubenswrapper[4903]: I1203 00:26:24.621584 4903 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 00:26:26 crc kubenswrapper[4903]: I1203 00:26:26.641462 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mswhz" event={"ID":"73054101-e00e-42ed-aedf-d73d3c84a798","Type":"ContainerStarted","Data":"475a01e30ebee9bca5b794fed2cba9e95f8c2c014b69b3b1e78b4dd5e5708355"} Dec 03 00:26:31 crc kubenswrapper[4903]: I1203 
00:26:31.704012 4903 generic.go:334] "Generic (PLEG): container finished" podID="73054101-e00e-42ed-aedf-d73d3c84a798" containerID="475a01e30ebee9bca5b794fed2cba9e95f8c2c014b69b3b1e78b4dd5e5708355" exitCode=0 Dec 03 00:26:31 crc kubenswrapper[4903]: I1203 00:26:31.704081 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mswhz" event={"ID":"73054101-e00e-42ed-aedf-d73d3c84a798","Type":"ContainerDied","Data":"475a01e30ebee9bca5b794fed2cba9e95f8c2c014b69b3b1e78b4dd5e5708355"} Dec 03 00:26:33 crc kubenswrapper[4903]: I1203 00:26:33.725252 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mswhz" event={"ID":"73054101-e00e-42ed-aedf-d73d3c84a798","Type":"ContainerStarted","Data":"6d40fab2f9b7be5849d68ae97807eed56ba8316329dcfe5e56037e4725b00537"} Dec 03 00:26:33 crc kubenswrapper[4903]: I1203 00:26:33.758721 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mswhz" podStartSLOduration=4.859716284 podStartE2EDuration="12.758693304s" podCreationTimestamp="2025-12-03 00:26:21 +0000 UTC" firstStartedPulling="2025-12-03 00:26:24.621391376 +0000 UTC m=+5323.329945659" lastFinishedPulling="2025-12-03 00:26:32.520368396 +0000 UTC m=+5331.228922679" observedRunningTime="2025-12-03 00:26:33.743476668 +0000 UTC m=+5332.452030991" watchObservedRunningTime="2025-12-03 00:26:33.758693304 +0000 UTC m=+5332.467247627" Dec 03 00:26:42 crc kubenswrapper[4903]: I1203 00:26:42.187424 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mswhz" Dec 03 00:26:42 crc kubenswrapper[4903]: I1203 00:26:42.187979 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mswhz" Dec 03 00:26:42 crc kubenswrapper[4903]: I1203 00:26:42.856253 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mswhz" Dec 03 00:26:42 crc kubenswrapper[4903]: I1203 00:26:42.934047 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mswhz" Dec 03 00:26:43 crc kubenswrapper[4903]: I1203 00:26:43.101205 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mswhz"] Dec 03 00:26:44 crc kubenswrapper[4903]: I1203 00:26:44.846368 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mswhz" podUID="73054101-e00e-42ed-aedf-d73d3c84a798" containerName="registry-server" containerID="cri-o://6d40fab2f9b7be5849d68ae97807eed56ba8316329dcfe5e56037e4725b00537" gracePeriod=2 Dec 03 00:26:45 crc kubenswrapper[4903]: I1203 00:26:45.509138 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2dwl6"] Dec 03 00:26:45 crc kubenswrapper[4903]: I1203 00:26:45.512079 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2dwl6" Dec 03 00:26:45 crc kubenswrapper[4903]: I1203 00:26:45.547565 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2dwl6"] Dec 03 00:26:45 crc kubenswrapper[4903]: I1203 00:26:45.596671 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10df3867-126a-4f7d-b7a3-47f54645f638-catalog-content\") pod \"redhat-marketplace-2dwl6\" (UID: \"10df3867-126a-4f7d-b7a3-47f54645f638\") " pod="openshift-marketplace/redhat-marketplace-2dwl6" Dec 03 00:26:45 crc kubenswrapper[4903]: I1203 00:26:45.596760 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10df3867-126a-4f7d-b7a3-47f54645f638-utilities\") pod \"redhat-marketplace-2dwl6\" (UID: \"10df3867-126a-4f7d-b7a3-47f54645f638\") " pod="openshift-marketplace/redhat-marketplace-2dwl6" Dec 03 00:26:45 crc kubenswrapper[4903]: I1203 00:26:45.596830 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vgb7\" (UniqueName: \"kubernetes.io/projected/10df3867-126a-4f7d-b7a3-47f54645f638-kube-api-access-6vgb7\") pod \"redhat-marketplace-2dwl6\" (UID: \"10df3867-126a-4f7d-b7a3-47f54645f638\") " pod="openshift-marketplace/redhat-marketplace-2dwl6" Dec 03 00:26:45 crc kubenswrapper[4903]: I1203 00:26:45.702863 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10df3867-126a-4f7d-b7a3-47f54645f638-catalog-content\") pod \"redhat-marketplace-2dwl6\" (UID: \"10df3867-126a-4f7d-b7a3-47f54645f638\") " pod="openshift-marketplace/redhat-marketplace-2dwl6" Dec 03 00:26:45 crc kubenswrapper[4903]: I1203 00:26:45.702948 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10df3867-126a-4f7d-b7a3-47f54645f638-utilities\") pod \"redhat-marketplace-2dwl6\" (UID: \"10df3867-126a-4f7d-b7a3-47f54645f638\") " pod="openshift-marketplace/redhat-marketplace-2dwl6" Dec 03 00:26:45 crc kubenswrapper[4903]: I1203 00:26:45.703015 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vgb7\" (UniqueName: \"kubernetes.io/projected/10df3867-126a-4f7d-b7a3-47f54645f638-kube-api-access-6vgb7\") pod \"redhat-marketplace-2dwl6\" (UID: \"10df3867-126a-4f7d-b7a3-47f54645f638\") " pod="openshift-marketplace/redhat-marketplace-2dwl6" Dec 03 00:26:45 crc kubenswrapper[4903]: I1203 00:26:45.704946 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10df3867-126a-4f7d-b7a3-47f54645f638-utilities\") pod \"redhat-marketplace-2dwl6\" (UID: \"10df3867-126a-4f7d-b7a3-47f54645f638\") " pod="openshift-marketplace/redhat-marketplace-2dwl6" Dec 03 00:26:45 crc kubenswrapper[4903]: I1203 00:26:45.705219 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10df3867-126a-4f7d-b7a3-47f54645f638-catalog-content\") pod \"redhat-marketplace-2dwl6\" (UID: \"10df3867-126a-4f7d-b7a3-47f54645f638\") " pod="openshift-marketplace/redhat-marketplace-2dwl6" Dec 03 00:26:45 crc kubenswrapper[4903]: I1203 00:26:45.732824 4903 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-6vgb7\" (UniqueName: \"kubernetes.io/projected/10df3867-126a-4f7d-b7a3-47f54645f638-kube-api-access-6vgb7\") pod \"redhat-marketplace-2dwl6\" (UID: \"10df3867-126a-4f7d-b7a3-47f54645f638\") " pod="openshift-marketplace/redhat-marketplace-2dwl6" Dec 03 00:26:45 crc kubenswrapper[4903]: I1203 00:26:45.862797 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2dwl6" Dec 03 00:26:45 crc kubenswrapper[4903]: I1203 00:26:45.882543 4903 generic.go:334] "Generic (PLEG): container finished" podID="73054101-e00e-42ed-aedf-d73d3c84a798" containerID="6d40fab2f9b7be5849d68ae97807eed56ba8316329dcfe5e56037e4725b00537" exitCode=0 Dec 03 00:26:45 crc kubenswrapper[4903]: I1203 00:26:45.882596 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mswhz" event={"ID":"73054101-e00e-42ed-aedf-d73d3c84a798","Type":"ContainerDied","Data":"6d40fab2f9b7be5849d68ae97807eed56ba8316329dcfe5e56037e4725b00537"} Dec 03 00:26:45 crc kubenswrapper[4903]: I1203 00:26:45.882627 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mswhz" event={"ID":"73054101-e00e-42ed-aedf-d73d3c84a798","Type":"ContainerDied","Data":"f231d7d42b225d6f6abe7a8ac8b42de97d93dbdb2426dbe8a050aa8b7995f181"} Dec 03 00:26:45 crc kubenswrapper[4903]: I1203 00:26:45.882641 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f231d7d42b225d6f6abe7a8ac8b42de97d93dbdb2426dbe8a050aa8b7995f181" Dec 03 00:26:46 crc kubenswrapper[4903]: I1203 00:26:46.000249 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mswhz" Dec 03 00:26:46 crc kubenswrapper[4903]: I1203 00:26:46.115404 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73054101-e00e-42ed-aedf-d73d3c84a798-catalog-content\") pod \"73054101-e00e-42ed-aedf-d73d3c84a798\" (UID: \"73054101-e00e-42ed-aedf-d73d3c84a798\") " Dec 03 00:26:46 crc kubenswrapper[4903]: I1203 00:26:46.115528 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73054101-e00e-42ed-aedf-d73d3c84a798-utilities\") pod \"73054101-e00e-42ed-aedf-d73d3c84a798\" (UID: \"73054101-e00e-42ed-aedf-d73d3c84a798\") " Dec 03 00:26:46 crc kubenswrapper[4903]: I1203 00:26:46.115571 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mz6h7\" (UniqueName: \"kubernetes.io/projected/73054101-e00e-42ed-aedf-d73d3c84a798-kube-api-access-mz6h7\") pod \"73054101-e00e-42ed-aedf-d73d3c84a798\" (UID: \"73054101-e00e-42ed-aedf-d73d3c84a798\") " Dec 03 00:26:46 crc kubenswrapper[4903]: I1203 00:26:46.117314 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73054101-e00e-42ed-aedf-d73d3c84a798-utilities" (OuterVolumeSpecName: "utilities") pod "73054101-e00e-42ed-aedf-d73d3c84a798" (UID: "73054101-e00e-42ed-aedf-d73d3c84a798"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:26:46 crc kubenswrapper[4903]: I1203 00:26:46.147816 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73054101-e00e-42ed-aedf-d73d3c84a798-kube-api-access-mz6h7" (OuterVolumeSpecName: "kube-api-access-mz6h7") pod "73054101-e00e-42ed-aedf-d73d3c84a798" (UID: "73054101-e00e-42ed-aedf-d73d3c84a798"). InnerVolumeSpecName "kube-api-access-mz6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:26:46 crc kubenswrapper[4903]: I1203 00:26:46.217842 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73054101-e00e-42ed-aedf-d73d3c84a798-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 00:26:46 crc kubenswrapper[4903]: I1203 00:26:46.217873 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mz6h7\" (UniqueName: \"kubernetes.io/projected/73054101-e00e-42ed-aedf-d73d3c84a798-kube-api-access-mz6h7\") on node \"crc\" DevicePath \"\"" Dec 03 00:26:46 crc kubenswrapper[4903]: I1203 00:26:46.246277 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2dwl6"] Dec 03 00:26:46 crc kubenswrapper[4903]: I1203 00:26:46.377855 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73054101-e00e-42ed-aedf-d73d3c84a798-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "73054101-e00e-42ed-aedf-d73d3c84a798" (UID: "73054101-e00e-42ed-aedf-d73d3c84a798"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:26:46 crc kubenswrapper[4903]: I1203 00:26:46.436834 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73054101-e00e-42ed-aedf-d73d3c84a798-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 00:26:46 crc kubenswrapper[4903]: I1203 00:26:46.898377 4903 generic.go:334] "Generic (PLEG): container finished" podID="10df3867-126a-4f7d-b7a3-47f54645f638" containerID="3d608a6ec16dfad35b2a247631b23efcb27d502be64d13c92fdf61a90df79917" exitCode=0 Dec 03 00:26:46 crc kubenswrapper[4903]: I1203 00:26:46.898508 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2dwl6" event={"ID":"10df3867-126a-4f7d-b7a3-47f54645f638","Type":"ContainerDied","Data":"3d608a6ec16dfad35b2a247631b23efcb27d502be64d13c92fdf61a90df79917"} Dec 03 00:26:46 crc kubenswrapper[4903]: I1203 00:26:46.898574 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2dwl6" event={"ID":"10df3867-126a-4f7d-b7a3-47f54645f638","Type":"ContainerStarted","Data":"2e7c65a876707fe60fbe1ce4bd09d1f88f6965d67f1cb3fd3028c3b9d3f6c439"} Dec 03 00:26:46 crc kubenswrapper[4903]: I1203 00:26:46.898530 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mswhz" Dec 03 00:26:46 crc kubenswrapper[4903]: I1203 00:26:46.952450 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mswhz"] Dec 03 00:26:46 crc kubenswrapper[4903]: I1203 00:26:46.960318 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mswhz"] Dec 03 00:26:47 crc kubenswrapper[4903]: I1203 00:26:47.625552 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73054101-e00e-42ed-aedf-d73d3c84a798" path="/var/lib/kubelet/pods/73054101-e00e-42ed-aedf-d73d3c84a798/volumes" Dec 03 00:26:49 crc kubenswrapper[4903]: I1203 00:26:49.931633 4903 generic.go:334] "Generic (PLEG): container finished" podID="10df3867-126a-4f7d-b7a3-47f54645f638" containerID="0a2f0ab3132a869484ccc8c9c264d9bcd2185e3074a71b98ccb4e7e05fa4a10d" exitCode=0 Dec 03 00:26:49 crc kubenswrapper[4903]: I1203 00:26:49.931685 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2dwl6" event={"ID":"10df3867-126a-4f7d-b7a3-47f54645f638","Type":"ContainerDied","Data":"0a2f0ab3132a869484ccc8c9c264d9bcd2185e3074a71b98ccb4e7e05fa4a10d"} Dec 03 00:26:50 crc kubenswrapper[4903]: I1203 00:26:50.947884 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2dwl6" event={"ID":"10df3867-126a-4f7d-b7a3-47f54645f638","Type":"ContainerStarted","Data":"ee71cbe2fb25cbfe782cf04c78ab17de8887d0d585bc50c14b638bff19b39ffd"} Dec 03 00:26:50 crc kubenswrapper[4903]: I1203 00:26:50.978228 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2dwl6" podStartSLOduration=2.53102741 podStartE2EDuration="5.978207405s" podCreationTimestamp="2025-12-03 00:26:45 +0000 UTC" firstStartedPulling="2025-12-03 00:26:46.900211778 +0000 UTC m=+5345.608766071" lastFinishedPulling="2025-12-03 00:26:50.347391793 +0000 UTC m=+5349.055946066" observedRunningTime="2025-12-03 00:26:50.970203051 +0000 UTC m=+5349.678757354" watchObservedRunningTime="2025-12-03 00:26:50.978207405 +0000 UTC m=+5349.686761688" Dec 03 00:26:55 crc kubenswrapper[4903]: I1203 00:26:55.863463 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2dwl6" Dec 03 00:26:55 crc kubenswrapper[4903]: I1203 00:26:55.864994 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2dwl6" Dec 03 00:26:56 crc kubenswrapper[4903]: I1203 00:26:56.123816 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2dwl6" Dec 03 00:26:57 crc kubenswrapper[4903]: I1203 00:26:57.113265 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2dwl6" Dec 03 00:26:57 crc kubenswrapper[4903]: I1203 00:26:57.174049 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2dwl6"] Dec 03 00:26:59 crc kubenswrapper[4903]: I1203 00:26:59.043866 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2dwl6" podUID="10df3867-126a-4f7d-b7a3-47f54645f638" containerName="registry-server" containerID="cri-o://ee71cbe2fb25cbfe782cf04c78ab17de8887d0d585bc50c14b638bff19b39ffd" gracePeriod=2 Dec 03 00:26:59 crc kubenswrapper[4903]: I1203 
00:26:59.537583 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2dwl6" Dec 03 00:26:59 crc kubenswrapper[4903]: I1203 00:26:59.628690 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10df3867-126a-4f7d-b7a3-47f54645f638-catalog-content\") pod \"10df3867-126a-4f7d-b7a3-47f54645f638\" (UID: \"10df3867-126a-4f7d-b7a3-47f54645f638\") " Dec 03 00:26:59 crc kubenswrapper[4903]: I1203 00:26:59.628879 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10df3867-126a-4f7d-b7a3-47f54645f638-utilities\") pod \"10df3867-126a-4f7d-b7a3-47f54645f638\" (UID: \"10df3867-126a-4f7d-b7a3-47f54645f638\") " Dec 03 00:26:59 crc kubenswrapper[4903]: I1203 00:26:59.629116 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vgb7\" (UniqueName: \"kubernetes.io/projected/10df3867-126a-4f7d-b7a3-47f54645f638-kube-api-access-6vgb7\") pod \"10df3867-126a-4f7d-b7a3-47f54645f638\" (UID: \"10df3867-126a-4f7d-b7a3-47f54645f638\") " Dec 03 00:26:59 crc kubenswrapper[4903]: I1203 00:26:59.630393 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10df3867-126a-4f7d-b7a3-47f54645f638-utilities" (OuterVolumeSpecName: "utilities") pod "10df3867-126a-4f7d-b7a3-47f54645f638" (UID: "10df3867-126a-4f7d-b7a3-47f54645f638"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:26:59 crc kubenswrapper[4903]: I1203 00:26:59.650972 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10df3867-126a-4f7d-b7a3-47f54645f638-kube-api-access-6vgb7" (OuterVolumeSpecName: "kube-api-access-6vgb7") pod "10df3867-126a-4f7d-b7a3-47f54645f638" (UID: "10df3867-126a-4f7d-b7a3-47f54645f638"). InnerVolumeSpecName "kube-api-access-6vgb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:26:59 crc kubenswrapper[4903]: I1203 00:26:59.667191 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10df3867-126a-4f7d-b7a3-47f54645f638-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "10df3867-126a-4f7d-b7a3-47f54645f638" (UID: "10df3867-126a-4f7d-b7a3-47f54645f638"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:26:59 crc kubenswrapper[4903]: I1203 00:26:59.731343 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10df3867-126a-4f7d-b7a3-47f54645f638-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 00:26:59 crc kubenswrapper[4903]: I1203 00:26:59.731370 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vgb7\" (UniqueName: \"kubernetes.io/projected/10df3867-126a-4f7d-b7a3-47f54645f638-kube-api-access-6vgb7\") on node \"crc\" DevicePath \"\"" Dec 03 00:26:59 crc kubenswrapper[4903]: I1203 00:26:59.731379 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10df3867-126a-4f7d-b7a3-47f54645f638-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 00:27:00 crc kubenswrapper[4903]: I1203 00:27:00.056338 4903 generic.go:334] "Generic (PLEG): container finished" podID="10df3867-126a-4f7d-b7a3-47f54645f638" containerID="ee71cbe2fb25cbfe782cf04c78ab17de8887d0d585bc50c14b638bff19b39ffd" exitCode=0 Dec 03 00:27:00 crc kubenswrapper[4903]: I1203 00:27:00.056397 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2dwl6" event={"ID":"10df3867-126a-4f7d-b7a3-47f54645f638","Type":"ContainerDied","Data":"ee71cbe2fb25cbfe782cf04c78ab17de8887d0d585bc50c14b638bff19b39ffd"} Dec 03 00:27:00 crc kubenswrapper[4903]: I1203 00:27:00.056695 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2dwl6" event={"ID":"10df3867-126a-4f7d-b7a3-47f54645f638","Type":"ContainerDied","Data":"2e7c65a876707fe60fbe1ce4bd09d1f88f6965d67f1cb3fd3028c3b9d3f6c439"} Dec 03 00:27:00 crc kubenswrapper[4903]: I1203 00:27:00.056722 4903 scope.go:117] "RemoveContainer" containerID="ee71cbe2fb25cbfe782cf04c78ab17de8887d0d585bc50c14b638bff19b39ffd" Dec 03 00:27:00 crc kubenswrapper[4903]: I1203 00:27:00.056422 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2dwl6" Dec 03 00:27:00 crc kubenswrapper[4903]: I1203 00:27:00.095363 4903 scope.go:117] "RemoveContainer" containerID="0a2f0ab3132a869484ccc8c9c264d9bcd2185e3074a71b98ccb4e7e05fa4a10d" Dec 03 00:27:00 crc kubenswrapper[4903]: I1203 00:27:00.097357 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2dwl6"] Dec 03 00:27:00 crc kubenswrapper[4903]: I1203 00:27:00.111787 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2dwl6"] Dec 03 00:27:00 crc kubenswrapper[4903]: I1203 00:27:00.149762 4903 scope.go:117] "RemoveContainer" containerID="3d608a6ec16dfad35b2a247631b23efcb27d502be64d13c92fdf61a90df79917" Dec 03 00:27:00 crc kubenswrapper[4903]: I1203 00:27:00.175597 4903 scope.go:117] "RemoveContainer" containerID="ee71cbe2fb25cbfe782cf04c78ab17de8887d0d585bc50c14b638bff19b39ffd" Dec 03 00:27:00 crc kubenswrapper[4903]: E1203 00:27:00.176372 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee71cbe2fb25cbfe782cf04c78ab17de8887d0d585bc50c14b638bff19b39ffd\": container with ID starting with ee71cbe2fb25cbfe782cf04c78ab17de8887d0d585bc50c14b638bff19b39ffd not found: ID does not exist" containerID="ee71cbe2fb25cbfe782cf04c78ab17de8887d0d585bc50c14b638bff19b39ffd" Dec 03 00:27:00 crc kubenswrapper[4903]: I1203 00:27:00.176406 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee71cbe2fb25cbfe782cf04c78ab17de8887d0d585bc50c14b638bff19b39ffd"} err="failed to get container status \"ee71cbe2fb25cbfe782cf04c78ab17de8887d0d585bc50c14b638bff19b39ffd\": rpc error: code = NotFound desc = could not find container \"ee71cbe2fb25cbfe782cf04c78ab17de8887d0d585bc50c14b638bff19b39ffd\": container with ID starting with ee71cbe2fb25cbfe782cf04c78ab17de8887d0d585bc50c14b638bff19b39ffd not found: ID does not exist" Dec 03 00:27:00 crc kubenswrapper[4903]: I1203 00:27:00.176427 4903 scope.go:117] "RemoveContainer" containerID="0a2f0ab3132a869484ccc8c9c264d9bcd2185e3074a71b98ccb4e7e05fa4a10d" Dec 03 00:27:00 crc kubenswrapper[4903]: E1203 00:27:00.176742 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a2f0ab3132a869484ccc8c9c264d9bcd2185e3074a71b98ccb4e7e05fa4a10d\": container with ID starting with 0a2f0ab3132a869484ccc8c9c264d9bcd2185e3074a71b98ccb4e7e05fa4a10d not found: ID does not exist" containerID="0a2f0ab3132a869484ccc8c9c264d9bcd2185e3074a71b98ccb4e7e05fa4a10d" Dec 03 00:27:00 crc kubenswrapper[4903]: I1203 00:27:00.176761 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a2f0ab3132a869484ccc8c9c264d9bcd2185e3074a71b98ccb4e7e05fa4a10d"} err="failed to get container status \"0a2f0ab3132a869484ccc8c9c264d9bcd2185e3074a71b98ccb4e7e05fa4a10d\": rpc error: code = NotFound desc = could not find container \"0a2f0ab3132a869484ccc8c9c264d9bcd2185e3074a71b98ccb4e7e05fa4a10d\": container with ID starting with 0a2f0ab3132a869484ccc8c9c264d9bcd2185e3074a71b98ccb4e7e05fa4a10d not found: ID does not exist" Dec 03 00:27:00 crc kubenswrapper[4903]: I1203 00:27:00.176773 4903 scope.go:117] "RemoveContainer" containerID="3d608a6ec16dfad35b2a247631b23efcb27d502be64d13c92fdf61a90df79917" Dec 03 00:27:00 crc kubenswrapper[4903]: E1203 00:27:00.177042 4903 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"3d608a6ec16dfad35b2a247631b23efcb27d502be64d13c92fdf61a90df79917\": container with ID starting with 3d608a6ec16dfad35b2a247631b23efcb27d502be64d13c92fdf61a90df79917 not found: ID does not exist" containerID="3d608a6ec16dfad35b2a247631b23efcb27d502be64d13c92fdf61a90df79917" Dec 03 00:27:00 crc kubenswrapper[4903]: I1203 00:27:00.177093 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d608a6ec16dfad35b2a247631b23efcb27d502be64d13c92fdf61a90df79917"} err="failed to get container status \"3d608a6ec16dfad35b2a247631b23efcb27d502be64d13c92fdf61a90df79917\": rpc error: code = NotFound desc = could not find container \"3d608a6ec16dfad35b2a247631b23efcb27d502be64d13c92fdf61a90df79917\": container with ID starting with 3d608a6ec16dfad35b2a247631b23efcb27d502be64d13c92fdf61a90df79917 not found: ID does not exist" Dec 03 00:27:01 crc kubenswrapper[4903]: I1203 00:27:01.637547 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10df3867-126a-4f7d-b7a3-47f54645f638" path="/var/lib/kubelet/pods/10df3867-126a-4f7d-b7a3-47f54645f638/volumes" Dec 03 00:27:53 crc kubenswrapper[4903]: I1203 00:27:53.069845 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 00:27:53 crc kubenswrapper[4903]: I1203 00:27:53.070600 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 00:28:23 crc kubenswrapper[4903]: I1203 00:28:23.069749 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 00:28:23 crc kubenswrapper[4903]: I1203 00:28:23.070377 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 00:28:53 crc kubenswrapper[4903]: I1203 00:28:53.069271 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 00:28:53 crc kubenswrapper[4903]: I1203 00:28:53.069775 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 00:28:53 crc kubenswrapper[4903]: I1203 00:28:53.069820 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" Dec 03 00:28:53 crc kubenswrapper[4903]: I1203 00:28:53.073894 4903 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"77d928de1c8f3b8e9d9e9ec7d1938486764dc793f3dd69fd2d6bd21ef010f43f"} pod="openshift-machine-config-operator/machine-config-daemon-snl4q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 00:28:53 crc kubenswrapper[4903]: I1203 00:28:53.074491 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" containerID="cri-o://77d928de1c8f3b8e9d9e9ec7d1938486764dc793f3dd69fd2d6bd21ef010f43f" gracePeriod=600 Dec 03 00:28:54 crc kubenswrapper[4903]: I1203 00:28:54.353268 4903 generic.go:334] "Generic (PLEG): container finished" podID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerID="77d928de1c8f3b8e9d9e9ec7d1938486764dc793f3dd69fd2d6bd21ef010f43f" exitCode=0 Dec 03 00:28:54 crc kubenswrapper[4903]: I1203 00:28:54.353313 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerDied","Data":"77d928de1c8f3b8e9d9e9ec7d1938486764dc793f3dd69fd2d6bd21ef010f43f"} Dec 03 00:28:54 crc kubenswrapper[4903]: I1203 00:28:54.353668 4903 scope.go:117] "RemoveContainer" containerID="b25c5d8e6359eb4e3f2b0dcbefac7abf53f1327660a93cde6eb36f3c6d18c635" Dec 03 00:28:56 crc kubenswrapper[4903]: I1203 00:28:56.387556 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerStarted","Data":"6509c99647fa7a0fbc550beba939e42d4c77a4297077ce0b695625e09f2960ff"} Dec 03 00:29:53 crc kubenswrapper[4903]: I1203 00:29:53.037248 4903 generic.go:334] "Generic (PLEG): container finished" podID="0a6ff673-e552-4ffc-94a5-5b780fa219c0" containerID="cf50554a8f4382b15dbbb7a5f5dfc6d8a0d19bdf7f160546bebe11395ec28781" exitCode=0 Dec 03 00:29:53 crc kubenswrapper[4903]: I1203 00:29:53.037330 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"0a6ff673-e552-4ffc-94a5-5b780fa219c0","Type":"ContainerDied","Data":"cf50554a8f4382b15dbbb7a5f5dfc6d8a0d19bdf7f160546bebe11395ec28781"} Dec 03 00:29:54 crc kubenswrapper[4903]: I1203 00:29:54.450964 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 03 00:29:54 crc kubenswrapper[4903]: I1203 00:29:54.646640 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0a6ff673-e552-4ffc-94a5-5b780fa219c0-openstack-config\") pod \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\" (UID: \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\") " Dec 03 00:29:54 crc kubenswrapper[4903]: I1203 00:29:54.646754 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knf45\" (UniqueName: \"kubernetes.io/projected/0a6ff673-e552-4ffc-94a5-5b780fa219c0-kube-api-access-knf45\") pod \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\" (UID: \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\") " Dec 03 00:29:54 crc kubenswrapper[4903]: I1203 00:29:54.646811 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/0a6ff673-e552-4ffc-94a5-5b780fa219c0-ca-certs\") pod \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\" (UID: \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\") " Dec 03 00:29:54 crc kubenswrapper[4903]: I1203 00:29:54.646890 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0a6ff673-e552-4ffc-94a5-5b780fa219c0-openstack-config-secret\") pod \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\" (UID: \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\") " Dec 03 00:29:54 crc kubenswrapper[4903]: I1203 00:29:54.646917 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0a6ff673-e552-4ffc-94a5-5b780fa219c0-test-operator-ephemeral-workdir\") pod \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\" (UID: \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\") " Dec 03 00:29:54 crc kubenswrapper[4903]: I1203 00:29:54.646963 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0a6ff673-e552-4ffc-94a5-5b780fa219c0-config-data\") pod \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\" (UID: \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\") " Dec 03 00:29:54 crc kubenswrapper[4903]: I1203 00:29:54.647032 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\" (UID: \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\") " Dec 03 00:29:54 crc kubenswrapper[4903]: I1203 00:29:54.647056 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/0a6ff673-e552-4ffc-94a5-5b780fa219c0-test-operator-ephemeral-temporary\") pod \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\" (UID: \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\") " Dec 03 00:29:54 crc kubenswrapper[4903]: I1203 00:29:54.647084 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a6ff673-e552-4ffc-94a5-5b780fa219c0-ssh-key\") pod \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\" (UID: \"0a6ff673-e552-4ffc-94a5-5b780fa219c0\") " Dec 03 00:29:54 crc kubenswrapper[4903]: I1203 00:29:54.647727 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a6ff673-e552-4ffc-94a5-5b780fa219c0-config-data" (OuterVolumeSpecName: "config-data") pod 
"0a6ff673-e552-4ffc-94a5-5b780fa219c0" (UID: "0a6ff673-e552-4ffc-94a5-5b780fa219c0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 00:29:54 crc kubenswrapper[4903]: I1203 00:29:54.647727 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a6ff673-e552-4ffc-94a5-5b780fa219c0-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "0a6ff673-e552-4ffc-94a5-5b780fa219c0" (UID: "0a6ff673-e552-4ffc-94a5-5b780fa219c0"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:29:54 crc kubenswrapper[4903]: I1203 00:29:54.652429 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "test-operator-logs") pod "0a6ff673-e552-4ffc-94a5-5b780fa219c0" (UID: "0a6ff673-e552-4ffc-94a5-5b780fa219c0"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 00:29:54 crc kubenswrapper[4903]: I1203 00:29:54.653063 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a6ff673-e552-4ffc-94a5-5b780fa219c0-kube-api-access-knf45" (OuterVolumeSpecName: "kube-api-access-knf45") pod "0a6ff673-e552-4ffc-94a5-5b780fa219c0" (UID: "0a6ff673-e552-4ffc-94a5-5b780fa219c0"). InnerVolumeSpecName "kube-api-access-knf45". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:29:54 crc kubenswrapper[4903]: I1203 00:29:54.672552 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a6ff673-e552-4ffc-94a5-5b780fa219c0-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "0a6ff673-e552-4ffc-94a5-5b780fa219c0" (UID: "0a6ff673-e552-4ffc-94a5-5b780fa219c0"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:29:54 crc kubenswrapper[4903]: I1203 00:29:54.687866 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a6ff673-e552-4ffc-94a5-5b780fa219c0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0a6ff673-e552-4ffc-94a5-5b780fa219c0" (UID: "0a6ff673-e552-4ffc-94a5-5b780fa219c0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:29:54 crc kubenswrapper[4903]: I1203 00:29:54.688155 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a6ff673-e552-4ffc-94a5-5b780fa219c0-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "0a6ff673-e552-4ffc-94a5-5b780fa219c0" (UID: "0a6ff673-e552-4ffc-94a5-5b780fa219c0"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:29:54 crc kubenswrapper[4903]: I1203 00:29:54.701991 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a6ff673-e552-4ffc-94a5-5b780fa219c0-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "0a6ff673-e552-4ffc-94a5-5b780fa219c0" (UID: "0a6ff673-e552-4ffc-94a5-5b780fa219c0"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:29:54 crc kubenswrapper[4903]: I1203 00:29:54.706357 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a6ff673-e552-4ffc-94a5-5b780fa219c0-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "0a6ff673-e552-4ffc-94a5-5b780fa219c0" (UID: "0a6ff673-e552-4ffc-94a5-5b780fa219c0"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 00:29:54 crc kubenswrapper[4903]: I1203 00:29:54.750071 4903 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0a6ff673-e552-4ffc-94a5-5b780fa219c0-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 03 00:29:54 crc kubenswrapper[4903]: I1203 00:29:54.750110 4903 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0a6ff673-e552-4ffc-94a5-5b780fa219c0-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 03 00:29:54 crc kubenswrapper[4903]: I1203 00:29:54.750125 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0a6ff673-e552-4ffc-94a5-5b780fa219c0-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 00:29:54 crc kubenswrapper[4903]: I1203 00:29:54.750151 4903 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 03 00:29:54 crc kubenswrapper[4903]: I1203 00:29:54.750161 4903 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/0a6ff673-e552-4ffc-94a5-5b780fa219c0-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 03 00:29:54 crc kubenswrapper[4903]: I1203 00:29:54.750171 4903 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a6ff673-e552-4ffc-94a5-5b780fa219c0-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 00:29:54 crc kubenswrapper[4903]: I1203 00:29:54.750183 4903 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0a6ff673-e552-4ffc-94a5-5b780fa219c0-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 03 00:29:54 crc kubenswrapper[4903]: I1203 00:29:54.750204 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knf45\" (UniqueName: \"kubernetes.io/projected/0a6ff673-e552-4ffc-94a5-5b780fa219c0-kube-api-access-knf45\") on node \"crc\" DevicePath \"\"" Dec 03 00:29:54 crc kubenswrapper[4903]: I1203 00:29:54.750216 4903 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/0a6ff673-e552-4ffc-94a5-5b780fa219c0-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 03 00:29:54 crc kubenswrapper[4903]: I1203 00:29:54.775689 4903 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 03 00:29:54 crc kubenswrapper[4903]: I1203 00:29:54.853081 4903 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 03 00:29:55 crc kubenswrapper[4903]: I1203 00:29:55.057228 4903 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/tempest-tests-tempest" event={"ID":"0a6ff673-e552-4ffc-94a5-5b780fa219c0","Type":"ContainerDied","Data":"7bf52e045ced2e8037d1c0450b6df1545e851abb844570403a5c8f8968b363cc"} Dec 03 00:29:55 crc kubenswrapper[4903]: I1203 00:29:55.057509 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bf52e045ced2e8037d1c0450b6df1545e851abb844570403a5c8f8968b363cc" Dec 03 00:29:55 crc kubenswrapper[4903]: I1203 00:29:55.057586 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 03 00:30:00 crc kubenswrapper[4903]: I1203 00:30:00.154982 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412030-z8g7b"] Dec 03 00:30:00 crc kubenswrapper[4903]: E1203 00:30:00.156039 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10df3867-126a-4f7d-b7a3-47f54645f638" containerName="registry-server" Dec 03 00:30:00 crc kubenswrapper[4903]: I1203 00:30:00.156055 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="10df3867-126a-4f7d-b7a3-47f54645f638" containerName="registry-server" Dec 03 00:30:00 crc kubenswrapper[4903]: E1203 00:30:00.156077 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10df3867-126a-4f7d-b7a3-47f54645f638" containerName="extract-utilities" Dec 03 00:30:00 crc kubenswrapper[4903]: I1203 00:30:00.156085 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="10df3867-126a-4f7d-b7a3-47f54645f638" containerName="extract-utilities" Dec 03 00:30:00 crc kubenswrapper[4903]: E1203 00:30:00.156100 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a6ff673-e552-4ffc-94a5-5b780fa219c0" containerName="tempest-tests-tempest-tests-runner" Dec 03 00:30:00 crc kubenswrapper[4903]: I1203 00:30:00.156109 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a6ff673-e552-4ffc-94a5-5b780fa219c0" containerName="tempest-tests-tempest-tests-runner" Dec 03 00:30:00 crc kubenswrapper[4903]: E1203 00:30:00.156132 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73054101-e00e-42ed-aedf-d73d3c84a798" containerName="extract-utilities" Dec 03 00:30:00 crc kubenswrapper[4903]: I1203 00:30:00.156139 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="73054101-e00e-42ed-aedf-d73d3c84a798" containerName="extract-utilities" Dec 03 00:30:00 crc kubenswrapper[4903]: E1203 00:30:00.156149 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10df3867-126a-4f7d-b7a3-47f54645f638" containerName="extract-content" Dec 03 00:30:00 crc kubenswrapper[4903]: I1203 00:30:00.156156 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="10df3867-126a-4f7d-b7a3-47f54645f638" containerName="extract-content" Dec 03 00:30:00 crc kubenswrapper[4903]: E1203 00:30:00.156192 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73054101-e00e-42ed-aedf-d73d3c84a798" containerName="registry-server" Dec 03 00:30:00 crc kubenswrapper[4903]: I1203 00:30:00.156200 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="73054101-e00e-42ed-aedf-d73d3c84a798" containerName="registry-server" Dec 03 00:30:00 crc kubenswrapper[4903]: E1203 00:30:00.156212 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73054101-e00e-42ed-aedf-d73d3c84a798" containerName="extract-content" Dec 03 00:30:00 crc kubenswrapper[4903]: I1203 00:30:00.156219 4903 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="73054101-e00e-42ed-aedf-d73d3c84a798" containerName="extract-content" Dec 03 00:30:00 crc kubenswrapper[4903]: I1203 00:30:00.156464 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a6ff673-e552-4ffc-94a5-5b780fa219c0" containerName="tempest-tests-tempest-tests-runner" Dec 03 00:30:00 crc kubenswrapper[4903]: I1203 00:30:00.156500 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="10df3867-126a-4f7d-b7a3-47f54645f638" containerName="registry-server" Dec 03 00:30:00 crc kubenswrapper[4903]: I1203 00:30:00.156518 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="73054101-e00e-42ed-aedf-d73d3c84a798" containerName="registry-server" Dec 03 00:30:00 crc kubenswrapper[4903]: I1203 00:30:00.157418 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412030-z8g7b" Dec 03 00:30:00 crc kubenswrapper[4903]: I1203 00:30:00.162390 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 00:30:00 crc kubenswrapper[4903]: I1203 00:30:00.162707 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 00:30:00 crc kubenswrapper[4903]: I1203 00:30:00.166406 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412030-z8g7b"] Dec 03 00:30:00 crc kubenswrapper[4903]: I1203 00:30:00.266557 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13808157-6cdf-43ba-a42c-b0de09f911dc-config-volume\") pod \"collect-profiles-29412030-z8g7b\" (UID: \"13808157-6cdf-43ba-a42c-b0de09f911dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412030-z8g7b" Dec 03 00:30:00 crc kubenswrapper[4903]: I1203 00:30:00.266809 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/13808157-6cdf-43ba-a42c-b0de09f911dc-secret-volume\") pod \"collect-profiles-29412030-z8g7b\" (UID: \"13808157-6cdf-43ba-a42c-b0de09f911dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412030-z8g7b" Dec 03 00:30:00 crc kubenswrapper[4903]: I1203 00:30:00.266872 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xglx7\" (UniqueName: \"kubernetes.io/projected/13808157-6cdf-43ba-a42c-b0de09f911dc-kube-api-access-xglx7\") pod \"collect-profiles-29412030-z8g7b\" (UID: \"13808157-6cdf-43ba-a42c-b0de09f911dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412030-z8g7b" Dec 03 00:30:00 crc kubenswrapper[4903]: I1203 00:30:00.368642 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/13808157-6cdf-43ba-a42c-b0de09f911dc-secret-volume\") pod \"collect-profiles-29412030-z8g7b\" (UID: \"13808157-6cdf-43ba-a42c-b0de09f911dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412030-z8g7b" Dec 03 00:30:00 crc kubenswrapper[4903]: I1203 00:30:00.368746 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xglx7\" (UniqueName: \"kubernetes.io/projected/13808157-6cdf-43ba-a42c-b0de09f911dc-kube-api-access-xglx7\") pod 
\"collect-profiles-29412030-z8g7b\" (UID: \"13808157-6cdf-43ba-a42c-b0de09f911dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412030-z8g7b" Dec 03 00:30:00 crc kubenswrapper[4903]: I1203 00:30:00.368902 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13808157-6cdf-43ba-a42c-b0de09f911dc-config-volume\") pod \"collect-profiles-29412030-z8g7b\" (UID: \"13808157-6cdf-43ba-a42c-b0de09f911dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412030-z8g7b" Dec 03 00:30:00 crc kubenswrapper[4903]: I1203 00:30:00.369895 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13808157-6cdf-43ba-a42c-b0de09f911dc-config-volume\") pod \"collect-profiles-29412030-z8g7b\" (UID: \"13808157-6cdf-43ba-a42c-b0de09f911dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412030-z8g7b" Dec 03 00:30:00 crc kubenswrapper[4903]: I1203 00:30:00.375422 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/13808157-6cdf-43ba-a42c-b0de09f911dc-secret-volume\") pod \"collect-profiles-29412030-z8g7b\" (UID: \"13808157-6cdf-43ba-a42c-b0de09f911dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412030-z8g7b" Dec 03 00:30:00 crc kubenswrapper[4903]: I1203 00:30:00.389490 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xglx7\" (UniqueName: \"kubernetes.io/projected/13808157-6cdf-43ba-a42c-b0de09f911dc-kube-api-access-xglx7\") pod \"collect-profiles-29412030-z8g7b\" (UID: \"13808157-6cdf-43ba-a42c-b0de09f911dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412030-z8g7b" Dec 03 00:30:00 crc kubenswrapper[4903]: I1203 00:30:00.479881 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412030-z8g7b" Dec 03 00:30:00 crc kubenswrapper[4903]: I1203 00:30:00.962890 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412030-z8g7b"] Dec 03 00:30:00 crc kubenswrapper[4903]: W1203 00:30:00.975326 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13808157_6cdf_43ba_a42c_b0de09f911dc.slice/crio-dca9494b3c9d7a0c027fad8c282ee8ef95c2864d6e356f62916ef947799701a2 WatchSource:0}: Error finding container dca9494b3c9d7a0c027fad8c282ee8ef95c2864d6e356f62916ef947799701a2: Status 404 returned error can't find the container with id dca9494b3c9d7a0c027fad8c282ee8ef95c2864d6e356f62916ef947799701a2 Dec 03 00:30:01 crc kubenswrapper[4903]: I1203 00:30:01.128193 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412030-z8g7b" event={"ID":"13808157-6cdf-43ba-a42c-b0de09f911dc","Type":"ContainerStarted","Data":"dca9494b3c9d7a0c027fad8c282ee8ef95c2864d6e356f62916ef947799701a2"} Dec 03 00:30:02 crc kubenswrapper[4903]: I1203 00:30:02.142120 4903 generic.go:334] "Generic (PLEG): container finished" podID="13808157-6cdf-43ba-a42c-b0de09f911dc" containerID="44fce642e67664658fdf7fb3ba67e3293889b94d081dd3416bbb1a1f012245f8" exitCode=0 Dec 03 00:30:02 crc kubenswrapper[4903]: I1203 00:30:02.142181 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412030-z8g7b" event={"ID":"13808157-6cdf-43ba-a42c-b0de09f911dc","Type":"ContainerDied","Data":"44fce642e67664658fdf7fb3ba67e3293889b94d081dd3416bbb1a1f012245f8"} Dec 03 00:30:03 crc kubenswrapper[4903]: I1203 00:30:03.566675 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412030-z8g7b" Dec 03 00:30:03 crc kubenswrapper[4903]: I1203 00:30:03.737963 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13808157-6cdf-43ba-a42c-b0de09f911dc-config-volume\") pod \"13808157-6cdf-43ba-a42c-b0de09f911dc\" (UID: \"13808157-6cdf-43ba-a42c-b0de09f911dc\") " Dec 03 00:30:03 crc kubenswrapper[4903]: I1203 00:30:03.738288 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xglx7\" (UniqueName: \"kubernetes.io/projected/13808157-6cdf-43ba-a42c-b0de09f911dc-kube-api-access-xglx7\") pod \"13808157-6cdf-43ba-a42c-b0de09f911dc\" (UID: \"13808157-6cdf-43ba-a42c-b0de09f911dc\") " Dec 03 00:30:03 crc kubenswrapper[4903]: I1203 00:30:03.738516 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/13808157-6cdf-43ba-a42c-b0de09f911dc-secret-volume\") pod \"13808157-6cdf-43ba-a42c-b0de09f911dc\" (UID: \"13808157-6cdf-43ba-a42c-b0de09f911dc\") " Dec 03 00:30:03 crc kubenswrapper[4903]: I1203 00:30:03.738553 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13808157-6cdf-43ba-a42c-b0de09f911dc-config-volume" (OuterVolumeSpecName: "config-volume") pod "13808157-6cdf-43ba-a42c-b0de09f911dc" (UID: "13808157-6cdf-43ba-a42c-b0de09f911dc"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 00:30:03 crc kubenswrapper[4903]: I1203 00:30:03.742200 4903 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13808157-6cdf-43ba-a42c-b0de09f911dc-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 00:30:03 crc kubenswrapper[4903]: I1203 00:30:03.758950 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13808157-6cdf-43ba-a42c-b0de09f911dc-kube-api-access-xglx7" (OuterVolumeSpecName: "kube-api-access-xglx7") pod "13808157-6cdf-43ba-a42c-b0de09f911dc" (UID: "13808157-6cdf-43ba-a42c-b0de09f911dc"). InnerVolumeSpecName "kube-api-access-xglx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:30:03 crc kubenswrapper[4903]: I1203 00:30:03.759492 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13808157-6cdf-43ba-a42c-b0de09f911dc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "13808157-6cdf-43ba-a42c-b0de09f911dc" (UID: "13808157-6cdf-43ba-a42c-b0de09f911dc"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:30:03 crc kubenswrapper[4903]: I1203 00:30:03.846920 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xglx7\" (UniqueName: \"kubernetes.io/projected/13808157-6cdf-43ba-a42c-b0de09f911dc-kube-api-access-xglx7\") on node \"crc\" DevicePath \"\"" Dec 03 00:30:03 crc kubenswrapper[4903]: I1203 00:30:03.846977 4903 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/13808157-6cdf-43ba-a42c-b0de09f911dc-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 00:30:04 crc kubenswrapper[4903]: I1203 00:30:04.163844 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412030-z8g7b" event={"ID":"13808157-6cdf-43ba-a42c-b0de09f911dc","Type":"ContainerDied","Data":"dca9494b3c9d7a0c027fad8c282ee8ef95c2864d6e356f62916ef947799701a2"} Dec 03 00:30:04 crc kubenswrapper[4903]: I1203 00:30:04.163885 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dca9494b3c9d7a0c027fad8c282ee8ef95c2864d6e356f62916ef947799701a2" Dec 03 00:30:04 crc kubenswrapper[4903]: I1203 00:30:04.163936 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412030-z8g7b" Dec 03 00:30:04 crc kubenswrapper[4903]: I1203 00:30:04.658034 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411985-xfnl4"] Dec 03 00:30:04 crc kubenswrapper[4903]: I1203 00:30:04.666951 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411985-xfnl4"] Dec 03 00:30:05 crc kubenswrapper[4903]: I1203 00:30:05.380401 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 03 00:30:05 crc kubenswrapper[4903]: E1203 00:30:05.381266 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13808157-6cdf-43ba-a42c-b0de09f911dc" containerName="collect-profiles" Dec 03 00:30:05 crc kubenswrapper[4903]: I1203 00:30:05.381312 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="13808157-6cdf-43ba-a42c-b0de09f911dc" containerName="collect-profiles" Dec 03 00:30:05 crc kubenswrapper[4903]: I1203 00:30:05.381889 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="13808157-6cdf-43ba-a42c-b0de09f911dc" containerName="collect-profiles" Dec 03 00:30:05 crc kubenswrapper[4903]: I1203 00:30:05.383129 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 00:30:05 crc kubenswrapper[4903]: I1203 00:30:05.387555 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-h4xff" Dec 03 00:30:05 crc kubenswrapper[4903]: I1203 00:30:05.401152 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 03 00:30:05 crc kubenswrapper[4903]: I1203 00:30:05.478429 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"950ed1bd-b32c-4e34-b973-3fdb5b2c0383\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 00:30:05 crc kubenswrapper[4903]: I1203 00:30:05.478518 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf475\" (UniqueName: \"kubernetes.io/projected/950ed1bd-b32c-4e34-b973-3fdb5b2c0383-kube-api-access-vf475\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"950ed1bd-b32c-4e34-b973-3fdb5b2c0383\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 00:30:05 crc kubenswrapper[4903]: I1203 00:30:05.580076 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf475\" (UniqueName: \"kubernetes.io/projected/950ed1bd-b32c-4e34-b973-3fdb5b2c0383-kube-api-access-vf475\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"950ed1bd-b32c-4e34-b973-3fdb5b2c0383\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 00:30:05 crc kubenswrapper[4903]: I1203 00:30:05.580309 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"950ed1bd-b32c-4e34-b973-3fdb5b2c0383\") " 
pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 00:30:05 crc kubenswrapper[4903]: I1203 00:30:05.580953 4903 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"950ed1bd-b32c-4e34-b973-3fdb5b2c0383\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 00:30:05 crc kubenswrapper[4903]: I1203 00:30:05.601561 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf475\" (UniqueName: \"kubernetes.io/projected/950ed1bd-b32c-4e34-b973-3fdb5b2c0383-kube-api-access-vf475\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"950ed1bd-b32c-4e34-b973-3fdb5b2c0383\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 00:30:05 crc kubenswrapper[4903]: I1203 00:30:05.634751 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="761dc476-edb0-4778-a1a5-6e81140737bc" path="/var/lib/kubelet/pods/761dc476-edb0-4778-a1a5-6e81140737bc/volumes" Dec 03 00:30:05 crc kubenswrapper[4903]: I1203 00:30:05.635568 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"950ed1bd-b32c-4e34-b973-3fdb5b2c0383\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 00:30:05 crc kubenswrapper[4903]: I1203 00:30:05.706917 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 00:30:06 crc kubenswrapper[4903]: I1203 00:30:06.162177 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 03 00:30:06 crc kubenswrapper[4903]: I1203 00:30:06.246697 4903 scope.go:117] "RemoveContainer" containerID="98b24543b82d47c74fd99124a4f9d62944172ee58335174566657f2357454b48" Dec 03 00:30:07 crc kubenswrapper[4903]: I1203 00:30:07.199606 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"950ed1bd-b32c-4e34-b973-3fdb5b2c0383","Type":"ContainerStarted","Data":"67cc09f707d895da1bb70433fd2222c7ca608e1b2f5726779dd6b2bcd1694cd1"} Dec 03 00:30:09 crc kubenswrapper[4903]: I1203 00:30:09.226961 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"950ed1bd-b32c-4e34-b973-3fdb5b2c0383","Type":"ContainerStarted","Data":"72affa05a328cdbf136f9e32bd87a1701d484cc1b2a7ae95ad6816000363c23f"} Dec 03 00:30:09 crc kubenswrapper[4903]: I1203 00:30:09.247838 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.795407688 podStartE2EDuration="4.247822343s" podCreationTimestamp="2025-12-03 00:30:05 +0000 UTC" firstStartedPulling="2025-12-03 00:30:06.6897433 +0000 UTC m=+5545.398297593" lastFinishedPulling="2025-12-03 00:30:08.142157945 +0000 UTC m=+5546.850712248" observedRunningTime="2025-12-03 00:30:09.246151233 +0000 UTC m=+5547.954705516" watchObservedRunningTime="2025-12-03 00:30:09.247822343 +0000 UTC m=+5547.956376616" Dec 03 00:30:09 crc 
kubenswrapper[4903]: I1203 00:30:09.864034 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xcjkq"] Dec 03 00:30:09 crc kubenswrapper[4903]: I1203 00:30:09.867018 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xcjkq" Dec 03 00:30:09 crc kubenswrapper[4903]: I1203 00:30:09.893751 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xcjkq"] Dec 03 00:30:09 crc kubenswrapper[4903]: I1203 00:30:09.971737 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c914d3c4-0cae-4601-a083-8e4b1f84f3be-catalog-content\") pod \"certified-operators-xcjkq\" (UID: \"c914d3c4-0cae-4601-a083-8e4b1f84f3be\") " pod="openshift-marketplace/certified-operators-xcjkq" Dec 03 00:30:09 crc kubenswrapper[4903]: I1203 00:30:09.972076 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c914d3c4-0cae-4601-a083-8e4b1f84f3be-utilities\") pod \"certified-operators-xcjkq\" (UID: \"c914d3c4-0cae-4601-a083-8e4b1f84f3be\") " pod="openshift-marketplace/certified-operators-xcjkq" Dec 03 00:30:09 crc kubenswrapper[4903]: I1203 00:30:09.972293 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn7cp\" (UniqueName: \"kubernetes.io/projected/c914d3c4-0cae-4601-a083-8e4b1f84f3be-kube-api-access-bn7cp\") pod \"certified-operators-xcjkq\" (UID: \"c914d3c4-0cae-4601-a083-8e4b1f84f3be\") " pod="openshift-marketplace/certified-operators-xcjkq" Dec 03 00:30:10 crc kubenswrapper[4903]: I1203 00:30:10.056973 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pfh5b"] Dec 03 00:30:10 crc kubenswrapper[4903]: I1203 00:30:10.059073 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pfh5b" Dec 03 00:30:10 crc kubenswrapper[4903]: I1203 00:30:10.070204 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pfh5b"] Dec 03 00:30:10 crc kubenswrapper[4903]: I1203 00:30:10.077160 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c914d3c4-0cae-4601-a083-8e4b1f84f3be-utilities\") pod \"certified-operators-xcjkq\" (UID: \"c914d3c4-0cae-4601-a083-8e4b1f84f3be\") " pod="openshift-marketplace/certified-operators-xcjkq" Dec 03 00:30:10 crc kubenswrapper[4903]: I1203 00:30:10.077239 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn7cp\" (UniqueName: \"kubernetes.io/projected/c914d3c4-0cae-4601-a083-8e4b1f84f3be-kube-api-access-bn7cp\") pod \"certified-operators-xcjkq\" (UID: \"c914d3c4-0cae-4601-a083-8e4b1f84f3be\") " pod="openshift-marketplace/certified-operators-xcjkq" Dec 03 00:30:10 crc kubenswrapper[4903]: I1203 00:30:10.077284 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c914d3c4-0cae-4601-a083-8e4b1f84f3be-catalog-content\") pod \"certified-operators-xcjkq\" (UID: \"c914d3c4-0cae-4601-a083-8e4b1f84f3be\") " pod="openshift-marketplace/certified-operators-xcjkq" Dec 03 00:30:10 crc kubenswrapper[4903]: I1203 00:30:10.077710 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c914d3c4-0cae-4601-a083-8e4b1f84f3be-catalog-content\") pod \"certified-operators-xcjkq\" (UID: \"c914d3c4-0cae-4601-a083-8e4b1f84f3be\") " pod="openshift-marketplace/certified-operators-xcjkq" Dec 03 00:30:10 crc kubenswrapper[4903]: I1203 00:30:10.077971 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c914d3c4-0cae-4601-a083-8e4b1f84f3be-utilities\") pod \"certified-operators-xcjkq\" (UID: \"c914d3c4-0cae-4601-a083-8e4b1f84f3be\") " pod="openshift-marketplace/certified-operators-xcjkq" Dec 03 00:30:10 crc kubenswrapper[4903]: I1203 00:30:10.123505 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn7cp\" (UniqueName: \"kubernetes.io/projected/c914d3c4-0cae-4601-a083-8e4b1f84f3be-kube-api-access-bn7cp\") pod \"certified-operators-xcjkq\" (UID: \"c914d3c4-0cae-4601-a083-8e4b1f84f3be\") " pod="openshift-marketplace/certified-operators-xcjkq" Dec 03 00:30:10 crc kubenswrapper[4903]: I1203 00:30:10.180239 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k52bw\" (UniqueName: \"kubernetes.io/projected/b2f68090-fda2-4b2a-9145-f05651fe3b81-kube-api-access-k52bw\") pod \"community-operators-pfh5b\" (UID: \"b2f68090-fda2-4b2a-9145-f05651fe3b81\") " pod="openshift-marketplace/community-operators-pfh5b" Dec 03 00:30:10 crc kubenswrapper[4903]: I1203 00:30:10.180397 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2f68090-fda2-4b2a-9145-f05651fe3b81-utilities\") pod \"community-operators-pfh5b\" (UID: \"b2f68090-fda2-4b2a-9145-f05651fe3b81\") " pod="openshift-marketplace/community-operators-pfh5b" Dec 03 00:30:10 crc kubenswrapper[4903]: I1203 00:30:10.180453 4903 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2f68090-fda2-4b2a-9145-f05651fe3b81-catalog-content\") pod \"community-operators-pfh5b\" (UID: \"b2f68090-fda2-4b2a-9145-f05651fe3b81\") " pod="openshift-marketplace/community-operators-pfh5b" Dec 03 00:30:10 crc kubenswrapper[4903]: I1203 00:30:10.227138 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xcjkq" Dec 03 00:30:10 crc kubenswrapper[4903]: I1203 00:30:10.294497 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k52bw\" (UniqueName: \"kubernetes.io/projected/b2f68090-fda2-4b2a-9145-f05651fe3b81-kube-api-access-k52bw\") pod \"community-operators-pfh5b\" (UID: \"b2f68090-fda2-4b2a-9145-f05651fe3b81\") " pod="openshift-marketplace/community-operators-pfh5b" Dec 03 00:30:10 crc kubenswrapper[4903]: I1203 00:30:10.294637 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2f68090-fda2-4b2a-9145-f05651fe3b81-utilities\") pod \"community-operators-pfh5b\" (UID: \"b2f68090-fda2-4b2a-9145-f05651fe3b81\") " pod="openshift-marketplace/community-operators-pfh5b" Dec 03 00:30:10 crc kubenswrapper[4903]: I1203 00:30:10.294725 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2f68090-fda2-4b2a-9145-f05651fe3b81-catalog-content\") pod \"community-operators-pfh5b\" (UID: \"b2f68090-fda2-4b2a-9145-f05651fe3b81\") " pod="openshift-marketplace/community-operators-pfh5b" Dec 03 00:30:10 crc kubenswrapper[4903]: I1203 00:30:10.295230 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2f68090-fda2-4b2a-9145-f05651fe3b81-catalog-content\") pod \"community-operators-pfh5b\" (UID: \"b2f68090-fda2-4b2a-9145-f05651fe3b81\") " pod="openshift-marketplace/community-operators-pfh5b" Dec 03 00:30:10 crc kubenswrapper[4903]: I1203 00:30:10.295830 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2f68090-fda2-4b2a-9145-f05651fe3b81-utilities\") pod \"community-operators-pfh5b\" (UID: \"b2f68090-fda2-4b2a-9145-f05651fe3b81\") " pod="openshift-marketplace/community-operators-pfh5b" Dec 03 00:30:10 crc kubenswrapper[4903]: I1203 00:30:10.313872 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k52bw\" (UniqueName: \"kubernetes.io/projected/b2f68090-fda2-4b2a-9145-f05651fe3b81-kube-api-access-k52bw\") pod \"community-operators-pfh5b\" (UID: \"b2f68090-fda2-4b2a-9145-f05651fe3b81\") " pod="openshift-marketplace/community-operators-pfh5b" Dec 03 00:30:10 crc kubenswrapper[4903]: I1203 00:30:10.396785 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pfh5b" Dec 03 00:30:10 crc kubenswrapper[4903]: I1203 00:30:10.811947 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xcjkq"] Dec 03 00:30:11 crc kubenswrapper[4903]: I1203 00:30:11.018000 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pfh5b"] Dec 03 00:30:11 crc kubenswrapper[4903]: W1203 00:30:11.025393 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2f68090_fda2_4b2a_9145_f05651fe3b81.slice/crio-a3c8ec8aeab528af303d898bcd71c4e1ac310adbfcdde2939a09e8518c990c7c WatchSource:0}: Error finding container a3c8ec8aeab528af303d898bcd71c4e1ac310adbfcdde2939a09e8518c990c7c: Status 404 returned error can't find the container with id a3c8ec8aeab528af303d898bcd71c4e1ac310adbfcdde2939a09e8518c990c7c Dec 03 00:30:11 crc kubenswrapper[4903]: I1203 00:30:11.263352 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pfh5b" event={"ID":"b2f68090-fda2-4b2a-9145-f05651fe3b81","Type":"ContainerStarted","Data":"1e1c28a6a1b0b036243f0c0e88ac6623f2f52ddfc7d8297989bfbe0444cfcbcc"} Dec 03 00:30:11 crc kubenswrapper[4903]: I1203 00:30:11.263700 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pfh5b" event={"ID":"b2f68090-fda2-4b2a-9145-f05651fe3b81","Type":"ContainerStarted","Data":"a3c8ec8aeab528af303d898bcd71c4e1ac310adbfcdde2939a09e8518c990c7c"} Dec 03 00:30:11 crc kubenswrapper[4903]: I1203 00:30:11.267618 4903 generic.go:334] "Generic (PLEG): container finished" podID="c914d3c4-0cae-4601-a083-8e4b1f84f3be" containerID="aaab28de4e0cf496f2cceafc7ef2627a651b224f890172c82bb99e88273c274a" exitCode=0 Dec 03 00:30:11 crc kubenswrapper[4903]: I1203 00:30:11.267705 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xcjkq" event={"ID":"c914d3c4-0cae-4601-a083-8e4b1f84f3be","Type":"ContainerDied","Data":"aaab28de4e0cf496f2cceafc7ef2627a651b224f890172c82bb99e88273c274a"} Dec 03 00:30:11 crc kubenswrapper[4903]: I1203 00:30:11.267731 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xcjkq" event={"ID":"c914d3c4-0cae-4601-a083-8e4b1f84f3be","Type":"ContainerStarted","Data":"5cef90796374cf8228f9068f30caa49d126aea886946c6f044cf91c83f5fd136"} Dec 03 00:30:12 crc kubenswrapper[4903]: I1203 00:30:12.278048 4903 generic.go:334] "Generic (PLEG): container finished" podID="b2f68090-fda2-4b2a-9145-f05651fe3b81" containerID="1e1c28a6a1b0b036243f0c0e88ac6623f2f52ddfc7d8297989bfbe0444cfcbcc" exitCode=0 Dec 03 00:30:12 crc kubenswrapper[4903]: I1203 00:30:12.278256 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pfh5b" event={"ID":"b2f68090-fda2-4b2a-9145-f05651fe3b81","Type":"ContainerDied","Data":"1e1c28a6a1b0b036243f0c0e88ac6623f2f52ddfc7d8297989bfbe0444cfcbcc"} Dec 03 00:30:13 crc kubenswrapper[4903]: I1203 00:30:13.293162 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xcjkq" event={"ID":"c914d3c4-0cae-4601-a083-8e4b1f84f3be","Type":"ContainerStarted","Data":"97c99e0d6a3a0a590da1f76424c91cafbc2e2796bbac40905a17c3667134560f"} Dec 03 00:30:14 crc kubenswrapper[4903]: I1203 00:30:14.312614 4903 generic.go:334] "Generic (PLEG): container finished" 
podID="c914d3c4-0cae-4601-a083-8e4b1f84f3be" containerID="97c99e0d6a3a0a590da1f76424c91cafbc2e2796bbac40905a17c3667134560f" exitCode=0 Dec 03 00:30:14 crc kubenswrapper[4903]: I1203 00:30:14.312684 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xcjkq" event={"ID":"c914d3c4-0cae-4601-a083-8e4b1f84f3be","Type":"ContainerDied","Data":"97c99e0d6a3a0a590da1f76424c91cafbc2e2796bbac40905a17c3667134560f"} Dec 03 00:30:16 crc kubenswrapper[4903]: I1203 00:30:16.337221 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xcjkq" event={"ID":"c914d3c4-0cae-4601-a083-8e4b1f84f3be","Type":"ContainerStarted","Data":"61f2f233e6b2850825ef864e7f07bd28672a80f8199510b4b19b927da80efb09"} Dec 03 00:30:16 crc kubenswrapper[4903]: I1203 00:30:16.340989 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pfh5b" event={"ID":"b2f68090-fda2-4b2a-9145-f05651fe3b81","Type":"ContainerStarted","Data":"b906625bfbbb28533ca0372dd3575e0a5f3e160deece39d9e1ccc33fbce68836"} Dec 03 00:30:16 crc kubenswrapper[4903]: I1203 00:30:16.357352 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xcjkq" podStartSLOduration=3.319772511 podStartE2EDuration="7.357335172s" podCreationTimestamp="2025-12-03 00:30:09 +0000 UTC" firstStartedPulling="2025-12-03 00:30:11.269122788 +0000 UTC m=+5549.977677071" lastFinishedPulling="2025-12-03 00:30:15.306685449 +0000 UTC m=+5554.015239732" observedRunningTime="2025-12-03 00:30:16.352801502 +0000 UTC m=+5555.061355785" watchObservedRunningTime="2025-12-03 00:30:16.357335172 +0000 UTC m=+5555.065889455" Dec 03 00:30:18 crc kubenswrapper[4903]: I1203 00:30:18.362032 4903 generic.go:334] "Generic (PLEG): container finished" podID="b2f68090-fda2-4b2a-9145-f05651fe3b81" containerID="b906625bfbbb28533ca0372dd3575e0a5f3e160deece39d9e1ccc33fbce68836" exitCode=0 Dec 03 00:30:18 crc kubenswrapper[4903]: I1203 00:30:18.362067 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pfh5b" event={"ID":"b2f68090-fda2-4b2a-9145-f05651fe3b81","Type":"ContainerDied","Data":"b906625bfbbb28533ca0372dd3575e0a5f3e160deece39d9e1ccc33fbce68836"} Dec 03 00:30:19 crc kubenswrapper[4903]: I1203 00:30:19.379756 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pfh5b" event={"ID":"b2f68090-fda2-4b2a-9145-f05651fe3b81","Type":"ContainerStarted","Data":"fc138674cb02ad1b22e2a3d6d5ee50382d2302204779acb41d68abd4ce4fc839"} Dec 03 00:30:19 crc kubenswrapper[4903]: I1203 00:30:19.434543 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pfh5b" podStartSLOduration=2.815132319 podStartE2EDuration="9.434491965s" podCreationTimestamp="2025-12-03 00:30:10 +0000 UTC" firstStartedPulling="2025-12-03 00:30:12.280835552 +0000 UTC m=+5550.989389835" lastFinishedPulling="2025-12-03 00:30:18.900195198 +0000 UTC m=+5557.608749481" observedRunningTime="2025-12-03 00:30:19.404215936 +0000 UTC m=+5558.112770249" watchObservedRunningTime="2025-12-03 00:30:19.434491965 +0000 UTC m=+5558.143046268" Dec 03 00:30:20 crc kubenswrapper[4903]: I1203 00:30:20.228000 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xcjkq" Dec 03 00:30:20 crc kubenswrapper[4903]: I1203 00:30:20.228293 4903 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xcjkq" Dec 03 00:30:20 crc kubenswrapper[4903]: I1203 00:30:20.397807 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pfh5b" Dec 03 00:30:20 crc kubenswrapper[4903]: I1203 00:30:20.398097 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pfh5b" Dec 03 00:30:20 crc kubenswrapper[4903]: I1203 00:30:20.853832 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xcjkq" Dec 03 00:30:20 crc kubenswrapper[4903]: I1203 00:30:20.923182 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xcjkq" Dec 03 00:30:21 crc kubenswrapper[4903]: I1203 00:30:21.833477 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-pfh5b" podUID="b2f68090-fda2-4b2a-9145-f05651fe3b81" containerName="registry-server" probeResult="failure" output=< Dec 03 00:30:21 crc kubenswrapper[4903]: timeout: failed to connect service ":50051" within 1s Dec 03 00:30:21 crc kubenswrapper[4903]: > Dec 03 00:30:22 crc kubenswrapper[4903]: I1203 00:30:22.049605 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xcjkq"] Dec 03 00:30:22 crc kubenswrapper[4903]: I1203 00:30:22.419429 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xcjkq" podUID="c914d3c4-0cae-4601-a083-8e4b1f84f3be" containerName="registry-server" containerID="cri-o://61f2f233e6b2850825ef864e7f07bd28672a80f8199510b4b19b927da80efb09" gracePeriod=2 Dec 03 00:30:24 crc kubenswrapper[4903]: I1203 00:30:24.440883 4903 generic.go:334] "Generic (PLEG): container finished" podID="c914d3c4-0cae-4601-a083-8e4b1f84f3be" containerID="61f2f233e6b2850825ef864e7f07bd28672a80f8199510b4b19b927da80efb09" exitCode=0 Dec 03 00:30:24 crc kubenswrapper[4903]: I1203 00:30:24.440963 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xcjkq" event={"ID":"c914d3c4-0cae-4601-a083-8e4b1f84f3be","Type":"ContainerDied","Data":"61f2f233e6b2850825ef864e7f07bd28672a80f8199510b4b19b927da80efb09"} Dec 03 00:30:24 crc kubenswrapper[4903]: I1203 00:30:24.441237 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xcjkq" event={"ID":"c914d3c4-0cae-4601-a083-8e4b1f84f3be","Type":"ContainerDied","Data":"5cef90796374cf8228f9068f30caa49d126aea886946c6f044cf91c83f5fd136"} Dec 03 00:30:24 crc kubenswrapper[4903]: I1203 00:30:24.441259 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cef90796374cf8228f9068f30caa49d126aea886946c6f044cf91c83f5fd136" Dec 03 00:30:24 crc kubenswrapper[4903]: I1203 00:30:24.521409 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xcjkq" Dec 03 00:30:24 crc kubenswrapper[4903]: I1203 00:30:24.642924 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c914d3c4-0cae-4601-a083-8e4b1f84f3be-catalog-content\") pod \"c914d3c4-0cae-4601-a083-8e4b1f84f3be\" (UID: \"c914d3c4-0cae-4601-a083-8e4b1f84f3be\") " Dec 03 00:30:24 crc kubenswrapper[4903]: I1203 00:30:24.643033 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c914d3c4-0cae-4601-a083-8e4b1f84f3be-utilities\") pod \"c914d3c4-0cae-4601-a083-8e4b1f84f3be\" (UID: \"c914d3c4-0cae-4601-a083-8e4b1f84f3be\") " Dec 03 00:30:24 crc kubenswrapper[4903]: I1203 00:30:24.643223 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn7cp\" (UniqueName: \"kubernetes.io/projected/c914d3c4-0cae-4601-a083-8e4b1f84f3be-kube-api-access-bn7cp\") pod \"c914d3c4-0cae-4601-a083-8e4b1f84f3be\" (UID: \"c914d3c4-0cae-4601-a083-8e4b1f84f3be\") " Dec 03 00:30:24 crc kubenswrapper[4903]: I1203 00:30:24.643926 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c914d3c4-0cae-4601-a083-8e4b1f84f3be-utilities" (OuterVolumeSpecName: "utilities") pod "c914d3c4-0cae-4601-a083-8e4b1f84f3be" (UID: "c914d3c4-0cae-4601-a083-8e4b1f84f3be"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:30:24 crc kubenswrapper[4903]: I1203 00:30:24.649616 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c914d3c4-0cae-4601-a083-8e4b1f84f3be-kube-api-access-bn7cp" (OuterVolumeSpecName: "kube-api-access-bn7cp") pod "c914d3c4-0cae-4601-a083-8e4b1f84f3be" (UID: "c914d3c4-0cae-4601-a083-8e4b1f84f3be"). InnerVolumeSpecName "kube-api-access-bn7cp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:30:24 crc kubenswrapper[4903]: I1203 00:30:24.695088 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c914d3c4-0cae-4601-a083-8e4b1f84f3be-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c914d3c4-0cae-4601-a083-8e4b1f84f3be" (UID: "c914d3c4-0cae-4601-a083-8e4b1f84f3be"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:30:24 crc kubenswrapper[4903]: I1203 00:30:24.745565 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c914d3c4-0cae-4601-a083-8e4b1f84f3be-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 00:30:24 crc kubenswrapper[4903]: I1203 00:30:24.745601 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c914d3c4-0cae-4601-a083-8e4b1f84f3be-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 00:30:24 crc kubenswrapper[4903]: I1203 00:30:24.745613 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bn7cp\" (UniqueName: \"kubernetes.io/projected/c914d3c4-0cae-4601-a083-8e4b1f84f3be-kube-api-access-bn7cp\") on node \"crc\" DevicePath \"\"" Dec 03 00:30:25 crc kubenswrapper[4903]: I1203 00:30:25.454140 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xcjkq" Dec 03 00:30:25 crc kubenswrapper[4903]: I1203 00:30:25.526707 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xcjkq"] Dec 03 00:30:25 crc kubenswrapper[4903]: I1203 00:30:25.537897 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xcjkq"] Dec 03 00:30:25 crc kubenswrapper[4903]: E1203 00:30:25.603445 4903 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc914d3c4_0cae_4601_a083_8e4b1f84f3be.slice\": RecentStats: unable to find data in memory cache]" Dec 03 00:30:25 crc kubenswrapper[4903]: I1203 00:30:25.630046 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c914d3c4-0cae-4601-a083-8e4b1f84f3be" path="/var/lib/kubelet/pods/c914d3c4-0cae-4601-a083-8e4b1f84f3be/volumes" Dec 03 00:30:30 crc kubenswrapper[4903]: I1203 00:30:30.481404 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pfh5b" Dec 03 00:30:30 crc kubenswrapper[4903]: I1203 00:30:30.568315 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pfh5b" Dec 03 00:30:30 crc kubenswrapper[4903]: I1203 00:30:30.720851 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pfh5b"] Dec 03 00:30:31 crc kubenswrapper[4903]: I1203 00:30:31.526929 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pfh5b" podUID="b2f68090-fda2-4b2a-9145-f05651fe3b81" containerName="registry-server" containerID="cri-o://fc138674cb02ad1b22e2a3d6d5ee50382d2302204779acb41d68abd4ce4fc839" gracePeriod=2 Dec 03 00:30:32 crc kubenswrapper[4903]: I1203 00:30:32.334950 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pfh5b" Dec 03 00:30:32 crc kubenswrapper[4903]: I1203 00:30:32.416986 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2f68090-fda2-4b2a-9145-f05651fe3b81-utilities\") pod \"b2f68090-fda2-4b2a-9145-f05651fe3b81\" (UID: \"b2f68090-fda2-4b2a-9145-f05651fe3b81\") " Dec 03 00:30:32 crc kubenswrapper[4903]: I1203 00:30:32.417060 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k52bw\" (UniqueName: \"kubernetes.io/projected/b2f68090-fda2-4b2a-9145-f05651fe3b81-kube-api-access-k52bw\") pod \"b2f68090-fda2-4b2a-9145-f05651fe3b81\" (UID: \"b2f68090-fda2-4b2a-9145-f05651fe3b81\") " Dec 03 00:30:32 crc kubenswrapper[4903]: I1203 00:30:32.417260 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2f68090-fda2-4b2a-9145-f05651fe3b81-catalog-content\") pod \"b2f68090-fda2-4b2a-9145-f05651fe3b81\" (UID: \"b2f68090-fda2-4b2a-9145-f05651fe3b81\") " Dec 03 00:30:32 crc kubenswrapper[4903]: I1203 00:30:32.418018 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2f68090-fda2-4b2a-9145-f05651fe3b81-utilities" (OuterVolumeSpecName: "utilities") pod "b2f68090-fda2-4b2a-9145-f05651fe3b81" (UID: "b2f68090-fda2-4b2a-9145-f05651fe3b81"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:30:32 crc kubenswrapper[4903]: I1203 00:30:32.477893 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2f68090-fda2-4b2a-9145-f05651fe3b81-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b2f68090-fda2-4b2a-9145-f05651fe3b81" (UID: "b2f68090-fda2-4b2a-9145-f05651fe3b81"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:30:32 crc kubenswrapper[4903]: I1203 00:30:32.519272 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2f68090-fda2-4b2a-9145-f05651fe3b81-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 00:30:32 crc kubenswrapper[4903]: I1203 00:30:32.519321 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2f68090-fda2-4b2a-9145-f05651fe3b81-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 00:30:32 crc kubenswrapper[4903]: I1203 00:30:32.531925 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2f68090-fda2-4b2a-9145-f05651fe3b81-kube-api-access-k52bw" (OuterVolumeSpecName: "kube-api-access-k52bw") pod "b2f68090-fda2-4b2a-9145-f05651fe3b81" (UID: "b2f68090-fda2-4b2a-9145-f05651fe3b81"). InnerVolumeSpecName "kube-api-access-k52bw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:30:32 crc kubenswrapper[4903]: I1203 00:30:32.560367 4903 generic.go:334] "Generic (PLEG): container finished" podID="b2f68090-fda2-4b2a-9145-f05651fe3b81" containerID="fc138674cb02ad1b22e2a3d6d5ee50382d2302204779acb41d68abd4ce4fc839" exitCode=0 Dec 03 00:30:32 crc kubenswrapper[4903]: I1203 00:30:32.560405 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pfh5b" event={"ID":"b2f68090-fda2-4b2a-9145-f05651fe3b81","Type":"ContainerDied","Data":"fc138674cb02ad1b22e2a3d6d5ee50382d2302204779acb41d68abd4ce4fc839"} Dec 03 00:30:32 crc kubenswrapper[4903]: I1203 00:30:32.560430 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pfh5b" event={"ID":"b2f68090-fda2-4b2a-9145-f05651fe3b81","Type":"ContainerDied","Data":"a3c8ec8aeab528af303d898bcd71c4e1ac310adbfcdde2939a09e8518c990c7c"} Dec 03 00:30:32 crc kubenswrapper[4903]: I1203 00:30:32.560431 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pfh5b" Dec 03 00:30:32 crc kubenswrapper[4903]: I1203 00:30:32.560447 4903 scope.go:117] "RemoveContainer" containerID="fc138674cb02ad1b22e2a3d6d5ee50382d2302204779acb41d68abd4ce4fc839" Dec 03 00:30:32 crc kubenswrapper[4903]: I1203 00:30:32.588766 4903 scope.go:117] "RemoveContainer" containerID="b906625bfbbb28533ca0372dd3575e0a5f3e160deece39d9e1ccc33fbce68836" Dec 03 00:30:32 crc kubenswrapper[4903]: I1203 00:30:32.607227 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pfh5b"] Dec 03 00:30:32 crc kubenswrapper[4903]: I1203 00:30:32.615881 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pfh5b"] Dec 03 00:30:32 crc kubenswrapper[4903]: I1203 00:30:32.621304 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k52bw\" (UniqueName: \"kubernetes.io/projected/b2f68090-fda2-4b2a-9145-f05651fe3b81-kube-api-access-k52bw\") on node \"crc\" DevicePath \"\"" Dec 03 00:30:32 crc kubenswrapper[4903]: I1203 00:30:32.626028 4903 scope.go:117] "RemoveContainer" containerID="1e1c28a6a1b0b036243f0c0e88ac6623f2f52ddfc7d8297989bfbe0444cfcbcc" Dec 03 00:30:32 crc kubenswrapper[4903]: I1203 00:30:32.670851 4903 scope.go:117] "RemoveContainer" containerID="fc138674cb02ad1b22e2a3d6d5ee50382d2302204779acb41d68abd4ce4fc839" Dec 03 00:30:32 crc kubenswrapper[4903]: E1203 00:30:32.671321 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc138674cb02ad1b22e2a3d6d5ee50382d2302204779acb41d68abd4ce4fc839\": container with ID starting with fc138674cb02ad1b22e2a3d6d5ee50382d2302204779acb41d68abd4ce4fc839 not found: ID does not exist" containerID="fc138674cb02ad1b22e2a3d6d5ee50382d2302204779acb41d68abd4ce4fc839" Dec 03 00:30:32 crc kubenswrapper[4903]: I1203 00:30:32.671371 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc138674cb02ad1b22e2a3d6d5ee50382d2302204779acb41d68abd4ce4fc839"} err="failed to get container status \"fc138674cb02ad1b22e2a3d6d5ee50382d2302204779acb41d68abd4ce4fc839\": rpc error: code = NotFound desc = could not find container \"fc138674cb02ad1b22e2a3d6d5ee50382d2302204779acb41d68abd4ce4fc839\": container with ID starting with fc138674cb02ad1b22e2a3d6d5ee50382d2302204779acb41d68abd4ce4fc839 not found: ID does not exist" Dec 03 00:30:32 crc kubenswrapper[4903]: I1203 00:30:32.671405 4903 scope.go:117] "RemoveContainer" containerID="b906625bfbbb28533ca0372dd3575e0a5f3e160deece39d9e1ccc33fbce68836" Dec 03 00:30:32 crc kubenswrapper[4903]: E1203 00:30:32.671887 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b906625bfbbb28533ca0372dd3575e0a5f3e160deece39d9e1ccc33fbce68836\": container with ID starting with b906625bfbbb28533ca0372dd3575e0a5f3e160deece39d9e1ccc33fbce68836 not found: ID does not exist" containerID="b906625bfbbb28533ca0372dd3575e0a5f3e160deece39d9e1ccc33fbce68836" Dec 03 00:30:32 crc kubenswrapper[4903]: I1203 00:30:32.671933 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b906625bfbbb28533ca0372dd3575e0a5f3e160deece39d9e1ccc33fbce68836"} err="failed to get container status \"b906625bfbbb28533ca0372dd3575e0a5f3e160deece39d9e1ccc33fbce68836\": rpc error: code = NotFound desc = could not find container 
\"b906625bfbbb28533ca0372dd3575e0a5f3e160deece39d9e1ccc33fbce68836\": container with ID starting with b906625bfbbb28533ca0372dd3575e0a5f3e160deece39d9e1ccc33fbce68836 not found: ID does not exist" Dec 03 00:30:32 crc kubenswrapper[4903]: I1203 00:30:32.671959 4903 scope.go:117] "RemoveContainer" containerID="1e1c28a6a1b0b036243f0c0e88ac6623f2f52ddfc7d8297989bfbe0444cfcbcc" Dec 03 00:30:32 crc kubenswrapper[4903]: E1203 00:30:32.672248 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e1c28a6a1b0b036243f0c0e88ac6623f2f52ddfc7d8297989bfbe0444cfcbcc\": container with ID starting with 1e1c28a6a1b0b036243f0c0e88ac6623f2f52ddfc7d8297989bfbe0444cfcbcc not found: ID does not exist" containerID="1e1c28a6a1b0b036243f0c0e88ac6623f2f52ddfc7d8297989bfbe0444cfcbcc" Dec 03 00:30:32 crc kubenswrapper[4903]: I1203 00:30:32.672300 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e1c28a6a1b0b036243f0c0e88ac6623f2f52ddfc7d8297989bfbe0444cfcbcc"} err="failed to get container status \"1e1c28a6a1b0b036243f0c0e88ac6623f2f52ddfc7d8297989bfbe0444cfcbcc\": rpc error: code = NotFound desc = could not find container \"1e1c28a6a1b0b036243f0c0e88ac6623f2f52ddfc7d8297989bfbe0444cfcbcc\": container with ID starting with 1e1c28a6a1b0b036243f0c0e88ac6623f2f52ddfc7d8297989bfbe0444cfcbcc not found: ID does not exist" Dec 03 00:30:33 crc kubenswrapper[4903]: I1203 00:30:33.631790 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2f68090-fda2-4b2a-9145-f05651fe3b81" path="/var/lib/kubelet/pods/b2f68090-fda2-4b2a-9145-f05651fe3b81/volumes" Dec 03 00:30:34 crc kubenswrapper[4903]: I1203 00:30:34.036719 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jn8lr/must-gather-s66vc"] Dec 03 00:30:34 crc kubenswrapper[4903]: E1203 00:30:34.037347 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2f68090-fda2-4b2a-9145-f05651fe3b81" containerName="extract-content" Dec 03 00:30:34 crc kubenswrapper[4903]: I1203 00:30:34.037366 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2f68090-fda2-4b2a-9145-f05651fe3b81" containerName="extract-content" Dec 03 00:30:34 crc kubenswrapper[4903]: E1203 00:30:34.037377 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2f68090-fda2-4b2a-9145-f05651fe3b81" containerName="extract-utilities" Dec 03 00:30:34 crc kubenswrapper[4903]: I1203 00:30:34.037383 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2f68090-fda2-4b2a-9145-f05651fe3b81" containerName="extract-utilities" Dec 03 00:30:34 crc kubenswrapper[4903]: E1203 00:30:34.037395 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c914d3c4-0cae-4601-a083-8e4b1f84f3be" containerName="registry-server" Dec 03 00:30:34 crc kubenswrapper[4903]: I1203 00:30:34.037403 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="c914d3c4-0cae-4601-a083-8e4b1f84f3be" containerName="registry-server" Dec 03 00:30:34 crc kubenswrapper[4903]: E1203 00:30:34.037415 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c914d3c4-0cae-4601-a083-8e4b1f84f3be" containerName="extract-content" Dec 03 00:30:34 crc kubenswrapper[4903]: I1203 00:30:34.037421 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="c914d3c4-0cae-4601-a083-8e4b1f84f3be" containerName="extract-content" Dec 03 00:30:34 crc kubenswrapper[4903]: E1203 00:30:34.037440 4903 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="c914d3c4-0cae-4601-a083-8e4b1f84f3be" containerName="extract-utilities" Dec 03 00:30:34 crc kubenswrapper[4903]: I1203 00:30:34.037445 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="c914d3c4-0cae-4601-a083-8e4b1f84f3be" containerName="extract-utilities" Dec 03 00:30:34 crc kubenswrapper[4903]: E1203 00:30:34.037457 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2f68090-fda2-4b2a-9145-f05651fe3b81" containerName="registry-server" Dec 03 00:30:34 crc kubenswrapper[4903]: I1203 00:30:34.037463 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2f68090-fda2-4b2a-9145-f05651fe3b81" containerName="registry-server" Dec 03 00:30:34 crc kubenswrapper[4903]: I1203 00:30:34.037645 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="c914d3c4-0cae-4601-a083-8e4b1f84f3be" containerName="registry-server" Dec 03 00:30:34 crc kubenswrapper[4903]: I1203 00:30:34.037686 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2f68090-fda2-4b2a-9145-f05651fe3b81" containerName="registry-server" Dec 03 00:30:34 crc kubenswrapper[4903]: I1203 00:30:34.038757 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jn8lr/must-gather-s66vc" Dec 03 00:30:34 crc kubenswrapper[4903]: I1203 00:30:34.041094 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-jn8lr"/"default-dockercfg-rwj2j" Dec 03 00:30:34 crc kubenswrapper[4903]: I1203 00:30:34.041101 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-jn8lr"/"openshift-service-ca.crt" Dec 03 00:30:34 crc kubenswrapper[4903]: I1203 00:30:34.042564 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-jn8lr"/"kube-root-ca.crt" Dec 03 00:30:34 crc kubenswrapper[4903]: I1203 00:30:34.063888 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jn8lr/must-gather-s66vc"] Dec 03 00:30:34 crc kubenswrapper[4903]: I1203 00:30:34.155317 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh9bv\" (UniqueName: \"kubernetes.io/projected/4abb1095-900f-47b9-a0ae-b494f350a421-kube-api-access-fh9bv\") pod \"must-gather-s66vc\" (UID: \"4abb1095-900f-47b9-a0ae-b494f350a421\") " pod="openshift-must-gather-jn8lr/must-gather-s66vc" Dec 03 00:30:34 crc kubenswrapper[4903]: I1203 00:30:34.155741 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4abb1095-900f-47b9-a0ae-b494f350a421-must-gather-output\") pod \"must-gather-s66vc\" (UID: \"4abb1095-900f-47b9-a0ae-b494f350a421\") " pod="openshift-must-gather-jn8lr/must-gather-s66vc" Dec 03 00:30:34 crc kubenswrapper[4903]: I1203 00:30:34.257802 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4abb1095-900f-47b9-a0ae-b494f350a421-must-gather-output\") pod \"must-gather-s66vc\" (UID: \"4abb1095-900f-47b9-a0ae-b494f350a421\") " pod="openshift-must-gather-jn8lr/must-gather-s66vc" Dec 03 00:30:34 crc kubenswrapper[4903]: I1203 00:30:34.257988 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh9bv\" (UniqueName: \"kubernetes.io/projected/4abb1095-900f-47b9-a0ae-b494f350a421-kube-api-access-fh9bv\") pod \"must-gather-s66vc\" (UID: 
\"4abb1095-900f-47b9-a0ae-b494f350a421\") " pod="openshift-must-gather-jn8lr/must-gather-s66vc" Dec 03 00:30:34 crc kubenswrapper[4903]: I1203 00:30:34.258243 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4abb1095-900f-47b9-a0ae-b494f350a421-must-gather-output\") pod \"must-gather-s66vc\" (UID: \"4abb1095-900f-47b9-a0ae-b494f350a421\") " pod="openshift-must-gather-jn8lr/must-gather-s66vc" Dec 03 00:30:34 crc kubenswrapper[4903]: I1203 00:30:34.274721 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh9bv\" (UniqueName: \"kubernetes.io/projected/4abb1095-900f-47b9-a0ae-b494f350a421-kube-api-access-fh9bv\") pod \"must-gather-s66vc\" (UID: \"4abb1095-900f-47b9-a0ae-b494f350a421\") " pod="openshift-must-gather-jn8lr/must-gather-s66vc" Dec 03 00:30:34 crc kubenswrapper[4903]: I1203 00:30:34.357861 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jn8lr/must-gather-s66vc" Dec 03 00:30:34 crc kubenswrapper[4903]: I1203 00:30:34.860747 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jn8lr/must-gather-s66vc"] Dec 03 00:30:35 crc kubenswrapper[4903]: I1203 00:30:35.597375 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jn8lr/must-gather-s66vc" event={"ID":"4abb1095-900f-47b9-a0ae-b494f350a421","Type":"ContainerStarted","Data":"f69e19701127f0649bf218b6cd8767fcd475bc3a879219f8dea69d6e2b3d93bd"} Dec 03 00:30:43 crc kubenswrapper[4903]: I1203 00:30:43.680902 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jn8lr/must-gather-s66vc" event={"ID":"4abb1095-900f-47b9-a0ae-b494f350a421","Type":"ContainerStarted","Data":"f07139e5a9ba9d97b8fab7f50301fd10c25f01b902cfb13f0c42df7856c74fa7"} Dec 03 00:30:43 crc kubenswrapper[4903]: I1203 00:30:43.681356 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jn8lr/must-gather-s66vc" event={"ID":"4abb1095-900f-47b9-a0ae-b494f350a421","Type":"ContainerStarted","Data":"645428a4e4f373a5fa721152898e146f5302428b4cd7a74f8be838d30562cd64"} Dec 03 00:30:43 crc kubenswrapper[4903]: I1203 00:30:43.728169 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jn8lr/must-gather-s66vc" podStartSLOduration=2.179786768 podStartE2EDuration="9.728142254s" podCreationTimestamp="2025-12-03 00:30:34 +0000 UTC" firstStartedPulling="2025-12-03 00:30:34.872697136 +0000 UTC m=+5573.581251419" lastFinishedPulling="2025-12-03 00:30:42.421052602 +0000 UTC m=+5581.129606905" observedRunningTime="2025-12-03 00:30:43.700463337 +0000 UTC m=+5582.409017610" watchObservedRunningTime="2025-12-03 00:30:43.728142254 +0000 UTC m=+5582.436696557" Dec 03 00:30:46 crc kubenswrapper[4903]: I1203 00:30:46.828473 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jn8lr/crc-debug-jm4qr"] Dec 03 00:30:46 crc kubenswrapper[4903]: I1203 00:30:46.830354 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jn8lr/crc-debug-jm4qr" Dec 03 00:30:46 crc kubenswrapper[4903]: I1203 00:30:46.854273 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8e3aa572-4edc-4a51-884b-7ad7fe5b76d0-host\") pod \"crc-debug-jm4qr\" (UID: \"8e3aa572-4edc-4a51-884b-7ad7fe5b76d0\") " pod="openshift-must-gather-jn8lr/crc-debug-jm4qr" Dec 03 00:30:46 crc kubenswrapper[4903]: I1203 00:30:46.854686 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mw4z\" (UniqueName: \"kubernetes.io/projected/8e3aa572-4edc-4a51-884b-7ad7fe5b76d0-kube-api-access-7mw4z\") pod \"crc-debug-jm4qr\" (UID: \"8e3aa572-4edc-4a51-884b-7ad7fe5b76d0\") " pod="openshift-must-gather-jn8lr/crc-debug-jm4qr" Dec 03 00:30:46 crc kubenswrapper[4903]: I1203 00:30:46.955914 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mw4z\" (UniqueName: \"kubernetes.io/projected/8e3aa572-4edc-4a51-884b-7ad7fe5b76d0-kube-api-access-7mw4z\") pod \"crc-debug-jm4qr\" (UID: \"8e3aa572-4edc-4a51-884b-7ad7fe5b76d0\") " pod="openshift-must-gather-jn8lr/crc-debug-jm4qr" Dec 03 00:30:46 crc kubenswrapper[4903]: I1203 00:30:46.956145 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8e3aa572-4edc-4a51-884b-7ad7fe5b76d0-host\") pod \"crc-debug-jm4qr\" (UID: \"8e3aa572-4edc-4a51-884b-7ad7fe5b76d0\") " pod="openshift-must-gather-jn8lr/crc-debug-jm4qr" Dec 03 00:30:46 crc kubenswrapper[4903]: I1203 00:30:46.956292 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8e3aa572-4edc-4a51-884b-7ad7fe5b76d0-host\") pod \"crc-debug-jm4qr\" (UID: \"8e3aa572-4edc-4a51-884b-7ad7fe5b76d0\") " pod="openshift-must-gather-jn8lr/crc-debug-jm4qr" Dec 03 00:30:46 crc kubenswrapper[4903]: I1203 00:30:46.978672 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mw4z\" (UniqueName: \"kubernetes.io/projected/8e3aa572-4edc-4a51-884b-7ad7fe5b76d0-kube-api-access-7mw4z\") pod \"crc-debug-jm4qr\" (UID: \"8e3aa572-4edc-4a51-884b-7ad7fe5b76d0\") " pod="openshift-must-gather-jn8lr/crc-debug-jm4qr" Dec 03 00:30:47 crc kubenswrapper[4903]: I1203 00:30:47.152308 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jn8lr/crc-debug-jm4qr" Dec 03 00:30:47 crc kubenswrapper[4903]: I1203 00:30:47.730383 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jn8lr/crc-debug-jm4qr" event={"ID":"8e3aa572-4edc-4a51-884b-7ad7fe5b76d0","Type":"ContainerStarted","Data":"e401930b777115041c2cec7797358c398a67b34f238aadf491ae66259dfce579"} Dec 03 00:30:57 crc kubenswrapper[4903]: I1203 00:30:57.837825 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jn8lr/crc-debug-jm4qr" event={"ID":"8e3aa572-4edc-4a51-884b-7ad7fe5b76d0","Type":"ContainerStarted","Data":"3698f6e7b6d7e485d58c0fe1786195bce3dffe2003f08d6eb1bbc08a9371351c"} Dec 03 00:30:57 crc kubenswrapper[4903]: I1203 00:30:57.851643 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jn8lr/crc-debug-jm4qr" podStartSLOduration=1.612949253 podStartE2EDuration="11.851625999s" podCreationTimestamp="2025-12-03 00:30:46 +0000 UTC" firstStartedPulling="2025-12-03 00:30:47.206247211 +0000 UTC m=+5585.914801494" lastFinishedPulling="2025-12-03 00:30:57.444923957 +0000 UTC m=+5596.153478240" observedRunningTime="2025-12-03 00:30:57.850398299 +0000 UTC m=+5596.558952582" watchObservedRunningTime="2025-12-03 00:30:57.851625999 +0000 UTC m=+5596.560180282" Dec 03 00:31:23 crc kubenswrapper[4903]: I1203 00:31:23.069444 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 00:31:23 crc kubenswrapper[4903]: I1203 00:31:23.070068 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 00:31:48 crc kubenswrapper[4903]: I1203 00:31:48.344131 4903 generic.go:334] "Generic (PLEG): container finished" podID="8e3aa572-4edc-4a51-884b-7ad7fe5b76d0" containerID="3698f6e7b6d7e485d58c0fe1786195bce3dffe2003f08d6eb1bbc08a9371351c" exitCode=0 Dec 03 00:31:48 crc kubenswrapper[4903]: I1203 00:31:48.344230 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jn8lr/crc-debug-jm4qr" event={"ID":"8e3aa572-4edc-4a51-884b-7ad7fe5b76d0","Type":"ContainerDied","Data":"3698f6e7b6d7e485d58c0fe1786195bce3dffe2003f08d6eb1bbc08a9371351c"} Dec 03 00:31:49 crc kubenswrapper[4903]: I1203 00:31:49.464201 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jn8lr/crc-debug-jm4qr" Dec 03 00:31:49 crc kubenswrapper[4903]: I1203 00:31:49.533855 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jn8lr/crc-debug-jm4qr"] Dec 03 00:31:49 crc kubenswrapper[4903]: I1203 00:31:49.542347 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jn8lr/crc-debug-jm4qr"] Dec 03 00:31:49 crc kubenswrapper[4903]: I1203 00:31:49.582558 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8e3aa572-4edc-4a51-884b-7ad7fe5b76d0-host\") pod \"8e3aa572-4edc-4a51-884b-7ad7fe5b76d0\" (UID: \"8e3aa572-4edc-4a51-884b-7ad7fe5b76d0\") " Dec 03 00:31:49 crc kubenswrapper[4903]: I1203 00:31:49.582678 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e3aa572-4edc-4a51-884b-7ad7fe5b76d0-host" (OuterVolumeSpecName: "host") pod "8e3aa572-4edc-4a51-884b-7ad7fe5b76d0" (UID: "8e3aa572-4edc-4a51-884b-7ad7fe5b76d0"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 00:31:49 crc kubenswrapper[4903]: I1203 00:31:49.582811 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mw4z\" (UniqueName: \"kubernetes.io/projected/8e3aa572-4edc-4a51-884b-7ad7fe5b76d0-kube-api-access-7mw4z\") pod \"8e3aa572-4edc-4a51-884b-7ad7fe5b76d0\" (UID: \"8e3aa572-4edc-4a51-884b-7ad7fe5b76d0\") " Dec 03 00:31:49 crc kubenswrapper[4903]: I1203 00:31:49.583408 4903 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8e3aa572-4edc-4a51-884b-7ad7fe5b76d0-host\") on node \"crc\" DevicePath \"\"" Dec 03 00:31:49 crc kubenswrapper[4903]: I1203 00:31:49.589316 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e3aa572-4edc-4a51-884b-7ad7fe5b76d0-kube-api-access-7mw4z" (OuterVolumeSpecName: "kube-api-access-7mw4z") pod "8e3aa572-4edc-4a51-884b-7ad7fe5b76d0" (UID: "8e3aa572-4edc-4a51-884b-7ad7fe5b76d0"). InnerVolumeSpecName "kube-api-access-7mw4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:31:49 crc kubenswrapper[4903]: I1203 00:31:49.624619 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e3aa572-4edc-4a51-884b-7ad7fe5b76d0" path="/var/lib/kubelet/pods/8e3aa572-4edc-4a51-884b-7ad7fe5b76d0/volumes" Dec 03 00:31:49 crc kubenswrapper[4903]: I1203 00:31:49.684920 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mw4z\" (UniqueName: \"kubernetes.io/projected/8e3aa572-4edc-4a51-884b-7ad7fe5b76d0-kube-api-access-7mw4z\") on node \"crc\" DevicePath \"\"" Dec 03 00:31:50 crc kubenswrapper[4903]: I1203 00:31:50.364448 4903 scope.go:117] "RemoveContainer" containerID="3698f6e7b6d7e485d58c0fe1786195bce3dffe2003f08d6eb1bbc08a9371351c" Dec 03 00:31:50 crc kubenswrapper[4903]: I1203 00:31:50.364511 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jn8lr/crc-debug-jm4qr" Dec 03 00:31:50 crc kubenswrapper[4903]: I1203 00:31:50.737796 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jn8lr/crc-debug-4sw66"] Dec 03 00:31:50 crc kubenswrapper[4903]: E1203 00:31:50.738666 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e3aa572-4edc-4a51-884b-7ad7fe5b76d0" containerName="container-00" Dec 03 00:31:50 crc kubenswrapper[4903]: I1203 00:31:50.738683 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e3aa572-4edc-4a51-884b-7ad7fe5b76d0" containerName="container-00" Dec 03 00:31:50 crc kubenswrapper[4903]: I1203 00:31:50.738913 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e3aa572-4edc-4a51-884b-7ad7fe5b76d0" containerName="container-00" Dec 03 00:31:50 crc kubenswrapper[4903]: I1203 00:31:50.739755 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jn8lr/crc-debug-4sw66" Dec 03 00:31:50 crc kubenswrapper[4903]: I1203 00:31:50.805922 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/31bf66c4-67b3-4a69-bbe7-55c27b1302f2-host\") pod \"crc-debug-4sw66\" (UID: \"31bf66c4-67b3-4a69-bbe7-55c27b1302f2\") " pod="openshift-must-gather-jn8lr/crc-debug-4sw66" Dec 03 00:31:50 crc kubenswrapper[4903]: I1203 00:31:50.806131 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkwvf\" (UniqueName: \"kubernetes.io/projected/31bf66c4-67b3-4a69-bbe7-55c27b1302f2-kube-api-access-pkwvf\") pod \"crc-debug-4sw66\" (UID: \"31bf66c4-67b3-4a69-bbe7-55c27b1302f2\") " pod="openshift-must-gather-jn8lr/crc-debug-4sw66" Dec 03 00:31:50 crc kubenswrapper[4903]: I1203 00:31:50.908433 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkwvf\" (UniqueName: \"kubernetes.io/projected/31bf66c4-67b3-4a69-bbe7-55c27b1302f2-kube-api-access-pkwvf\") pod \"crc-debug-4sw66\" (UID: \"31bf66c4-67b3-4a69-bbe7-55c27b1302f2\") " pod="openshift-must-gather-jn8lr/crc-debug-4sw66" Dec 03 00:31:50 crc kubenswrapper[4903]: I1203 00:31:50.908681 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/31bf66c4-67b3-4a69-bbe7-55c27b1302f2-host\") pod \"crc-debug-4sw66\" (UID: \"31bf66c4-67b3-4a69-bbe7-55c27b1302f2\") " pod="openshift-must-gather-jn8lr/crc-debug-4sw66" Dec 03 00:31:50 crc kubenswrapper[4903]: I1203 00:31:50.908927 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/31bf66c4-67b3-4a69-bbe7-55c27b1302f2-host\") pod \"crc-debug-4sw66\" (UID: \"31bf66c4-67b3-4a69-bbe7-55c27b1302f2\") " pod="openshift-must-gather-jn8lr/crc-debug-4sw66" Dec 03 00:31:50 crc kubenswrapper[4903]: I1203 00:31:50.928297 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkwvf\" (UniqueName: \"kubernetes.io/projected/31bf66c4-67b3-4a69-bbe7-55c27b1302f2-kube-api-access-pkwvf\") pod \"crc-debug-4sw66\" (UID: \"31bf66c4-67b3-4a69-bbe7-55c27b1302f2\") " pod="openshift-must-gather-jn8lr/crc-debug-4sw66" Dec 03 00:31:51 crc kubenswrapper[4903]: I1203 00:31:51.059377 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jn8lr/crc-debug-4sw66" Dec 03 00:31:51 crc kubenswrapper[4903]: I1203 00:31:51.374585 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jn8lr/crc-debug-4sw66" event={"ID":"31bf66c4-67b3-4a69-bbe7-55c27b1302f2","Type":"ContainerStarted","Data":"80681280341d4659b0a4cc9a42939f0db03d19f5dddf37871b5a6c9b539e976a"} Dec 03 00:31:51 crc kubenswrapper[4903]: I1203 00:31:51.374630 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jn8lr/crc-debug-4sw66" event={"ID":"31bf66c4-67b3-4a69-bbe7-55c27b1302f2","Type":"ContainerStarted","Data":"89ae531cdf8662392bfb8a7a2954dba98e1c5a47d54d5cb43b034785ffc941c2"} Dec 03 00:31:51 crc kubenswrapper[4903]: I1203 00:31:51.392589 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jn8lr/crc-debug-4sw66" podStartSLOduration=1.392567746 podStartE2EDuration="1.392567746s" podCreationTimestamp="2025-12-03 00:31:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 00:31:51.387878844 +0000 UTC m=+5650.096433127" watchObservedRunningTime="2025-12-03 00:31:51.392567746 +0000 UTC m=+5650.101122029" Dec 03 00:31:52 crc kubenswrapper[4903]: I1203 00:31:52.385718 4903 generic.go:334] "Generic (PLEG): container finished" podID="31bf66c4-67b3-4a69-bbe7-55c27b1302f2" containerID="80681280341d4659b0a4cc9a42939f0db03d19f5dddf37871b5a6c9b539e976a" exitCode=0 Dec 03 00:31:52 crc kubenswrapper[4903]: I1203 00:31:52.385761 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jn8lr/crc-debug-4sw66" event={"ID":"31bf66c4-67b3-4a69-bbe7-55c27b1302f2","Type":"ContainerDied","Data":"80681280341d4659b0a4cc9a42939f0db03d19f5dddf37871b5a6c9b539e976a"} Dec 03 00:31:53 crc kubenswrapper[4903]: I1203 00:31:53.069223 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 00:31:53 crc kubenswrapper[4903]: I1203 00:31:53.069803 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 00:31:53 crc kubenswrapper[4903]: I1203 00:31:53.512158 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jn8lr/crc-debug-4sw66" Dec 03 00:31:53 crc kubenswrapper[4903]: I1203 00:31:53.671241 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/31bf66c4-67b3-4a69-bbe7-55c27b1302f2-host\") pod \"31bf66c4-67b3-4a69-bbe7-55c27b1302f2\" (UID: \"31bf66c4-67b3-4a69-bbe7-55c27b1302f2\") " Dec 03 00:31:53 crc kubenswrapper[4903]: I1203 00:31:53.671324 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/31bf66c4-67b3-4a69-bbe7-55c27b1302f2-host" (OuterVolumeSpecName: "host") pod "31bf66c4-67b3-4a69-bbe7-55c27b1302f2" (UID: "31bf66c4-67b3-4a69-bbe7-55c27b1302f2"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 00:31:53 crc kubenswrapper[4903]: I1203 00:31:53.671366 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkwvf\" (UniqueName: \"kubernetes.io/projected/31bf66c4-67b3-4a69-bbe7-55c27b1302f2-kube-api-access-pkwvf\") pod \"31bf66c4-67b3-4a69-bbe7-55c27b1302f2\" (UID: \"31bf66c4-67b3-4a69-bbe7-55c27b1302f2\") " Dec 03 00:31:53 crc kubenswrapper[4903]: I1203 00:31:53.671897 4903 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/31bf66c4-67b3-4a69-bbe7-55c27b1302f2-host\") on node \"crc\" DevicePath \"\"" Dec 03 00:31:53 crc kubenswrapper[4903]: I1203 00:31:53.685901 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31bf66c4-67b3-4a69-bbe7-55c27b1302f2-kube-api-access-pkwvf" (OuterVolumeSpecName: "kube-api-access-pkwvf") pod "31bf66c4-67b3-4a69-bbe7-55c27b1302f2" (UID: "31bf66c4-67b3-4a69-bbe7-55c27b1302f2"). InnerVolumeSpecName "kube-api-access-pkwvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:31:53 crc kubenswrapper[4903]: I1203 00:31:53.774365 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkwvf\" (UniqueName: \"kubernetes.io/projected/31bf66c4-67b3-4a69-bbe7-55c27b1302f2-kube-api-access-pkwvf\") on node \"crc\" DevicePath \"\"" Dec 03 00:31:53 crc kubenswrapper[4903]: I1203 00:31:53.832781 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jn8lr/crc-debug-4sw66"] Dec 03 00:31:53 crc kubenswrapper[4903]: I1203 00:31:53.844508 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jn8lr/crc-debug-4sw66"] Dec 03 00:31:54 crc kubenswrapper[4903]: I1203 00:31:54.407885 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89ae531cdf8662392bfb8a7a2954dba98e1c5a47d54d5cb43b034785ffc941c2" Dec 03 00:31:54 crc kubenswrapper[4903]: I1203 00:31:54.407987 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jn8lr/crc-debug-4sw66" Dec 03 00:31:54 crc kubenswrapper[4903]: I1203 00:31:54.989773 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jn8lr/crc-debug-f6v5h"] Dec 03 00:31:54 crc kubenswrapper[4903]: E1203 00:31:54.990293 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31bf66c4-67b3-4a69-bbe7-55c27b1302f2" containerName="container-00" Dec 03 00:31:54 crc kubenswrapper[4903]: I1203 00:31:54.990309 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="31bf66c4-67b3-4a69-bbe7-55c27b1302f2" containerName="container-00" Dec 03 00:31:54 crc kubenswrapper[4903]: I1203 00:31:54.990584 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="31bf66c4-67b3-4a69-bbe7-55c27b1302f2" containerName="container-00" Dec 03 00:31:54 crc kubenswrapper[4903]: I1203 00:31:54.991462 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jn8lr/crc-debug-f6v5h" Dec 03 00:31:55 crc kubenswrapper[4903]: I1203 00:31:55.105437 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2vfh\" (UniqueName: \"kubernetes.io/projected/3fc1c57b-ee2e-422d-9b28-225328734b9a-kube-api-access-q2vfh\") pod \"crc-debug-f6v5h\" (UID: \"3fc1c57b-ee2e-422d-9b28-225328734b9a\") " pod="openshift-must-gather-jn8lr/crc-debug-f6v5h" Dec 03 00:31:55 crc kubenswrapper[4903]: I1203 00:31:55.105539 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3fc1c57b-ee2e-422d-9b28-225328734b9a-host\") pod \"crc-debug-f6v5h\" (UID: \"3fc1c57b-ee2e-422d-9b28-225328734b9a\") " pod="openshift-must-gather-jn8lr/crc-debug-f6v5h" Dec 03 00:31:55 crc kubenswrapper[4903]: I1203 00:31:55.209373 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2vfh\" (UniqueName: \"kubernetes.io/projected/3fc1c57b-ee2e-422d-9b28-225328734b9a-kube-api-access-q2vfh\") pod \"crc-debug-f6v5h\" (UID: \"3fc1c57b-ee2e-422d-9b28-225328734b9a\") " pod="openshift-must-gather-jn8lr/crc-debug-f6v5h" Dec 03 00:31:55 crc kubenswrapper[4903]: I1203 00:31:55.209442 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3fc1c57b-ee2e-422d-9b28-225328734b9a-host\") pod \"crc-debug-f6v5h\" (UID: \"3fc1c57b-ee2e-422d-9b28-225328734b9a\") " pod="openshift-must-gather-jn8lr/crc-debug-f6v5h" Dec 03 00:31:55 crc kubenswrapper[4903]: I1203 00:31:55.209695 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3fc1c57b-ee2e-422d-9b28-225328734b9a-host\") pod \"crc-debug-f6v5h\" (UID: \"3fc1c57b-ee2e-422d-9b28-225328734b9a\") " pod="openshift-must-gather-jn8lr/crc-debug-f6v5h" Dec 03 00:31:55 crc kubenswrapper[4903]: I1203 00:31:55.238118 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2vfh\" (UniqueName: \"kubernetes.io/projected/3fc1c57b-ee2e-422d-9b28-225328734b9a-kube-api-access-q2vfh\") pod \"crc-debug-f6v5h\" (UID: \"3fc1c57b-ee2e-422d-9b28-225328734b9a\") " pod="openshift-must-gather-jn8lr/crc-debug-f6v5h" Dec 03 00:31:55 crc kubenswrapper[4903]: I1203 00:31:55.313962 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jn8lr/crc-debug-f6v5h" Dec 03 00:31:55 crc kubenswrapper[4903]: I1203 00:31:55.424737 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jn8lr/crc-debug-f6v5h" event={"ID":"3fc1c57b-ee2e-422d-9b28-225328734b9a","Type":"ContainerStarted","Data":"596804b19288c7d127d3d2d81d958fc2dfb7a2dd3881d6b975449044d3b4ed10"} Dec 03 00:31:55 crc kubenswrapper[4903]: I1203 00:31:55.625888 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31bf66c4-67b3-4a69-bbe7-55c27b1302f2" path="/var/lib/kubelet/pods/31bf66c4-67b3-4a69-bbe7-55c27b1302f2/volumes" Dec 03 00:31:56 crc kubenswrapper[4903]: I1203 00:31:56.442187 4903 generic.go:334] "Generic (PLEG): container finished" podID="3fc1c57b-ee2e-422d-9b28-225328734b9a" containerID="8dda7f8961232a1307cdb24ffe0a34c46ef8f14512508fe211e16ba8acdb959c" exitCode=0 Dec 03 00:31:56 crc kubenswrapper[4903]: I1203 00:31:56.442293 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jn8lr/crc-debug-f6v5h" event={"ID":"3fc1c57b-ee2e-422d-9b28-225328734b9a","Type":"ContainerDied","Data":"8dda7f8961232a1307cdb24ffe0a34c46ef8f14512508fe211e16ba8acdb959c"} Dec 03 00:31:56 crc kubenswrapper[4903]: I1203 00:31:56.499335 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jn8lr/crc-debug-f6v5h"] Dec 03 00:31:56 crc kubenswrapper[4903]: I1203 00:31:56.530169 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jn8lr/crc-debug-f6v5h"] Dec 03 00:31:57 crc kubenswrapper[4903]: I1203 00:31:57.576841 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jn8lr/crc-debug-f6v5h" Dec 03 00:31:57 crc kubenswrapper[4903]: I1203 00:31:57.659559 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2vfh\" (UniqueName: \"kubernetes.io/projected/3fc1c57b-ee2e-422d-9b28-225328734b9a-kube-api-access-q2vfh\") pod \"3fc1c57b-ee2e-422d-9b28-225328734b9a\" (UID: \"3fc1c57b-ee2e-422d-9b28-225328734b9a\") " Dec 03 00:31:57 crc kubenswrapper[4903]: I1203 00:31:57.659635 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3fc1c57b-ee2e-422d-9b28-225328734b9a-host\") pod \"3fc1c57b-ee2e-422d-9b28-225328734b9a\" (UID: \"3fc1c57b-ee2e-422d-9b28-225328734b9a\") " Dec 03 00:31:57 crc kubenswrapper[4903]: I1203 00:31:57.660225 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3fc1c57b-ee2e-422d-9b28-225328734b9a-host" (OuterVolumeSpecName: "host") pod "3fc1c57b-ee2e-422d-9b28-225328734b9a" (UID: "3fc1c57b-ee2e-422d-9b28-225328734b9a"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 00:31:57 crc kubenswrapper[4903]: I1203 00:31:57.665850 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fc1c57b-ee2e-422d-9b28-225328734b9a-kube-api-access-q2vfh" (OuterVolumeSpecName: "kube-api-access-q2vfh") pod "3fc1c57b-ee2e-422d-9b28-225328734b9a" (UID: "3fc1c57b-ee2e-422d-9b28-225328734b9a"). InnerVolumeSpecName "kube-api-access-q2vfh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:31:57 crc kubenswrapper[4903]: I1203 00:31:57.762454 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2vfh\" (UniqueName: \"kubernetes.io/projected/3fc1c57b-ee2e-422d-9b28-225328734b9a-kube-api-access-q2vfh\") on node \"crc\" DevicePath \"\"" Dec 03 00:31:57 crc kubenswrapper[4903]: I1203 00:31:57.762491 4903 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3fc1c57b-ee2e-422d-9b28-225328734b9a-host\") on node \"crc\" DevicePath \"\"" Dec 03 00:31:58 crc kubenswrapper[4903]: I1203 00:31:58.461848 4903 scope.go:117] "RemoveContainer" containerID="8dda7f8961232a1307cdb24ffe0a34c46ef8f14512508fe211e16ba8acdb959c" Dec 03 00:31:58 crc kubenswrapper[4903]: I1203 00:31:58.461941 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jn8lr/crc-debug-f6v5h" Dec 03 00:31:59 crc kubenswrapper[4903]: I1203 00:31:59.629253 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fc1c57b-ee2e-422d-9b28-225328734b9a" path="/var/lib/kubelet/pods/3fc1c57b-ee2e-422d-9b28-225328734b9a/volumes" Dec 03 00:32:21 crc kubenswrapper[4903]: I1203 00:32:21.527000 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-78cfc5fdf8-p9576_1a80c66a-4cfd-44a2-a5e4-5a9297e63f29/barbican-api/0.log" Dec 03 00:32:21 crc kubenswrapper[4903]: I1203 00:32:21.653222 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-78cfc5fdf8-p9576_1a80c66a-4cfd-44a2-a5e4-5a9297e63f29/barbican-api-log/0.log" Dec 03 00:32:21 crc kubenswrapper[4903]: I1203 00:32:21.755052 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-9559fbfd6-k4fwk_c180d7c5-ad61-4190-b709-6efe6a9a2434/barbican-keystone-listener/0.log" Dec 03 00:32:21 crc kubenswrapper[4903]: I1203 00:32:21.825772 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-9559fbfd6-k4fwk_c180d7c5-ad61-4190-b709-6efe6a9a2434/barbican-keystone-listener-log/0.log" Dec 03 00:32:21 crc kubenswrapper[4903]: I1203 00:32:21.954419 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-65cf4c8457-6ff7v_6ea83627-fed8-458c-a39b-f73e682799d3/barbican-worker-log/0.log" Dec 03 00:32:22 crc kubenswrapper[4903]: I1203 00:32:22.009323 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-65cf4c8457-6ff7v_6ea83627-fed8-458c-a39b-f73e682799d3/barbican-worker/0.log" Dec 03 00:32:22 crc kubenswrapper[4903]: I1203 00:32:22.255169 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-lhvgc_270d4936-772f-40a2-8da3-f2651a216d6b/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:32:22 crc kubenswrapper[4903]: I1203 00:32:22.489805 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab/ceilometer-central-agent/0.log" Dec 03 00:32:22 crc kubenswrapper[4903]: I1203 00:32:22.510829 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab/proxy-httpd/0.log" Dec 03 00:32:22 crc kubenswrapper[4903]: I1203 00:32:22.558468 4903 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab/ceilometer-notification-agent/0.log" Dec 03 00:32:22 crc kubenswrapper[4903]: I1203 00:32:22.604921 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab/sg-core/0.log" Dec 03 00:32:22 crc kubenswrapper[4903]: I1203 00:32:22.811853 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_f98dfcd8-1365-42c3-b939-c34ad3325a09/cinder-api-log/0.log" Dec 03 00:32:23 crc kubenswrapper[4903]: I1203 00:32:23.069498 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 00:32:23 crc kubenswrapper[4903]: I1203 00:32:23.069852 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 00:32:23 crc kubenswrapper[4903]: I1203 00:32:23.069900 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" Dec 03 00:32:23 crc kubenswrapper[4903]: I1203 00:32:23.071974 4903 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6509c99647fa7a0fbc550beba939e42d4c77a4297077ce0b695625e09f2960ff"} pod="openshift-machine-config-operator/machine-config-daemon-snl4q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 00:32:23 crc kubenswrapper[4903]: I1203 00:32:23.072026 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" containerID="cri-o://6509c99647fa7a0fbc550beba939e42d4c77a4297077ce0b695625e09f2960ff" gracePeriod=600 Dec 03 00:32:23 crc kubenswrapper[4903]: I1203 00:32:23.076409 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_864e9292-f08c-493e-8110-5ec88083fde2/probe/0.log" Dec 03 00:32:23 crc kubenswrapper[4903]: E1203 00:32:23.200012 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:32:23 crc kubenswrapper[4903]: I1203 00:32:23.243245 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-db-purge-29412001-pbz9p_334ce527-c86f-4991-bb5a-bb31f27acee1/cinder-db-purge/0.log" Dec 03 00:32:23 crc kubenswrapper[4903]: I1203 00:32:23.310734 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_864e9292-f08c-493e-8110-5ec88083fde2/cinder-backup/0.log" Dec 03 00:32:23 crc kubenswrapper[4903]: I1203 00:32:23.315333 4903 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-api-0_f98dfcd8-1365-42c3-b939-c34ad3325a09/cinder-api/0.log" Dec 03 00:32:23 crc kubenswrapper[4903]: I1203 00:32:23.447894 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_abeb15a2-9a82-49c1-bfdc-bc65cd1920f0/cinder-scheduler/0.log" Dec 03 00:32:23 crc kubenswrapper[4903]: I1203 00:32:23.543168 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_abeb15a2-9a82-49c1-bfdc-bc65cd1920f0/probe/0.log" Dec 03 00:32:23 crc kubenswrapper[4903]: I1203 00:32:23.674136 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8/probe/0.log" Dec 03 00:32:23 crc kubenswrapper[4903]: I1203 00:32:23.722760 4903 generic.go:334] "Generic (PLEG): container finished" podID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerID="6509c99647fa7a0fbc550beba939e42d4c77a4297077ce0b695625e09f2960ff" exitCode=0 Dec 03 00:32:23 crc kubenswrapper[4903]: I1203 00:32:23.722806 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerDied","Data":"6509c99647fa7a0fbc550beba939e42d4c77a4297077ce0b695625e09f2960ff"} Dec 03 00:32:23 crc kubenswrapper[4903]: I1203 00:32:23.722839 4903 scope.go:117] "RemoveContainer" containerID="77d928de1c8f3b8e9d9e9ec7d1938486764dc793f3dd69fd2d6bd21ef010f43f" Dec 03 00:32:23 crc kubenswrapper[4903]: I1203 00:32:23.723558 4903 scope.go:117] "RemoveContainer" containerID="6509c99647fa7a0fbc550beba939e42d4c77a4297077ce0b695625e09f2960ff" Dec 03 00:32:23 crc kubenswrapper[4903]: E1203 00:32:23.723925 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:32:23 crc kubenswrapper[4903]: I1203 00:32:23.786746 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8/cinder-volume/0.log" Dec 03 00:32:23 crc kubenswrapper[4903]: I1203 00:32:23.954196 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_ce4112ef-fcb6-4722-acd0-45bf409867a7/cinder-volume/0.log" Dec 03 00:32:23 crc kubenswrapper[4903]: I1203 00:32:23.963498 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_ce4112ef-fcb6-4722-acd0-45bf409867a7/probe/0.log" Dec 03 00:32:24 crc kubenswrapper[4903]: I1203 00:32:24.017735 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-9k7lq_c6f7512f-83fe-4921-9ccf-17a76752819f/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:32:24 crc kubenswrapper[4903]: I1203 00:32:24.229779 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-n7zlv_d6700daa-2dac-4779-a463-6aea7ae0d54a/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:32:24 crc kubenswrapper[4903]: I1203 00:32:24.289531 4903 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-7bf48746b9-mb6br_609de84d-e5af-4d50-8852-655e6bbb30b9/init/0.log" Dec 03 00:32:24 crc kubenswrapper[4903]: I1203 00:32:24.450978 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7bf48746b9-mb6br_609de84d-e5af-4d50-8852-655e6bbb30b9/init/0.log" Dec 03 00:32:24 crc kubenswrapper[4903]: I1203 00:32:24.566512 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-2qpf7_8dff062c-2479-4ea6-994e-fea352cdf518/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:32:24 crc kubenswrapper[4903]: I1203 00:32:24.635870 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7bf48746b9-mb6br_609de84d-e5af-4d50-8852-655e6bbb30b9/dnsmasq-dns/0.log" Dec 03 00:32:24 crc kubenswrapper[4903]: I1203 00:32:24.761333 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-db-purge-29412001-9nk2t_ce1d9817-bff6-40a4-bc9b-fcbd1510739c/glance-dbpurge/0.log" Dec 03 00:32:24 crc kubenswrapper[4903]: I1203 00:32:24.839337 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ddc0e105-7645-48dc-9450-661c4ca40b01/glance-httpd/0.log" Dec 03 00:32:25 crc kubenswrapper[4903]: I1203 00:32:25.040079 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ddc0e105-7645-48dc-9450-661c4ca40b01/glance-log/0.log" Dec 03 00:32:25 crc kubenswrapper[4903]: I1203 00:32:25.106501 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_3b2ebc8f-392e-4650-a033-a23cbe91436e/glance-log/0.log" Dec 03 00:32:25 crc kubenswrapper[4903]: I1203 00:32:25.112984 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_3b2ebc8f-392e-4650-a033-a23cbe91436e/glance-httpd/0.log" Dec 03 00:32:25 crc kubenswrapper[4903]: I1203 00:32:25.309526 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7857f5d94d-4lclz_c5d26e7e-b21c-4e31-984f-768ef66e0772/horizon/0.log" Dec 03 00:32:25 crc kubenswrapper[4903]: I1203 00:32:25.372071 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-pc895_b8a8af95-c502-4b50-a90e-682b039c6e58/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:32:25 crc kubenswrapper[4903]: I1203 00:32:25.585061 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-p8wmx_346ac594-16d6-478e-9ce4-4d4acb116a99/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:32:25 crc kubenswrapper[4903]: I1203 00:32:25.798985 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29412001-6287z_c445dbad-15ca-4171-ac03-0fd37dbdd474/keystone-cron/0.log" Dec 03 00:32:26 crc kubenswrapper[4903]: I1203 00:32:26.053339 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_8a0962e8-541d-4a75-b629-613d6d19f47e/kube-state-metrics/0.log" Dec 03 00:32:26 crc kubenswrapper[4903]: I1203 00:32:26.173312 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7857f5d94d-4lclz_c5d26e7e-b21c-4e31-984f-768ef66e0772/horizon-log/0.log" Dec 03 00:32:26 crc kubenswrapper[4903]: I1203 00:32:26.190130 4903 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-6s9st_c9427e93-561b-4f09-bcec-00c7001f2541/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:32:26 crc kubenswrapper[4903]: I1203 00:32:26.329685 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-74c5f59c6f-5gx9d_d147d6c4-c17d-4e73-b8a3-efd87eb47f76/keystone-api/0.log" Dec 03 00:32:26 crc kubenswrapper[4903]: I1203 00:32:26.668623 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz_c0b03ee1-07d8-4d8e-b047-480a4dd369f0/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:32:26 crc kubenswrapper[4903]: I1203 00:32:26.759369 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-56f9c8dcd5-hbd9l_c7517345-0440-461c-a78d-a29ef04ecf9c/neutron-api/0.log" Dec 03 00:32:26 crc kubenswrapper[4903]: I1203 00:32:26.767210 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-56f9c8dcd5-hbd9l_c7517345-0440-461c-a78d-a29ef04ecf9c/neutron-httpd/0.log" Dec 03 00:32:27 crc kubenswrapper[4903]: I1203 00:32:27.337733 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_de5b4dd8-9abd-423d-af40-fed7d5fc1de0/nova-cell0-conductor-conductor/0.log" Dec 03 00:32:27 crc kubenswrapper[4903]: I1203 00:32:27.441668 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-db-purge-29412000-xgfnd_f8499f90-daef-4c46-90ef-36aba9557136/nova-manage/0.log" Dec 03 00:32:27 crc kubenswrapper[4903]: I1203 00:32:27.782907 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_66f413c7-2056-4a28-bf9f-9606dcaa5f78/nova-cell1-conductor-conductor/0.log" Dec 03 00:32:28 crc kubenswrapper[4903]: I1203 00:32:28.004185 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-db-purge-29412000-lw5h5_2131c673-5399-4093-92fd-c63b4ce2a8a5/nova-manage/0.log" Dec 03 00:32:28 crc kubenswrapper[4903]: I1203 00:32:28.405242 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_59776a3d-ba94-467b-9b25-2391269821e3/nova-cell1-novncproxy-novncproxy/0.log" Dec 03 00:32:28 crc kubenswrapper[4903]: I1203 00:32:28.413621 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_7235be43-b81b-4894-a75b-4c8444482eba/nova-api-log/0.log" Dec 03 00:32:28 crc kubenswrapper[4903]: I1203 00:32:28.588910 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_7235be43-b81b-4894-a75b-4c8444482eba/nova-api-api/0.log" Dec 03 00:32:28 crc kubenswrapper[4903]: I1203 00:32:28.624492 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-m6rln_013ce0d7-062b-47a7-8831-912380a94a37/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:32:28 crc kubenswrapper[4903]: I1203 00:32:28.751560 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a0018a95-dc74-4511-ade4-c77e4846f0a0/nova-metadata-log/0.log" Dec 03 00:32:29 crc kubenswrapper[4903]: I1203 00:32:29.167441 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a3fa7901-a49c-433f-942c-a875c9ecd2ab/mysql-bootstrap/0.log" Dec 03 00:32:29 crc kubenswrapper[4903]: I1203 00:32:29.190118 4903 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-scheduler-0_dfe7a458-659b-465b-8ab9-712e3a865820/nova-scheduler-scheduler/0.log" Dec 03 00:32:29 crc kubenswrapper[4903]: I1203 00:32:29.416851 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a3fa7901-a49c-433f-942c-a875c9ecd2ab/galera/0.log" Dec 03 00:32:29 crc kubenswrapper[4903]: I1203 00:32:29.426112 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a3fa7901-a49c-433f-942c-a875c9ecd2ab/mysql-bootstrap/0.log" Dec 03 00:32:29 crc kubenswrapper[4903]: I1203 00:32:29.584622 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_6eaac3fd-8033-42cd-90c3-5dfac716ae66/mysql-bootstrap/0.log" Dec 03 00:32:29 crc kubenswrapper[4903]: I1203 00:32:29.827428 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_6eaac3fd-8033-42cd-90c3-5dfac716ae66/galera/0.log" Dec 03 00:32:29 crc kubenswrapper[4903]: I1203 00:32:29.844304 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_6eaac3fd-8033-42cd-90c3-5dfac716ae66/mysql-bootstrap/0.log" Dec 03 00:32:29 crc kubenswrapper[4903]: I1203 00:32:29.999266 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_01e0132f-dfe4-4d3a-9a72-b38b77521ada/openstackclient/0.log" Dec 03 00:32:30 crc kubenswrapper[4903]: I1203 00:32:30.074775 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-lkt78_d72fba58-af32-4b1a-a883-4e76ec6dc3f4/ovn-controller/0.log" Dec 03 00:32:30 crc kubenswrapper[4903]: I1203 00:32:30.307095 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-9fmcf_d7a25811-66de-4b62-ad27-f01f63f539a1/openstack-network-exporter/0.log" Dec 03 00:32:30 crc kubenswrapper[4903]: I1203 00:32:30.475443 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cs6mk_f72d79d2-cc88-4d82-abb4-c24c823532cb/ovsdb-server-init/0.log" Dec 03 00:32:30 crc kubenswrapper[4903]: I1203 00:32:30.637199 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cs6mk_f72d79d2-cc88-4d82-abb4-c24c823532cb/ovsdb-server-init/0.log" Dec 03 00:32:30 crc kubenswrapper[4903]: I1203 00:32:30.723293 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cs6mk_f72d79d2-cc88-4d82-abb4-c24c823532cb/ovsdb-server/0.log" Dec 03 00:32:30 crc kubenswrapper[4903]: I1203 00:32:30.977362 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-hbltm_08be3078-8019-4472-8260-d24032d74b39/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:32:31 crc kubenswrapper[4903]: I1203 00:32:31.000433 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a0018a95-dc74-4511-ade4-c77e4846f0a0/nova-metadata-metadata/0.log" Dec 03 00:32:31 crc kubenswrapper[4903]: I1203 00:32:31.113771 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cs6mk_f72d79d2-cc88-4d82-abb4-c24c823532cb/ovs-vswitchd/0.log" Dec 03 00:32:31 crc kubenswrapper[4903]: I1203 00:32:31.192734 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_adcf8345-41bb-495c-a006-573f6afe5af9/ovn-northd/0.log" Dec 03 00:32:31 crc kubenswrapper[4903]: I1203 00:32:31.212701 4903 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-northd-0_adcf8345-41bb-495c-a006-573f6afe5af9/openstack-network-exporter/0.log" Dec 03 00:32:31 crc kubenswrapper[4903]: I1203 00:32:31.386137 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_635dddd5-1a09-4f9e-b82f-e45eee76b412/openstack-network-exporter/0.log" Dec 03 00:32:31 crc kubenswrapper[4903]: I1203 00:32:31.390276 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_635dddd5-1a09-4f9e-b82f-e45eee76b412/ovsdbserver-nb/0.log" Dec 03 00:32:31 crc kubenswrapper[4903]: I1203 00:32:31.568928 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4ab22df5-5c0a-42c6-a881-4529dd331e5f/openstack-network-exporter/0.log" Dec 03 00:32:31 crc kubenswrapper[4903]: I1203 00:32:31.638280 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4ab22df5-5c0a-42c6-a881-4529dd331e5f/ovsdbserver-sb/0.log" Dec 03 00:32:31 crc kubenswrapper[4903]: I1203 00:32:31.811111 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-665fcbdbd4-lvt55_4b492cef-e99c-4d41-a42b-7377908b5eed/placement-api/0.log" Dec 03 00:32:32 crc kubenswrapper[4903]: I1203 00:32:32.247219 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_61ac9e39-e707-4da2-881e-d9412cf9c136/init-config-reloader/0.log" Dec 03 00:32:32 crc kubenswrapper[4903]: I1203 00:32:32.247298 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_61ac9e39-e707-4da2-881e-d9412cf9c136/init-config-reloader/0.log" Dec 03 00:32:32 crc kubenswrapper[4903]: I1203 00:32:32.247455 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_61ac9e39-e707-4da2-881e-d9412cf9c136/config-reloader/0.log" Dec 03 00:32:32 crc kubenswrapper[4903]: I1203 00:32:32.391109 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-665fcbdbd4-lvt55_4b492cef-e99c-4d41-a42b-7377908b5eed/placement-log/0.log" Dec 03 00:32:32 crc kubenswrapper[4903]: I1203 00:32:32.423498 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_61ac9e39-e707-4da2-881e-d9412cf9c136/prometheus/0.log" Dec 03 00:32:32 crc kubenswrapper[4903]: I1203 00:32:32.518780 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_61ac9e39-e707-4da2-881e-d9412cf9c136/thanos-sidecar/0.log" Dec 03 00:32:32 crc kubenswrapper[4903]: I1203 00:32:32.619864 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_46968896-fe5c-4bf2-a304-51f818ae9cc5/setup-container/0.log" Dec 03 00:32:32 crc kubenswrapper[4903]: I1203 00:32:32.875243 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_46968896-fe5c-4bf2-a304-51f818ae9cc5/setup-container/0.log" Dec 03 00:32:32 crc kubenswrapper[4903]: I1203 00:32:32.879229 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_46968896-fe5c-4bf2-a304-51f818ae9cc5/rabbitmq/0.log" Dec 03 00:32:32 crc kubenswrapper[4903]: I1203 00:32:32.879402 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_adbb82a2-c30f-4e59-be9c-9274739caf25/setup-container/0.log" Dec 03 00:32:33 crc kubenswrapper[4903]: I1203 00:32:33.157065 4903 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-notifications-server-0_adbb82a2-c30f-4e59-be9c-9274739caf25/setup-container/0.log" Dec 03 00:32:33 crc kubenswrapper[4903]: I1203 00:32:33.228537 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_adbb82a2-c30f-4e59-be9c-9274739caf25/rabbitmq/0.log" Dec 03 00:32:33 crc kubenswrapper[4903]: I1203 00:32:33.250009 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_96f2e452-05fe-45c6-940b-5a53959af002/setup-container/0.log" Dec 03 00:32:33 crc kubenswrapper[4903]: I1203 00:32:33.500823 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_96f2e452-05fe-45c6-940b-5a53959af002/rabbitmq/0.log" Dec 03 00:32:33 crc kubenswrapper[4903]: I1203 00:32:33.528336 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_96f2e452-05fe-45c6-940b-5a53959af002/setup-container/0.log" Dec 03 00:32:33 crc kubenswrapper[4903]: I1203 00:32:33.551684 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-vw6h8_88d8ef39-c7d5-45d4-bd56-fbb4a23d0678/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:32:33 crc kubenswrapper[4903]: I1203 00:32:33.730981 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-v5ddf_d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:32:33 crc kubenswrapper[4903]: I1203 00:32:33.763453 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-crjgz_b5c4ae7e-90d7-4090-9357-77e09a38d4f6/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:32:33 crc kubenswrapper[4903]: I1203 00:32:33.989000 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-jhx2v_0b801338-6fdb-42ad-b3f8-67b296c04efd/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:32:34 crc kubenswrapper[4903]: I1203 00:32:34.109473 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-5s77w_70d917fc-dbd8-499d-bcae-b5f324de77cb/ssh-known-hosts-edpm-deployment/0.log" Dec 03 00:32:34 crc kubenswrapper[4903]: I1203 00:32:34.308920 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5489fffdb5-zmhmz_0679a7f8-6bae-4619-b633-ae583358eda7/proxy-server/0.log" Dec 03 00:32:34 crc kubenswrapper[4903]: I1203 00:32:34.554994 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5489fffdb5-zmhmz_0679a7f8-6bae-4619-b633-ae583358eda7/proxy-httpd/0.log" Dec 03 00:32:34 crc kubenswrapper[4903]: I1203 00:32:34.630068 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-gkjrc_f16a381a-80d3-4a60-be1b-e782dab1c73c/swift-ring-rebalance/0.log" Dec 03 00:32:34 crc kubenswrapper[4903]: I1203 00:32:34.734691 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_55e0eb4b-69b7-4845-84aa-77dae4384f32/account-auditor/0.log" Dec 03 00:32:34 crc kubenswrapper[4903]: I1203 00:32:34.769501 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_55e0eb4b-69b7-4845-84aa-77dae4384f32/account-reaper/0.log" Dec 03 00:32:34 crc kubenswrapper[4903]: I1203 00:32:34.859852 4903 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_55e0eb4b-69b7-4845-84aa-77dae4384f32/account-replicator/0.log" Dec 03 00:32:34 crc kubenswrapper[4903]: I1203 00:32:34.948961 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_55e0eb4b-69b7-4845-84aa-77dae4384f32/account-server/0.log" Dec 03 00:32:35 crc kubenswrapper[4903]: I1203 00:32:35.011904 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_55e0eb4b-69b7-4845-84aa-77dae4384f32/container-auditor/0.log" Dec 03 00:32:35 crc kubenswrapper[4903]: I1203 00:32:35.029122 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_55e0eb4b-69b7-4845-84aa-77dae4384f32/container-replicator/0.log" Dec 03 00:32:35 crc kubenswrapper[4903]: I1203 00:32:35.062515 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_55e0eb4b-69b7-4845-84aa-77dae4384f32/container-server/0.log" Dec 03 00:32:35 crc kubenswrapper[4903]: I1203 00:32:35.122688 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_55e0eb4b-69b7-4845-84aa-77dae4384f32/container-updater/0.log" Dec 03 00:32:35 crc kubenswrapper[4903]: I1203 00:32:35.248909 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_55e0eb4b-69b7-4845-84aa-77dae4384f32/object-auditor/0.log" Dec 03 00:32:35 crc kubenswrapper[4903]: I1203 00:32:35.292525 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_55e0eb4b-69b7-4845-84aa-77dae4384f32/object-expirer/0.log" Dec 03 00:32:35 crc kubenswrapper[4903]: I1203 00:32:35.292807 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_55e0eb4b-69b7-4845-84aa-77dae4384f32/object-replicator/0.log" Dec 03 00:32:35 crc kubenswrapper[4903]: I1203 00:32:35.324405 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_55e0eb4b-69b7-4845-84aa-77dae4384f32/object-server/0.log" Dec 03 00:32:35 crc kubenswrapper[4903]: I1203 00:32:35.485980 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_55e0eb4b-69b7-4845-84aa-77dae4384f32/object-updater/0.log" Dec 03 00:32:35 crc kubenswrapper[4903]: I1203 00:32:35.508080 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_55e0eb4b-69b7-4845-84aa-77dae4384f32/rsync/0.log" Dec 03 00:32:35 crc kubenswrapper[4903]: I1203 00:32:35.556684 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_55e0eb4b-69b7-4845-84aa-77dae4384f32/swift-recon-cron/0.log" Dec 03 00:32:35 crc kubenswrapper[4903]: I1203 00:32:35.612819 4903 scope.go:117] "RemoveContainer" containerID="6509c99647fa7a0fbc550beba939e42d4c77a4297077ce0b695625e09f2960ff" Dec 03 00:32:35 crc kubenswrapper[4903]: E1203 00:32:35.613101 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:32:35 crc kubenswrapper[4903]: I1203 00:32:35.748902 4903 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-fkks7_2a2b87ac-e673-475f-9ebc-d3387b0e26f2/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:32:35 crc kubenswrapper[4903]: I1203 00:32:35.864929 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_0a6ff673-e552-4ffc-94a5-5b780fa219c0/tempest-tests-tempest-tests-runner/0.log" Dec 03 00:32:35 crc kubenswrapper[4903]: I1203 00:32:35.903225 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_950ed1bd-b32c-4e34-b973-3fdb5b2c0383/test-operator-logs-container/0.log" Dec 03 00:32:36 crc kubenswrapper[4903]: I1203 00:32:36.080007 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-xf62w_d727ee19-e1d6-4421-9be6-94f429f93494/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:32:37 crc kubenswrapper[4903]: I1203 00:32:37.001262 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_6b000454-0ec3-4f51-ba7a-767530eaf03c/watcher-applier/0.log" Dec 03 00:32:37 crc kubenswrapper[4903]: I1203 00:32:37.743576 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_57a9a701-de78-4dc2-b8a7-365cd41a5693/watcher-api-log/0.log" Dec 03 00:32:40 crc kubenswrapper[4903]: I1203 00:32:40.587555 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_d92eb92f-06d0-4676-9c0f-9f3e427ae019/watcher-decision-engine/0.log" Dec 03 00:32:41 crc kubenswrapper[4903]: I1203 00:32:41.627967 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_57a9a701-de78-4dc2-b8a7-365cd41a5693/watcher-api/0.log" Dec 03 00:32:50 crc kubenswrapper[4903]: I1203 00:32:50.612181 4903 scope.go:117] "RemoveContainer" containerID="6509c99647fa7a0fbc550beba939e42d4c77a4297077ce0b695625e09f2960ff" Dec 03 00:32:50 crc kubenswrapper[4903]: E1203 00:32:50.612872 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:32:55 crc kubenswrapper[4903]: I1203 00:32:55.733716 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_4fdb728a-100d-425d-b83c-245c770afa4b/memcached/0.log" Dec 03 00:33:02 crc kubenswrapper[4903]: I1203 00:33:02.612845 4903 scope.go:117] "RemoveContainer" containerID="6509c99647fa7a0fbc550beba939e42d4c77a4297077ce0b695625e09f2960ff" Dec 03 00:33:02 crc kubenswrapper[4903]: E1203 00:33:02.613719 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:33:07 crc kubenswrapper[4903]: I1203 00:33:07.075161 4903 scope.go:117] "RemoveContainer" 
containerID="6d40fab2f9b7be5849d68ae97807eed56ba8316329dcfe5e56037e4725b00537" Dec 03 00:33:07 crc kubenswrapper[4903]: I1203 00:33:07.097714 4903 scope.go:117] "RemoveContainer" containerID="475a01e30ebee9bca5b794fed2cba9e95f8c2c014b69b3b1e78b4dd5e5708355" Dec 03 00:33:07 crc kubenswrapper[4903]: I1203 00:33:07.119671 4903 scope.go:117] "RemoveContainer" containerID="2f561295d3dbea8a5fa686fedfb935ec6c50642271796e3521202c105b281de3" Dec 03 00:33:08 crc kubenswrapper[4903]: I1203 00:33:08.332900 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-tsj9r_58ddb811-8791-4420-ae35-b3521289b565/kube-rbac-proxy/0.log" Dec 03 00:33:08 crc kubenswrapper[4903]: I1203 00:33:08.458782 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-tsj9r_58ddb811-8791-4420-ae35-b3521289b565/manager/0.log" Dec 03 00:33:08 crc kubenswrapper[4903]: I1203 00:33:08.568256 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-j2zhw_35bd5361-6683-4c7d-b26c-3cac8e7a5bf4/kube-rbac-proxy/0.log" Dec 03 00:33:08 crc kubenswrapper[4903]: I1203 00:33:08.659386 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-j2zhw_35bd5361-6683-4c7d-b26c-3cac8e7a5bf4/manager/0.log" Dec 03 00:33:08 crc kubenswrapper[4903]: I1203 00:33:08.767764 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb_0515602c-3cfc-4fe8-99ec-8a3d18e8f88d/util/0.log" Dec 03 00:33:08 crc kubenswrapper[4903]: I1203 00:33:08.946729 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb_0515602c-3cfc-4fe8-99ec-8a3d18e8f88d/util/0.log" Dec 03 00:33:08 crc kubenswrapper[4903]: I1203 00:33:08.950022 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb_0515602c-3cfc-4fe8-99ec-8a3d18e8f88d/pull/0.log" Dec 03 00:33:08 crc kubenswrapper[4903]: I1203 00:33:08.997919 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb_0515602c-3cfc-4fe8-99ec-8a3d18e8f88d/pull/0.log" Dec 03 00:33:09 crc kubenswrapper[4903]: I1203 00:33:09.361701 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb_0515602c-3cfc-4fe8-99ec-8a3d18e8f88d/util/0.log" Dec 03 00:33:09 crc kubenswrapper[4903]: I1203 00:33:09.431192 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb_0515602c-3cfc-4fe8-99ec-8a3d18e8f88d/extract/0.log" Dec 03 00:33:09 crc kubenswrapper[4903]: I1203 00:33:09.431987 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb_0515602c-3cfc-4fe8-99ec-8a3d18e8f88d/pull/0.log" Dec 03 00:33:09 crc kubenswrapper[4903]: I1203 00:33:09.549442 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-ddkpk_8f5feda5-281a-4c4f-be95-7b96ecc273f9/kube-rbac-proxy/0.log" Dec 03 
00:33:09 crc kubenswrapper[4903]: I1203 00:33:09.602863 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-ddkpk_8f5feda5-281a-4c4f-be95-7b96ecc273f9/manager/0.log" Dec 03 00:33:09 crc kubenswrapper[4903]: I1203 00:33:09.650745 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-78vhb_d0be2ea9-978d-4c79-a623-3b752547d546/kube-rbac-proxy/0.log" Dec 03 00:33:09 crc kubenswrapper[4903]: I1203 00:33:09.795857 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-78vhb_d0be2ea9-978d-4c79-a623-3b752547d546/manager/0.log" Dec 03 00:33:09 crc kubenswrapper[4903]: I1203 00:33:09.829261 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-6dsjr_5046b326-aad3-4aa9-ad84-96b3943a6147/kube-rbac-proxy/0.log" Dec 03 00:33:09 crc kubenswrapper[4903]: I1203 00:33:09.882550 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-6dsjr_5046b326-aad3-4aa9-ad84-96b3943a6147/manager/0.log" Dec 03 00:33:10 crc kubenswrapper[4903]: I1203 00:33:10.014695 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-lwgx2_d3c55b89-b070-410d-8436-a101b0f313cf/kube-rbac-proxy/0.log" Dec 03 00:33:10 crc kubenswrapper[4903]: I1203 00:33:10.073065 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-lwgx2_d3c55b89-b070-410d-8436-a101b0f313cf/manager/0.log" Dec 03 00:33:10 crc kubenswrapper[4903]: I1203 00:33:10.247822 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-dw6n2_e4de4a7c-49fd-48bc-8d5b-75727e7388de/kube-rbac-proxy/0.log" Dec 03 00:33:10 crc kubenswrapper[4903]: I1203 00:33:10.351009 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-hj5mh_e3082dc8-ebbf-4a01-9120-5f1081af7801/kube-rbac-proxy/0.log" Dec 03 00:33:10 crc kubenswrapper[4903]: I1203 00:33:10.412406 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-dw6n2_e4de4a7c-49fd-48bc-8d5b-75727e7388de/manager/0.log" Dec 03 00:33:10 crc kubenswrapper[4903]: I1203 00:33:10.435739 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-hj5mh_e3082dc8-ebbf-4a01-9120-5f1081af7801/manager/0.log" Dec 03 00:33:10 crc kubenswrapper[4903]: I1203 00:33:10.524743 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-s97rj_5c4ccdc6-6205-4108-9146-75a7a963732e/kube-rbac-proxy/0.log" Dec 03 00:33:10 crc kubenswrapper[4903]: I1203 00:33:10.651557 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-s97rj_5c4ccdc6-6205-4108-9146-75a7a963732e/manager/0.log" Dec 03 00:33:10 crc kubenswrapper[4903]: I1203 00:33:10.731681 4903 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-xhm6n_7c596dd6-5f26-4bb7-a771-8c1d57129209/manager/0.log" Dec 03 00:33:10 crc kubenswrapper[4903]: I1203 00:33:10.750211 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-xhm6n_7c596dd6-5f26-4bb7-a771-8c1d57129209/kube-rbac-proxy/0.log" Dec 03 00:33:10 crc kubenswrapper[4903]: I1203 00:33:10.868185 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-6fhjd_7367c4a1-c098-4811-80ba-455509d27216/kube-rbac-proxy/0.log" Dec 03 00:33:10 crc kubenswrapper[4903]: I1203 00:33:10.939377 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-6fhjd_7367c4a1-c098-4811-80ba-455509d27216/manager/0.log" Dec 03 00:33:11 crc kubenswrapper[4903]: I1203 00:33:11.059533 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-gm6wm_723460ec-3116-468b-a628-1b03f5fd4239/kube-rbac-proxy/0.log" Dec 03 00:33:11 crc kubenswrapper[4903]: I1203 00:33:11.117533 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-gm6wm_723460ec-3116-468b-a628-1b03f5fd4239/manager/0.log" Dec 03 00:33:11 crc kubenswrapper[4903]: I1203 00:33:11.133113 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-phs84_d2216dc0-19da-4872-8e82-579f6bd60513/kube-rbac-proxy/0.log" Dec 03 00:33:11 crc kubenswrapper[4903]: I1203 00:33:11.330826 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-phs84_d2216dc0-19da-4872-8e82-579f6bd60513/manager/0.log" Dec 03 00:33:11 crc kubenswrapper[4903]: I1203 00:33:11.355074 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-2rx8r_fc491fc5-9e88-4e1d-9848-ea8846acd82b/kube-rbac-proxy/0.log" Dec 03 00:33:11 crc kubenswrapper[4903]: I1203 00:33:11.367010 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-2rx8r_fc491fc5-9e88-4e1d-9848-ea8846acd82b/manager/0.log" Dec 03 00:33:11 crc kubenswrapper[4903]: I1203 00:33:11.542446 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr_5a01f2d2-8c90-4ccc-bf47-a4f973276988/kube-rbac-proxy/0.log" Dec 03 00:33:11 crc kubenswrapper[4903]: I1203 00:33:11.542629 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr_5a01f2d2-8c90-4ccc-bf47-a4f973276988/manager/0.log" Dec 03 00:33:12 crc kubenswrapper[4903]: I1203 00:33:12.004395 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-756b77799f-tcscw_de3babfe-054a-424f-8b40-e4e43d5f3e5b/operator/0.log" Dec 03 00:33:12 crc kubenswrapper[4903]: I1203 00:33:12.113307 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-scz9b_1e4a5768-36c7-4a71-8bf1-57f9ff69b940/registry-server/0.log" Dec 03 00:33:12 crc kubenswrapper[4903]: I1203 00:33:12.263618 4903 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-htwmh_926767ef-1626-42a1-bd04-6d3f06d89f08/kube-rbac-proxy/0.log" Dec 03 00:33:12 crc kubenswrapper[4903]: I1203 00:33:12.406275 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-htwmh_926767ef-1626-42a1-bd04-6d3f06d89f08/manager/0.log" Dec 03 00:33:12 crc kubenswrapper[4903]: I1203 00:33:12.468137 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-j5jc6_246fe719-e899-408b-a962-702c5db22bfc/kube-rbac-proxy/0.log" Dec 03 00:33:12 crc kubenswrapper[4903]: I1203 00:33:12.506695 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-j5jc6_246fe719-e899-408b-a962-702c5db22bfc/manager/0.log" Dec 03 00:33:12 crc kubenswrapper[4903]: I1203 00:33:12.739838 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-5pdxv_057a4ce0-614e-436a-aaf5-300d5ce6661c/kube-rbac-proxy/0.log" Dec 03 00:33:12 crc kubenswrapper[4903]: I1203 00:33:12.740162 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-wccsg_9cc67e14-1cb4-497f-b0f8-010c2e6d5717/operator/0.log" Dec 03 00:33:12 crc kubenswrapper[4903]: I1203 00:33:12.908862 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-k7zsl_5430813d-ed61-496d-86b6-c9cc1d48aa1f/kube-rbac-proxy/0.log" Dec 03 00:33:13 crc kubenswrapper[4903]: I1203 00:33:13.033093 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-5pdxv_057a4ce0-614e-436a-aaf5-300d5ce6661c/manager/0.log" Dec 03 00:33:13 crc kubenswrapper[4903]: I1203 00:33:13.173586 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-k7zsl_5430813d-ed61-496d-86b6-c9cc1d48aa1f/manager/0.log" Dec 03 00:33:13 crc kubenswrapper[4903]: I1203 00:33:13.188362 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5b9bc7567f-6prdr_61b2d273-f604-4fa0-baba-27dfbab9a350/manager/0.log" Dec 03 00:33:13 crc kubenswrapper[4903]: I1203 00:33:13.220727 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-wp9kf_e6b63e17-4749-429b-8214-92fa7eecfd3c/kube-rbac-proxy/0.log" Dec 03 00:33:13 crc kubenswrapper[4903]: I1203 00:33:13.283943 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-wp9kf_e6b63e17-4749-429b-8214-92fa7eecfd3c/manager/0.log" Dec 03 00:33:13 crc kubenswrapper[4903]: I1203 00:33:13.370462 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-c95d55f7c-jb8p7_dbed5f2e-6049-4adc-a31c-bad1f30c7058/kube-rbac-proxy/0.log" Dec 03 00:33:13 crc kubenswrapper[4903]: I1203 00:33:13.449142 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-c95d55f7c-jb8p7_dbed5f2e-6049-4adc-a31c-bad1f30c7058/manager/0.log" Dec 03 00:33:14 crc kubenswrapper[4903]: I1203 00:33:14.612732 4903 
scope.go:117] "RemoveContainer" containerID="6509c99647fa7a0fbc550beba939e42d4c77a4297077ce0b695625e09f2960ff" Dec 03 00:33:14 crc kubenswrapper[4903]: E1203 00:33:14.612991 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:33:26 crc kubenswrapper[4903]: I1203 00:33:26.648868 4903 scope.go:117] "RemoveContainer" containerID="6509c99647fa7a0fbc550beba939e42d4c77a4297077ce0b695625e09f2960ff" Dec 03 00:33:26 crc kubenswrapper[4903]: E1203 00:33:26.649788 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:33:31 crc kubenswrapper[4903]: I1203 00:33:31.849932 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-gdr6w_ea93257f-fe2e-4062-9b5d-3cd6f53f6fdc/control-plane-machine-set-operator/0.log" Dec 03 00:33:32 crc kubenswrapper[4903]: I1203 00:33:32.023914 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lvnkb_ec790a4a-c562-4035-ba10-9ac0c8baf6c6/kube-rbac-proxy/0.log" Dec 03 00:33:32 crc kubenswrapper[4903]: I1203 00:33:32.054110 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lvnkb_ec790a4a-c562-4035-ba10-9ac0c8baf6c6/machine-api-operator/0.log" Dec 03 00:33:41 crc kubenswrapper[4903]: I1203 00:33:41.612580 4903 scope.go:117] "RemoveContainer" containerID="6509c99647fa7a0fbc550beba939e42d4c77a4297077ce0b695625e09f2960ff" Dec 03 00:33:41 crc kubenswrapper[4903]: E1203 00:33:41.613361 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:33:44 crc kubenswrapper[4903]: I1203 00:33:44.601901 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-98zmz_245517d6-a256-48d4-8140-bd54f1794279/cert-manager-controller/0.log" Dec 03 00:33:44 crc kubenswrapper[4903]: I1203 00:33:44.813027 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-qk25x_b4744872-fc92-4dc7-b64f-dbdc3c32c890/cert-manager-cainjector/0.log" Dec 03 00:33:44 crc kubenswrapper[4903]: I1203 00:33:44.924549 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-tq5jc_62990e39-3700-4b20-9668-d90e0074a402/cert-manager-webhook/0.log" Dec 03 00:33:56 crc kubenswrapper[4903]: I1203 00:33:56.613011 4903 scope.go:117] 
"RemoveContainer" containerID="6509c99647fa7a0fbc550beba939e42d4c77a4297077ce0b695625e09f2960ff" Dec 03 00:33:56 crc kubenswrapper[4903]: E1203 00:33:56.613784 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:33:58 crc kubenswrapper[4903]: I1203 00:33:58.238897 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-wwvz6_ef8f8d87-b435-4583-aa22-2e43892ce34b/nmstate-console-plugin/0.log" Dec 03 00:33:58 crc kubenswrapper[4903]: I1203 00:33:58.433038 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-smkbq_e904a523-8784-443d-b994-bb1aa11e45f4/nmstate-handler/0.log" Dec 03 00:33:58 crc kubenswrapper[4903]: I1203 00:33:58.486132 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-8qx6t_09e08c1d-fdea-4255-accb-8c957d34cfa3/nmstate-metrics/0.log" Dec 03 00:33:58 crc kubenswrapper[4903]: I1203 00:33:58.495876 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-8qx6t_09e08c1d-fdea-4255-accb-8c957d34cfa3/kube-rbac-proxy/0.log" Dec 03 00:33:58 crc kubenswrapper[4903]: I1203 00:33:58.626443 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-gf4p6_524a5581-af2a-48b9-abd3-2f7c2d046b83/nmstate-operator/0.log" Dec 03 00:33:58 crc kubenswrapper[4903]: I1203 00:33:58.774594 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-5lv4b_74c7c4ac-2173-497a-b630-d905326c4749/nmstate-webhook/0.log" Dec 03 00:34:09 crc kubenswrapper[4903]: I1203 00:34:09.612589 4903 scope.go:117] "RemoveContainer" containerID="6509c99647fa7a0fbc550beba939e42d4c77a4297077ce0b695625e09f2960ff" Dec 03 00:34:09 crc kubenswrapper[4903]: E1203 00:34:09.613572 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:34:13 crc kubenswrapper[4903]: I1203 00:34:13.590383 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-b7sxc_3503b383-bf2b-4c83-8a43-3323f7330880/kube-rbac-proxy/0.log" Dec 03 00:34:13 crc kubenswrapper[4903]: I1203 00:34:13.650606 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-b7sxc_3503b383-bf2b-4c83-8a43-3323f7330880/controller/0.log" Dec 03 00:34:13 crc kubenswrapper[4903]: I1203 00:34:13.748177 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gh95g_93baaa1e-7108-463e-82c3-71abc3a34678/cp-frr-files/0.log" Dec 03 00:34:13 crc kubenswrapper[4903]: I1203 00:34:13.960790 4903 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-gh95g_93baaa1e-7108-463e-82c3-71abc3a34678/cp-frr-files/0.log" Dec 03 00:34:13 crc kubenswrapper[4903]: I1203 00:34:13.967734 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gh95g_93baaa1e-7108-463e-82c3-71abc3a34678/cp-reloader/0.log" Dec 03 00:34:13 crc kubenswrapper[4903]: I1203 00:34:13.983355 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gh95g_93baaa1e-7108-463e-82c3-71abc3a34678/cp-metrics/0.log" Dec 03 00:34:14 crc kubenswrapper[4903]: I1203 00:34:14.014363 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gh95g_93baaa1e-7108-463e-82c3-71abc3a34678/cp-reloader/0.log" Dec 03 00:34:14 crc kubenswrapper[4903]: I1203 00:34:14.180856 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gh95g_93baaa1e-7108-463e-82c3-71abc3a34678/cp-frr-files/0.log" Dec 03 00:34:14 crc kubenswrapper[4903]: I1203 00:34:14.203585 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gh95g_93baaa1e-7108-463e-82c3-71abc3a34678/cp-reloader/0.log" Dec 03 00:34:14 crc kubenswrapper[4903]: I1203 00:34:14.208458 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gh95g_93baaa1e-7108-463e-82c3-71abc3a34678/cp-metrics/0.log" Dec 03 00:34:14 crc kubenswrapper[4903]: I1203 00:34:14.208990 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gh95g_93baaa1e-7108-463e-82c3-71abc3a34678/cp-metrics/0.log" Dec 03 00:34:14 crc kubenswrapper[4903]: I1203 00:34:14.397378 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gh95g_93baaa1e-7108-463e-82c3-71abc3a34678/cp-metrics/0.log" Dec 03 00:34:14 crc kubenswrapper[4903]: I1203 00:34:14.417409 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gh95g_93baaa1e-7108-463e-82c3-71abc3a34678/cp-frr-files/0.log" Dec 03 00:34:14 crc kubenswrapper[4903]: I1203 00:34:14.419933 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gh95g_93baaa1e-7108-463e-82c3-71abc3a34678/cp-reloader/0.log" Dec 03 00:34:14 crc kubenswrapper[4903]: I1203 00:34:14.455625 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gh95g_93baaa1e-7108-463e-82c3-71abc3a34678/controller/0.log" Dec 03 00:34:14 crc kubenswrapper[4903]: I1203 00:34:14.571368 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gh95g_93baaa1e-7108-463e-82c3-71abc3a34678/frr-metrics/0.log" Dec 03 00:34:14 crc kubenswrapper[4903]: I1203 00:34:14.589076 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gh95g_93baaa1e-7108-463e-82c3-71abc3a34678/kube-rbac-proxy/0.log" Dec 03 00:34:14 crc kubenswrapper[4903]: I1203 00:34:14.641162 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gh95g_93baaa1e-7108-463e-82c3-71abc3a34678/kube-rbac-proxy-frr/0.log" Dec 03 00:34:14 crc kubenswrapper[4903]: I1203 00:34:14.783713 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gh95g_93baaa1e-7108-463e-82c3-71abc3a34678/reloader/0.log" Dec 03 00:34:14 crc kubenswrapper[4903]: I1203 00:34:14.876180 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-6zx67_b8787de7-b1d1-41fc-bda7-628c8916c8c7/frr-k8s-webhook-server/0.log" Dec 03 
00:34:15 crc kubenswrapper[4903]: I1203 00:34:15.102899 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7b49745895-c8xsg_1548db98-cd62-4c58-88ac-4f4de9512edb/manager/0.log" Dec 03 00:34:15 crc kubenswrapper[4903]: I1203 00:34:15.203029 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-64ddb78498-frglc_2f9aa142-f989-4748-946c-7629a225d6a4/webhook-server/0.log" Dec 03 00:34:15 crc kubenswrapper[4903]: I1203 00:34:15.385048 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-pm26w_c48c624c-4ecb-47d7-affb-bf5527eec659/kube-rbac-proxy/0.log" Dec 03 00:34:16 crc kubenswrapper[4903]: I1203 00:34:16.050199 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-pm26w_c48c624c-4ecb-47d7-affb-bf5527eec659/speaker/0.log" Dec 03 00:34:16 crc kubenswrapper[4903]: I1203 00:34:16.388563 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gh95g_93baaa1e-7108-463e-82c3-71abc3a34678/frr/0.log" Dec 03 00:34:20 crc kubenswrapper[4903]: I1203 00:34:20.611992 4903 scope.go:117] "RemoveContainer" containerID="6509c99647fa7a0fbc550beba939e42d4c77a4297077ce0b695625e09f2960ff" Dec 03 00:34:20 crc kubenswrapper[4903]: E1203 00:34:20.612808 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:34:29 crc kubenswrapper[4903]: I1203 00:34:29.316538 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn_7901882b-863d-4308-8099-8a199965bdbe/util/0.log" Dec 03 00:34:29 crc kubenswrapper[4903]: I1203 00:34:29.712151 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn_7901882b-863d-4308-8099-8a199965bdbe/pull/0.log" Dec 03 00:34:29 crc kubenswrapper[4903]: I1203 00:34:29.732897 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn_7901882b-863d-4308-8099-8a199965bdbe/util/0.log" Dec 03 00:34:29 crc kubenswrapper[4903]: I1203 00:34:29.814961 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn_7901882b-863d-4308-8099-8a199965bdbe/pull/0.log" Dec 03 00:34:30 crc kubenswrapper[4903]: I1203 00:34:30.011396 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn_7901882b-863d-4308-8099-8a199965bdbe/util/0.log" Dec 03 00:34:30 crc kubenswrapper[4903]: I1203 00:34:30.024085 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn_7901882b-863d-4308-8099-8a199965bdbe/extract/0.log" Dec 03 00:34:30 crc kubenswrapper[4903]: I1203 00:34:30.024501 4903 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn_7901882b-863d-4308-8099-8a199965bdbe/pull/0.log" Dec 03 00:34:30 crc kubenswrapper[4903]: I1203 00:34:30.227561 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl_bd27baf0-b0a0-4ffd-a85f-0557c98a4996/util/0.log" Dec 03 00:34:30 crc kubenswrapper[4903]: I1203 00:34:30.417994 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl_bd27baf0-b0a0-4ffd-a85f-0557c98a4996/pull/0.log" Dec 03 00:34:30 crc kubenswrapper[4903]: I1203 00:34:30.430339 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl_bd27baf0-b0a0-4ffd-a85f-0557c98a4996/pull/0.log" Dec 03 00:34:30 crc kubenswrapper[4903]: I1203 00:34:30.436926 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl_bd27baf0-b0a0-4ffd-a85f-0557c98a4996/util/0.log" Dec 03 00:34:30 crc kubenswrapper[4903]: I1203 00:34:30.618498 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl_bd27baf0-b0a0-4ffd-a85f-0557c98a4996/util/0.log" Dec 03 00:34:30 crc kubenswrapper[4903]: I1203 00:34:30.625563 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl_bd27baf0-b0a0-4ffd-a85f-0557c98a4996/extract/0.log" Dec 03 00:34:30 crc kubenswrapper[4903]: I1203 00:34:30.636788 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl_bd27baf0-b0a0-4ffd-a85f-0557c98a4996/pull/0.log" Dec 03 00:34:31 crc kubenswrapper[4903]: I1203 00:34:31.049248 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss_6fb28505-94cf-4bc3-add6-f11756acc2b6/util/0.log" Dec 03 00:34:31 crc kubenswrapper[4903]: I1203 00:34:31.049401 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss_6fb28505-94cf-4bc3-add6-f11756acc2b6/util/0.log" Dec 03 00:34:31 crc kubenswrapper[4903]: I1203 00:34:31.051264 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss_6fb28505-94cf-4bc3-add6-f11756acc2b6/pull/0.log" Dec 03 00:34:31 crc kubenswrapper[4903]: I1203 00:34:31.077431 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss_6fb28505-94cf-4bc3-add6-f11756acc2b6/pull/0.log" Dec 03 00:34:31 crc kubenswrapper[4903]: I1203 00:34:31.240873 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss_6fb28505-94cf-4bc3-add6-f11756acc2b6/util/0.log" Dec 03 00:34:31 crc kubenswrapper[4903]: I1203 00:34:31.288068 4903 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss_6fb28505-94cf-4bc3-add6-f11756acc2b6/pull/0.log" Dec 03 00:34:31 crc kubenswrapper[4903]: I1203 00:34:31.296398 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss_6fb28505-94cf-4bc3-add6-f11756acc2b6/extract/0.log" Dec 03 00:34:31 crc kubenswrapper[4903]: I1203 00:34:31.417953 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dtsgk_716d2470-b915-4dc5-b728-8b5c047e4df6/extract-utilities/0.log" Dec 03 00:34:31 crc kubenswrapper[4903]: I1203 00:34:31.634683 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dtsgk_716d2470-b915-4dc5-b728-8b5c047e4df6/extract-utilities/0.log" Dec 03 00:34:31 crc kubenswrapper[4903]: I1203 00:34:31.653304 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dtsgk_716d2470-b915-4dc5-b728-8b5c047e4df6/extract-content/0.log" Dec 03 00:34:31 crc kubenswrapper[4903]: I1203 00:34:31.750568 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dtsgk_716d2470-b915-4dc5-b728-8b5c047e4df6/extract-content/0.log" Dec 03 00:34:31 crc kubenswrapper[4903]: I1203 00:34:31.845462 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dtsgk_716d2470-b915-4dc5-b728-8b5c047e4df6/extract-utilities/0.log" Dec 03 00:34:31 crc kubenswrapper[4903]: I1203 00:34:31.873201 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dtsgk_716d2470-b915-4dc5-b728-8b5c047e4df6/extract-content/0.log" Dec 03 00:34:32 crc kubenswrapper[4903]: I1203 00:34:32.125788 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cz9pq_4232cb32-9d6c-400d-9deb-8fb5a18f20f8/extract-utilities/0.log" Dec 03 00:34:32 crc kubenswrapper[4903]: I1203 00:34:32.351106 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cz9pq_4232cb32-9d6c-400d-9deb-8fb5a18f20f8/extract-utilities/0.log" Dec 03 00:34:32 crc kubenswrapper[4903]: I1203 00:34:32.451246 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cz9pq_4232cb32-9d6c-400d-9deb-8fb5a18f20f8/extract-content/0.log" Dec 03 00:34:32 crc kubenswrapper[4903]: I1203 00:34:32.481705 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cz9pq_4232cb32-9d6c-400d-9deb-8fb5a18f20f8/extract-content/0.log" Dec 03 00:34:32 crc kubenswrapper[4903]: I1203 00:34:32.633827 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dtsgk_716d2470-b915-4dc5-b728-8b5c047e4df6/registry-server/0.log" Dec 03 00:34:32 crc kubenswrapper[4903]: I1203 00:34:32.674635 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cz9pq_4232cb32-9d6c-400d-9deb-8fb5a18f20f8/extract-utilities/0.log" Dec 03 00:34:32 crc kubenswrapper[4903]: I1203 00:34:32.683059 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cz9pq_4232cb32-9d6c-400d-9deb-8fb5a18f20f8/extract-content/0.log" Dec 03 00:34:32 crc kubenswrapper[4903]: I1203 00:34:32.869003 4903 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-cp8sg_b6ef141a-9183-423b-85e6-e7a02cc32267/marketplace-operator/0.log" Dec 03 00:34:32 crc kubenswrapper[4903]: I1203 00:34:32.939228 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bxtgc_68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa/extract-utilities/0.log" Dec 03 00:34:33 crc kubenswrapper[4903]: I1203 00:34:33.238624 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bxtgc_68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa/extract-content/0.log" Dec 03 00:34:33 crc kubenswrapper[4903]: I1203 00:34:33.250420 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bxtgc_68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa/extract-content/0.log" Dec 03 00:34:33 crc kubenswrapper[4903]: I1203 00:34:33.258396 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bxtgc_68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa/extract-utilities/0.log" Dec 03 00:34:33 crc kubenswrapper[4903]: I1203 00:34:33.460706 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bxtgc_68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa/extract-utilities/0.log" Dec 03 00:34:33 crc kubenswrapper[4903]: I1203 00:34:33.498793 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cz9pq_4232cb32-9d6c-400d-9deb-8fb5a18f20f8/registry-server/0.log" Dec 03 00:34:33 crc kubenswrapper[4903]: I1203 00:34:33.535613 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bxtgc_68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa/extract-content/0.log" Dec 03 00:34:33 crc kubenswrapper[4903]: I1203 00:34:33.693680 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bxtgc_68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa/registry-server/0.log" Dec 03 00:34:33 crc kubenswrapper[4903]: I1203 00:34:33.704277 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v87tp_2708b032-25bc-4098-9b51-71a186f0ac30/extract-utilities/0.log" Dec 03 00:34:33 crc kubenswrapper[4903]: I1203 00:34:33.847857 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v87tp_2708b032-25bc-4098-9b51-71a186f0ac30/extract-utilities/0.log" Dec 03 00:34:33 crc kubenswrapper[4903]: I1203 00:34:33.868783 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v87tp_2708b032-25bc-4098-9b51-71a186f0ac30/extract-content/0.log" Dec 03 00:34:33 crc kubenswrapper[4903]: I1203 00:34:33.888116 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v87tp_2708b032-25bc-4098-9b51-71a186f0ac30/extract-content/0.log" Dec 03 00:34:34 crc kubenswrapper[4903]: I1203 00:34:34.032279 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v87tp_2708b032-25bc-4098-9b51-71a186f0ac30/extract-utilities/0.log" Dec 03 00:34:34 crc kubenswrapper[4903]: I1203 00:34:34.057880 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v87tp_2708b032-25bc-4098-9b51-71a186f0ac30/extract-content/0.log" Dec 03 00:34:34 crc kubenswrapper[4903]: I1203 00:34:34.613039 4903 scope.go:117] 
"RemoveContainer" containerID="6509c99647fa7a0fbc550beba939e42d4c77a4297077ce0b695625e09f2960ff" Dec 03 00:34:34 crc kubenswrapper[4903]: E1203 00:34:34.613523 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:34:34 crc kubenswrapper[4903]: I1203 00:34:34.703035 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v87tp_2708b032-25bc-4098-9b51-71a186f0ac30/registry-server/0.log" Dec 03 00:34:46 crc kubenswrapper[4903]: I1203 00:34:46.612515 4903 scope.go:117] "RemoveContainer" containerID="6509c99647fa7a0fbc550beba939e42d4c77a4297077ce0b695625e09f2960ff" Dec 03 00:34:46 crc kubenswrapper[4903]: E1203 00:34:46.613581 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:34:48 crc kubenswrapper[4903]: I1203 00:34:48.171591 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-bk4xc_7dec0455-1e61-4cbc-893d-600ca1526f90/prometheus-operator/0.log" Dec 03 00:34:48 crc kubenswrapper[4903]: I1203 00:34:48.305364 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-d68788f74-trl6v_9de21c8e-1da1-4105-83c9-c3a0d3fef062/prometheus-operator-admission-webhook/0.log" Dec 03 00:34:48 crc kubenswrapper[4903]: I1203 00:34:48.315551 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-d68788f74-2nxbp_03107dd2-f5e5-4314-87ff-89c1f03811b2/prometheus-operator-admission-webhook/0.log" Dec 03 00:34:48 crc kubenswrapper[4903]: I1203 00:34:48.505119 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-4cm4x_6cc0aefd-91b2-432d-8564-ab955a89620a/operator/0.log" Dec 03 00:34:48 crc kubenswrapper[4903]: I1203 00:34:48.549639 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-4rrnv_c1730406-adf5-4f90-badf-6f40bec034eb/perses-operator/0.log" Dec 03 00:35:01 crc kubenswrapper[4903]: I1203 00:35:01.619161 4903 scope.go:117] "RemoveContainer" containerID="6509c99647fa7a0fbc550beba939e42d4c77a4297077ce0b695625e09f2960ff" Dec 03 00:35:01 crc kubenswrapper[4903]: E1203 00:35:01.619821 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:35:09 crc kubenswrapper[4903]: E1203 00:35:09.774743 4903 
upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.39:40386->38.102.83.39:43931: write tcp 38.102.83.39:40386->38.102.83.39:43931: write: broken pipe Dec 03 00:35:15 crc kubenswrapper[4903]: I1203 00:35:15.613754 4903 scope.go:117] "RemoveContainer" containerID="6509c99647fa7a0fbc550beba939e42d4c77a4297077ce0b695625e09f2960ff" Dec 03 00:35:15 crc kubenswrapper[4903]: E1203 00:35:15.614607 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:35:30 crc kubenswrapper[4903]: I1203 00:35:30.613085 4903 scope.go:117] "RemoveContainer" containerID="6509c99647fa7a0fbc550beba939e42d4c77a4297077ce0b695625e09f2960ff" Dec 03 00:35:30 crc kubenswrapper[4903]: E1203 00:35:30.614423 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:35:43 crc kubenswrapper[4903]: I1203 00:35:43.613083 4903 scope.go:117] "RemoveContainer" containerID="6509c99647fa7a0fbc550beba939e42d4c77a4297077ce0b695625e09f2960ff" Dec 03 00:35:43 crc kubenswrapper[4903]: E1203 00:35:43.613758 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:35:56 crc kubenswrapper[4903]: I1203 00:35:56.612439 4903 scope.go:117] "RemoveContainer" containerID="6509c99647fa7a0fbc550beba939e42d4c77a4297077ce0b695625e09f2960ff" Dec 03 00:35:56 crc kubenswrapper[4903]: E1203 00:35:56.613256 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:36:07 crc kubenswrapper[4903]: I1203 00:36:07.614452 4903 scope.go:117] "RemoveContainer" containerID="6509c99647fa7a0fbc550beba939e42d4c77a4297077ce0b695625e09f2960ff" Dec 03 00:36:07 crc kubenswrapper[4903]: E1203 00:36:07.615485 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 
00:36:19 crc kubenswrapper[4903]: I1203 00:36:19.613202 4903 scope.go:117] "RemoveContainer" containerID="6509c99647fa7a0fbc550beba939e42d4c77a4297077ce0b695625e09f2960ff" Dec 03 00:36:19 crc kubenswrapper[4903]: E1203 00:36:19.614055 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:36:32 crc kubenswrapper[4903]: I1203 00:36:32.612257 4903 scope.go:117] "RemoveContainer" containerID="6509c99647fa7a0fbc550beba939e42d4c77a4297077ce0b695625e09f2960ff" Dec 03 00:36:32 crc kubenswrapper[4903]: E1203 00:36:32.613247 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:36:43 crc kubenswrapper[4903]: I1203 00:36:43.176522 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-frn5v"] Dec 03 00:36:43 crc kubenswrapper[4903]: E1203 00:36:43.177798 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fc1c57b-ee2e-422d-9b28-225328734b9a" containerName="container-00" Dec 03 00:36:43 crc kubenswrapper[4903]: I1203 00:36:43.177814 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fc1c57b-ee2e-422d-9b28-225328734b9a" containerName="container-00" Dec 03 00:36:43 crc kubenswrapper[4903]: I1203 00:36:43.178109 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fc1c57b-ee2e-422d-9b28-225328734b9a" containerName="container-00" Dec 03 00:36:43 crc kubenswrapper[4903]: I1203 00:36:43.180103 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-frn5v" Dec 03 00:36:43 crc kubenswrapper[4903]: I1203 00:36:43.189614 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-frn5v"] Dec 03 00:36:43 crc kubenswrapper[4903]: I1203 00:36:43.234319 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/879abd1d-9fd1-469a-bc17-0718a2434526-utilities\") pod \"redhat-operators-frn5v\" (UID: \"879abd1d-9fd1-469a-bc17-0718a2434526\") " pod="openshift-marketplace/redhat-operators-frn5v" Dec 03 00:36:43 crc kubenswrapper[4903]: I1203 00:36:43.234475 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/879abd1d-9fd1-469a-bc17-0718a2434526-catalog-content\") pod \"redhat-operators-frn5v\" (UID: \"879abd1d-9fd1-469a-bc17-0718a2434526\") " pod="openshift-marketplace/redhat-operators-frn5v" Dec 03 00:36:43 crc kubenswrapper[4903]: I1203 00:36:43.235192 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8vzn\" (UniqueName: \"kubernetes.io/projected/879abd1d-9fd1-469a-bc17-0718a2434526-kube-api-access-m8vzn\") pod \"redhat-operators-frn5v\" (UID: \"879abd1d-9fd1-469a-bc17-0718a2434526\") " pod="openshift-marketplace/redhat-operators-frn5v" Dec 03 00:36:43 crc kubenswrapper[4903]: I1203 00:36:43.337482 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8vzn\" (UniqueName: \"kubernetes.io/projected/879abd1d-9fd1-469a-bc17-0718a2434526-kube-api-access-m8vzn\") pod \"redhat-operators-frn5v\" (UID: \"879abd1d-9fd1-469a-bc17-0718a2434526\") " pod="openshift-marketplace/redhat-operators-frn5v" Dec 03 00:36:43 crc kubenswrapper[4903]: I1203 00:36:43.337624 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/879abd1d-9fd1-469a-bc17-0718a2434526-utilities\") pod \"redhat-operators-frn5v\" (UID: \"879abd1d-9fd1-469a-bc17-0718a2434526\") " pod="openshift-marketplace/redhat-operators-frn5v" Dec 03 00:36:43 crc kubenswrapper[4903]: I1203 00:36:43.337703 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/879abd1d-9fd1-469a-bc17-0718a2434526-catalog-content\") pod \"redhat-operators-frn5v\" (UID: \"879abd1d-9fd1-469a-bc17-0718a2434526\") " pod="openshift-marketplace/redhat-operators-frn5v" Dec 03 00:36:43 crc kubenswrapper[4903]: I1203 00:36:43.338342 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/879abd1d-9fd1-469a-bc17-0718a2434526-catalog-content\") pod \"redhat-operators-frn5v\" (UID: \"879abd1d-9fd1-469a-bc17-0718a2434526\") " pod="openshift-marketplace/redhat-operators-frn5v" Dec 03 00:36:43 crc kubenswrapper[4903]: I1203 00:36:43.339163 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/879abd1d-9fd1-469a-bc17-0718a2434526-utilities\") pod \"redhat-operators-frn5v\" (UID: \"879abd1d-9fd1-469a-bc17-0718a2434526\") " pod="openshift-marketplace/redhat-operators-frn5v" Dec 03 00:36:43 crc kubenswrapper[4903]: I1203 00:36:43.367540 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-m8vzn\" (UniqueName: \"kubernetes.io/projected/879abd1d-9fd1-469a-bc17-0718a2434526-kube-api-access-m8vzn\") pod \"redhat-operators-frn5v\" (UID: \"879abd1d-9fd1-469a-bc17-0718a2434526\") " pod="openshift-marketplace/redhat-operators-frn5v" Dec 03 00:36:43 crc kubenswrapper[4903]: I1203 00:36:43.500939 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-frn5v" Dec 03 00:36:44 crc kubenswrapper[4903]: I1203 00:36:44.080200 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-frn5v"] Dec 03 00:36:44 crc kubenswrapper[4903]: I1203 00:36:44.483067 4903 generic.go:334] "Generic (PLEG): container finished" podID="879abd1d-9fd1-469a-bc17-0718a2434526" containerID="f2b00a87520bd40d1c3a536658abff8b77ea2198dd17b3b3af66436f41fa73c0" exitCode=0 Dec 03 00:36:44 crc kubenswrapper[4903]: I1203 00:36:44.483161 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-frn5v" event={"ID":"879abd1d-9fd1-469a-bc17-0718a2434526","Type":"ContainerDied","Data":"f2b00a87520bd40d1c3a536658abff8b77ea2198dd17b3b3af66436f41fa73c0"} Dec 03 00:36:44 crc kubenswrapper[4903]: I1203 00:36:44.483342 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-frn5v" event={"ID":"879abd1d-9fd1-469a-bc17-0718a2434526","Type":"ContainerStarted","Data":"75eba4e7cd5b32b2fe30d31207b1c8908bd636f91c53e804f5333a3dc8976d65"} Dec 03 00:36:44 crc kubenswrapper[4903]: I1203 00:36:44.484899 4903 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 00:36:45 crc kubenswrapper[4903]: I1203 00:36:45.493000 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-frn5v" event={"ID":"879abd1d-9fd1-469a-bc17-0718a2434526","Type":"ContainerStarted","Data":"994b9d5e5520008a2dfabc494c694ec54ab89d4921ba0d5677813fce338e1228"} Dec 03 00:36:46 crc kubenswrapper[4903]: I1203 00:36:46.613336 4903 scope.go:117] "RemoveContainer" containerID="6509c99647fa7a0fbc550beba939e42d4c77a4297077ce0b695625e09f2960ff" Dec 03 00:36:46 crc kubenswrapper[4903]: E1203 00:36:46.614098 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:36:48 crc kubenswrapper[4903]: I1203 00:36:48.542684 4903 generic.go:334] "Generic (PLEG): container finished" podID="879abd1d-9fd1-469a-bc17-0718a2434526" containerID="994b9d5e5520008a2dfabc494c694ec54ab89d4921ba0d5677813fce338e1228" exitCode=0 Dec 03 00:36:48 crc kubenswrapper[4903]: I1203 00:36:48.542759 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-frn5v" event={"ID":"879abd1d-9fd1-469a-bc17-0718a2434526","Type":"ContainerDied","Data":"994b9d5e5520008a2dfabc494c694ec54ab89d4921ba0d5677813fce338e1228"} Dec 03 00:36:49 crc kubenswrapper[4903]: I1203 00:36:49.556836 4903 generic.go:334] "Generic (PLEG): container finished" podID="4abb1095-900f-47b9-a0ae-b494f350a421" containerID="f07139e5a9ba9d97b8fab7f50301fd10c25f01b902cfb13f0c42df7856c74fa7" exitCode=0 Dec 03 00:36:49 crc kubenswrapper[4903]: I1203 
00:36:49.556937 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jn8lr/must-gather-s66vc" event={"ID":"4abb1095-900f-47b9-a0ae-b494f350a421","Type":"ContainerDied","Data":"f07139e5a9ba9d97b8fab7f50301fd10c25f01b902cfb13f0c42df7856c74fa7"} Dec 03 00:36:49 crc kubenswrapper[4903]: I1203 00:36:49.558548 4903 scope.go:117] "RemoveContainer" containerID="f07139e5a9ba9d97b8fab7f50301fd10c25f01b902cfb13f0c42df7856c74fa7" Dec 03 00:36:50 crc kubenswrapper[4903]: I1203 00:36:50.470271 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jn8lr_must-gather-s66vc_4abb1095-900f-47b9-a0ae-b494f350a421/gather/0.log" Dec 03 00:36:50 crc kubenswrapper[4903]: I1203 00:36:50.577500 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-frn5v" event={"ID":"879abd1d-9fd1-469a-bc17-0718a2434526","Type":"ContainerStarted","Data":"48e83f252a727e7511304ff021b62a6b97525bc684f742d401e87af64ae50d39"} Dec 03 00:36:50 crc kubenswrapper[4903]: I1203 00:36:50.603202 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-frn5v" podStartSLOduration=2.681756869 podStartE2EDuration="7.603183031s" podCreationTimestamp="2025-12-03 00:36:43 +0000 UTC" firstStartedPulling="2025-12-03 00:36:44.484592517 +0000 UTC m=+5943.193146800" lastFinishedPulling="2025-12-03 00:36:49.406018679 +0000 UTC m=+5948.114572962" observedRunningTime="2025-12-03 00:36:50.600957488 +0000 UTC m=+5949.309511801" watchObservedRunningTime="2025-12-03 00:36:50.603183031 +0000 UTC m=+5949.311737324" Dec 03 00:36:53 crc kubenswrapper[4903]: I1203 00:36:53.501505 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-frn5v" Dec 03 00:36:53 crc kubenswrapper[4903]: I1203 00:36:53.502498 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-frn5v" Dec 03 00:36:54 crc kubenswrapper[4903]: I1203 00:36:54.552573 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-frn5v" podUID="879abd1d-9fd1-469a-bc17-0718a2434526" containerName="registry-server" probeResult="failure" output=< Dec 03 00:36:54 crc kubenswrapper[4903]: timeout: failed to connect service ":50051" within 1s Dec 03 00:36:54 crc kubenswrapper[4903]: > Dec 03 00:36:55 crc kubenswrapper[4903]: I1203 00:36:55.432911 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hprnp"] Dec 03 00:36:55 crc kubenswrapper[4903]: I1203 00:36:55.435323 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hprnp"
Dec 03 00:36:55 crc kubenswrapper[4903]: I1203 00:36:55.444354 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hprnp"]
Dec 03 00:36:55 crc kubenswrapper[4903]: I1203 00:36:55.520468 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7d4148d-4936-498a-84c2-de1faf62e6d6-utilities\") pod \"redhat-marketplace-hprnp\" (UID: \"a7d4148d-4936-498a-84c2-de1faf62e6d6\") " pod="openshift-marketplace/redhat-marketplace-hprnp"
Dec 03 00:36:55 crc kubenswrapper[4903]: I1203 00:36:55.520550 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g98tt\" (UniqueName: \"kubernetes.io/projected/a7d4148d-4936-498a-84c2-de1faf62e6d6-kube-api-access-g98tt\") pod \"redhat-marketplace-hprnp\" (UID: \"a7d4148d-4936-498a-84c2-de1faf62e6d6\") " pod="openshift-marketplace/redhat-marketplace-hprnp"
Dec 03 00:36:55 crc kubenswrapper[4903]: I1203 00:36:55.520636 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7d4148d-4936-498a-84c2-de1faf62e6d6-catalog-content\") pod \"redhat-marketplace-hprnp\" (UID: \"a7d4148d-4936-498a-84c2-de1faf62e6d6\") " pod="openshift-marketplace/redhat-marketplace-hprnp"
Dec 03 00:36:55 crc kubenswrapper[4903]: I1203 00:36:55.622299 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7d4148d-4936-498a-84c2-de1faf62e6d6-utilities\") pod \"redhat-marketplace-hprnp\" (UID: \"a7d4148d-4936-498a-84c2-de1faf62e6d6\") " pod="openshift-marketplace/redhat-marketplace-hprnp"
Dec 03 00:36:55 crc kubenswrapper[4903]: I1203 00:36:55.622355 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g98tt\" (UniqueName: \"kubernetes.io/projected/a7d4148d-4936-498a-84c2-de1faf62e6d6-kube-api-access-g98tt\") pod \"redhat-marketplace-hprnp\" (UID: \"a7d4148d-4936-498a-84c2-de1faf62e6d6\") " pod="openshift-marketplace/redhat-marketplace-hprnp"
Dec 03 00:36:55 crc kubenswrapper[4903]: I1203 00:36:55.622407 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7d4148d-4936-498a-84c2-de1faf62e6d6-catalog-content\") pod \"redhat-marketplace-hprnp\" (UID: \"a7d4148d-4936-498a-84c2-de1faf62e6d6\") " pod="openshift-marketplace/redhat-marketplace-hprnp"
Dec 03 00:36:55 crc kubenswrapper[4903]: I1203 00:36:55.622967 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7d4148d-4936-498a-84c2-de1faf62e6d6-utilities\") pod \"redhat-marketplace-hprnp\" (UID: \"a7d4148d-4936-498a-84c2-de1faf62e6d6\") " pod="openshift-marketplace/redhat-marketplace-hprnp"
Dec 03 00:36:55 crc kubenswrapper[4903]: I1203 00:36:55.623182 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7d4148d-4936-498a-84c2-de1faf62e6d6-catalog-content\") pod \"redhat-marketplace-hprnp\" (UID: \"a7d4148d-4936-498a-84c2-de1faf62e6d6\") " pod="openshift-marketplace/redhat-marketplace-hprnp"
Dec 03 00:36:55 crc kubenswrapper[4903]: I1203 00:36:55.654406 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g98tt\" (UniqueName: \"kubernetes.io/projected/a7d4148d-4936-498a-84c2-de1faf62e6d6-kube-api-access-g98tt\") pod \"redhat-marketplace-hprnp\" (UID: \"a7d4148d-4936-498a-84c2-de1faf62e6d6\") " pod="openshift-marketplace/redhat-marketplace-hprnp"
Dec 03 00:36:55 crc kubenswrapper[4903]: I1203 00:36:55.760009 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hprnp"
Dec 03 00:36:56 crc kubenswrapper[4903]: I1203 00:36:56.262569 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hprnp"]
Dec 03 00:36:56 crc kubenswrapper[4903]: I1203 00:36:56.646395 4903 generic.go:334] "Generic (PLEG): container finished" podID="a7d4148d-4936-498a-84c2-de1faf62e6d6" containerID="ccebe01365fffef3cfa45bcfacd85d877d896168462bc56caf95c22138d6a7bb" exitCode=0
Dec 03 00:36:56 crc kubenswrapper[4903]: I1203 00:36:56.646520 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hprnp" event={"ID":"a7d4148d-4936-498a-84c2-de1faf62e6d6","Type":"ContainerDied","Data":"ccebe01365fffef3cfa45bcfacd85d877d896168462bc56caf95c22138d6a7bb"}
Dec 03 00:36:56 crc kubenswrapper[4903]: I1203 00:36:56.646789 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hprnp" event={"ID":"a7d4148d-4936-498a-84c2-de1faf62e6d6","Type":"ContainerStarted","Data":"3887f6282fe47d5195a7777d0639bb54ddbecbc6e90c7a8ead15baf8a27cabca"}
Dec 03 00:36:57 crc kubenswrapper[4903]: I1203 00:36:57.658278 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hprnp" event={"ID":"a7d4148d-4936-498a-84c2-de1faf62e6d6","Type":"ContainerStarted","Data":"43521ab55eacc167e3500738ad63fb65f6d5a99dfac0f68064ddc11dbf279794"}
Dec 03 00:36:58 crc kubenswrapper[4903]: I1203 00:36:58.667979 4903 generic.go:334] "Generic (PLEG): container finished" podID="a7d4148d-4936-498a-84c2-de1faf62e6d6" containerID="43521ab55eacc167e3500738ad63fb65f6d5a99dfac0f68064ddc11dbf279794" exitCode=0
Dec 03 00:36:58 crc kubenswrapper[4903]: I1203 00:36:58.668030 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hprnp" event={"ID":"a7d4148d-4936-498a-84c2-de1faf62e6d6","Type":"ContainerDied","Data":"43521ab55eacc167e3500738ad63fb65f6d5a99dfac0f68064ddc11dbf279794"}
Dec 03 00:36:59 crc kubenswrapper[4903]: I1203 00:36:59.498997 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jn8lr/must-gather-s66vc"]
Dec 03 00:36:59 crc kubenswrapper[4903]: I1203 00:36:59.499703 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-jn8lr/must-gather-s66vc" podUID="4abb1095-900f-47b9-a0ae-b494f350a421" containerName="copy" containerID="cri-o://645428a4e4f373a5fa721152898e146f5302428b4cd7a74f8be838d30562cd64" gracePeriod=2
Dec 03 00:36:59 crc kubenswrapper[4903]: I1203 00:36:59.511422 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jn8lr/must-gather-s66vc"]
Dec 03 00:36:59 crc kubenswrapper[4903]: I1203 00:36:59.687619 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hprnp" event={"ID":"a7d4148d-4936-498a-84c2-de1faf62e6d6","Type":"ContainerStarted","Data":"77514a92f19381a3f79b95c8cfda8e1bd8ed28c6acfd1b18ee6abd6267c99cbd"}
Dec 03 00:36:59 crc kubenswrapper[4903]: I1203 00:36:59.693225 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jn8lr_must-gather-s66vc_4abb1095-900f-47b9-a0ae-b494f350a421/copy/0.log"
Dec 03 00:36:59 crc kubenswrapper[4903]: I1203 00:36:59.694302 4903 generic.go:334] "Generic (PLEG): container finished" podID="4abb1095-900f-47b9-a0ae-b494f350a421" containerID="645428a4e4f373a5fa721152898e146f5302428b4cd7a74f8be838d30562cd64" exitCode=143
Dec 03 00:36:59 crc kubenswrapper[4903]: I1203 00:36:59.721760 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hprnp" podStartSLOduration=2.177797532 podStartE2EDuration="4.721743197s" podCreationTimestamp="2025-12-03 00:36:55 +0000 UTC" firstStartedPulling="2025-12-03 00:36:56.648639091 +0000 UTC m=+5955.357193364" lastFinishedPulling="2025-12-03 00:36:59.192584736 +0000 UTC m=+5957.901139029" observedRunningTime="2025-12-03 00:36:59.706737884 +0000 UTC m=+5958.415292177" watchObservedRunningTime="2025-12-03 00:36:59.721743197 +0000 UTC m=+5958.430297490"
Dec 03 00:36:59 crc kubenswrapper[4903]: I1203 00:36:59.994562 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jn8lr_must-gather-s66vc_4abb1095-900f-47b9-a0ae-b494f350a421/copy/0.log"
Dec 03 00:36:59 crc kubenswrapper[4903]: I1203 00:36:59.995769 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jn8lr/must-gather-s66vc"
Dec 03 00:37:00 crc kubenswrapper[4903]: I1203 00:37:00.122798 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fh9bv\" (UniqueName: \"kubernetes.io/projected/4abb1095-900f-47b9-a0ae-b494f350a421-kube-api-access-fh9bv\") pod \"4abb1095-900f-47b9-a0ae-b494f350a421\" (UID: \"4abb1095-900f-47b9-a0ae-b494f350a421\") "
Dec 03 00:37:00 crc kubenswrapper[4903]: I1203 00:37:00.122934 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4abb1095-900f-47b9-a0ae-b494f350a421-must-gather-output\") pod \"4abb1095-900f-47b9-a0ae-b494f350a421\" (UID: \"4abb1095-900f-47b9-a0ae-b494f350a421\") "
Dec 03 00:37:00 crc kubenswrapper[4903]: I1203 00:37:00.129874 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4abb1095-900f-47b9-a0ae-b494f350a421-kube-api-access-fh9bv" (OuterVolumeSpecName: "kube-api-access-fh9bv") pod "4abb1095-900f-47b9-a0ae-b494f350a421" (UID: "4abb1095-900f-47b9-a0ae-b494f350a421"). InnerVolumeSpecName "kube-api-access-fh9bv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 00:37:00 crc kubenswrapper[4903]: I1203 00:37:00.225199 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fh9bv\" (UniqueName: \"kubernetes.io/projected/4abb1095-900f-47b9-a0ae-b494f350a421-kube-api-access-fh9bv\") on node \"crc\" DevicePath \"\""
Dec 03 00:37:00 crc kubenswrapper[4903]: I1203 00:37:00.366772 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4abb1095-900f-47b9-a0ae-b494f350a421-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "4abb1095-900f-47b9-a0ae-b494f350a421" (UID: "4abb1095-900f-47b9-a0ae-b494f350a421"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 00:37:00 crc kubenswrapper[4903]: I1203 00:37:00.428872 4903 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4abb1095-900f-47b9-a0ae-b494f350a421-must-gather-output\") on node \"crc\" DevicePath \"\""
Dec 03 00:37:00 crc kubenswrapper[4903]: I1203 00:37:00.614519 4903 scope.go:117] "RemoveContainer" containerID="6509c99647fa7a0fbc550beba939e42d4c77a4297077ce0b695625e09f2960ff"
Dec 03 00:37:00 crc kubenswrapper[4903]: E1203 00:37:00.614819 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f"
Dec 03 00:37:00 crc kubenswrapper[4903]: I1203 00:37:00.703044 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jn8lr_must-gather-s66vc_4abb1095-900f-47b9-a0ae-b494f350a421/copy/0.log"
Dec 03 00:37:00 crc kubenswrapper[4903]: I1203 00:37:00.703593 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jn8lr/must-gather-s66vc"
Dec 03 00:37:00 crc kubenswrapper[4903]: I1203 00:37:00.703622 4903 scope.go:117] "RemoveContainer" containerID="645428a4e4f373a5fa721152898e146f5302428b4cd7a74f8be838d30562cd64"
Dec 03 00:37:00 crc kubenswrapper[4903]: I1203 00:37:00.724093 4903 scope.go:117] "RemoveContainer" containerID="f07139e5a9ba9d97b8fab7f50301fd10c25f01b902cfb13f0c42df7856c74fa7"
Dec 03 00:37:01 crc kubenswrapper[4903]: I1203 00:37:01.630076 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4abb1095-900f-47b9-a0ae-b494f350a421" path="/var/lib/kubelet/pods/4abb1095-900f-47b9-a0ae-b494f350a421/volumes"
Dec 03 00:37:03 crc kubenswrapper[4903]: I1203 00:37:03.568114 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-frn5v"
Dec 03 00:37:03 crc kubenswrapper[4903]: I1203 00:37:03.628240 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-frn5v"
Dec 03 00:37:03 crc kubenswrapper[4903]: I1203 00:37:03.836634 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-frn5v"]
Dec 03 00:37:04 crc kubenswrapper[4903]: I1203 00:37:04.736936 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-frn5v" podUID="879abd1d-9fd1-469a-bc17-0718a2434526" containerName="registry-server" containerID="cri-o://48e83f252a727e7511304ff021b62a6b97525bc684f742d401e87af64ae50d39" gracePeriod=2
Dec 03 00:37:05 crc kubenswrapper[4903]: I1203 00:37:05.211245 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-frn5v"
Dec 03 00:37:05 crc kubenswrapper[4903]: I1203 00:37:05.328119 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8vzn\" (UniqueName: \"kubernetes.io/projected/879abd1d-9fd1-469a-bc17-0718a2434526-kube-api-access-m8vzn\") pod \"879abd1d-9fd1-469a-bc17-0718a2434526\" (UID: \"879abd1d-9fd1-469a-bc17-0718a2434526\") "
Dec 03 00:37:05 crc kubenswrapper[4903]: I1203 00:37:05.328253 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/879abd1d-9fd1-469a-bc17-0718a2434526-catalog-content\") pod \"879abd1d-9fd1-469a-bc17-0718a2434526\" (UID: \"879abd1d-9fd1-469a-bc17-0718a2434526\") "
Dec 03 00:37:05 crc kubenswrapper[4903]: I1203 00:37:05.328309 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/879abd1d-9fd1-469a-bc17-0718a2434526-utilities\") pod \"879abd1d-9fd1-469a-bc17-0718a2434526\" (UID: \"879abd1d-9fd1-469a-bc17-0718a2434526\") "
Dec 03 00:37:05 crc kubenswrapper[4903]: I1203 00:37:05.329520 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/879abd1d-9fd1-469a-bc17-0718a2434526-utilities" (OuterVolumeSpecName: "utilities") pod "879abd1d-9fd1-469a-bc17-0718a2434526" (UID: "879abd1d-9fd1-469a-bc17-0718a2434526"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 00:37:05 crc kubenswrapper[4903]: I1203 00:37:05.334815 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/879abd1d-9fd1-469a-bc17-0718a2434526-kube-api-access-m8vzn" (OuterVolumeSpecName: "kube-api-access-m8vzn") pod "879abd1d-9fd1-469a-bc17-0718a2434526" (UID: "879abd1d-9fd1-469a-bc17-0718a2434526"). InnerVolumeSpecName "kube-api-access-m8vzn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 00:37:05 crc kubenswrapper[4903]: I1203 00:37:05.430909 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8vzn\" (UniqueName: \"kubernetes.io/projected/879abd1d-9fd1-469a-bc17-0718a2434526-kube-api-access-m8vzn\") on node \"crc\" DevicePath \"\""
Dec 03 00:37:05 crc kubenswrapper[4903]: I1203 00:37:05.430943 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/879abd1d-9fd1-469a-bc17-0718a2434526-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 00:37:05 crc kubenswrapper[4903]: I1203 00:37:05.448954 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/879abd1d-9fd1-469a-bc17-0718a2434526-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "879abd1d-9fd1-469a-bc17-0718a2434526" (UID: "879abd1d-9fd1-469a-bc17-0718a2434526"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 00:37:05 crc kubenswrapper[4903]: I1203 00:37:05.532510 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/879abd1d-9fd1-469a-bc17-0718a2434526-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 00:37:05 crc kubenswrapper[4903]: I1203 00:37:05.752623 4903 generic.go:334] "Generic (PLEG): container finished" podID="879abd1d-9fd1-469a-bc17-0718a2434526" containerID="48e83f252a727e7511304ff021b62a6b97525bc684f742d401e87af64ae50d39" exitCode=0
Dec 03 00:37:05 crc kubenswrapper[4903]: I1203 00:37:05.752722 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-frn5v"
Dec 03 00:37:05 crc kubenswrapper[4903]: I1203 00:37:05.752704 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-frn5v" event={"ID":"879abd1d-9fd1-469a-bc17-0718a2434526","Type":"ContainerDied","Data":"48e83f252a727e7511304ff021b62a6b97525bc684f742d401e87af64ae50d39"}
Dec 03 00:37:05 crc kubenswrapper[4903]: I1203 00:37:05.752876 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-frn5v" event={"ID":"879abd1d-9fd1-469a-bc17-0718a2434526","Type":"ContainerDied","Data":"75eba4e7cd5b32b2fe30d31207b1c8908bd636f91c53e804f5333a3dc8976d65"}
Dec 03 00:37:05 crc kubenswrapper[4903]: I1203 00:37:05.752906 4903 scope.go:117] "RemoveContainer" containerID="48e83f252a727e7511304ff021b62a6b97525bc684f742d401e87af64ae50d39"
Dec 03 00:37:05 crc kubenswrapper[4903]: I1203 00:37:05.760744 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hprnp"
Dec 03 00:37:05 crc kubenswrapper[4903]: I1203 00:37:05.761216 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hprnp"
Dec 03 00:37:05 crc kubenswrapper[4903]: I1203 00:37:05.791501 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-frn5v"]
Dec 03 00:37:05 crc kubenswrapper[4903]: I1203 00:37:05.792447 4903 scope.go:117] "RemoveContainer" containerID="994b9d5e5520008a2dfabc494c694ec54ab89d4921ba0d5677813fce338e1228"
Dec 03 00:37:05 crc kubenswrapper[4903]: I1203 00:37:05.809964 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-frn5v"]
Dec 03 00:37:05 crc kubenswrapper[4903]: I1203 00:37:05.830942 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hprnp"
Dec 03 00:37:05 crc kubenswrapper[4903]: I1203 00:37:05.834265 4903 scope.go:117] "RemoveContainer" containerID="f2b00a87520bd40d1c3a536658abff8b77ea2198dd17b3b3af66436f41fa73c0"
Dec 03 00:37:05 crc kubenswrapper[4903]: I1203 00:37:05.887670 4903 scope.go:117] "RemoveContainer" containerID="48e83f252a727e7511304ff021b62a6b97525bc684f742d401e87af64ae50d39"
Dec 03 00:37:05 crc kubenswrapper[4903]: E1203 00:37:05.888327 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48e83f252a727e7511304ff021b62a6b97525bc684f742d401e87af64ae50d39\": container with ID starting with 48e83f252a727e7511304ff021b62a6b97525bc684f742d401e87af64ae50d39 not found: ID does not exist" containerID="48e83f252a727e7511304ff021b62a6b97525bc684f742d401e87af64ae50d39"
Dec 03 00:37:05 crc kubenswrapper[4903]: I1203 00:37:05.888359 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48e83f252a727e7511304ff021b62a6b97525bc684f742d401e87af64ae50d39"} err="failed to get container status \"48e83f252a727e7511304ff021b62a6b97525bc684f742d401e87af64ae50d39\": rpc error: code = NotFound desc = could not find container \"48e83f252a727e7511304ff021b62a6b97525bc684f742d401e87af64ae50d39\": container with ID starting with 48e83f252a727e7511304ff021b62a6b97525bc684f742d401e87af64ae50d39 not found: ID does not exist"
Dec 03 00:37:05 crc kubenswrapper[4903]: I1203 00:37:05.888380 4903 scope.go:117] "RemoveContainer" containerID="994b9d5e5520008a2dfabc494c694ec54ab89d4921ba0d5677813fce338e1228"
Dec 03 00:37:05 crc kubenswrapper[4903]: E1203 00:37:05.888841 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"994b9d5e5520008a2dfabc494c694ec54ab89d4921ba0d5677813fce338e1228\": container with ID starting with 994b9d5e5520008a2dfabc494c694ec54ab89d4921ba0d5677813fce338e1228 not found: ID does not exist" containerID="994b9d5e5520008a2dfabc494c694ec54ab89d4921ba0d5677813fce338e1228"
Dec 03 00:37:05 crc kubenswrapper[4903]: I1203 00:37:05.888887 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"994b9d5e5520008a2dfabc494c694ec54ab89d4921ba0d5677813fce338e1228"} err="failed to get container status \"994b9d5e5520008a2dfabc494c694ec54ab89d4921ba0d5677813fce338e1228\": rpc error: code = NotFound desc = could not find container \"994b9d5e5520008a2dfabc494c694ec54ab89d4921ba0d5677813fce338e1228\": container with ID starting with 994b9d5e5520008a2dfabc494c694ec54ab89d4921ba0d5677813fce338e1228 not found: ID does not exist"
Dec 03 00:37:05 crc kubenswrapper[4903]: I1203 00:37:05.888921 4903 scope.go:117] "RemoveContainer" containerID="f2b00a87520bd40d1c3a536658abff8b77ea2198dd17b3b3af66436f41fa73c0"
Dec 03 00:37:05 crc kubenswrapper[4903]: E1203 00:37:05.889251 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2b00a87520bd40d1c3a536658abff8b77ea2198dd17b3b3af66436f41fa73c0\": container with ID starting with f2b00a87520bd40d1c3a536658abff8b77ea2198dd17b3b3af66436f41fa73c0 not found: ID does not exist" containerID="f2b00a87520bd40d1c3a536658abff8b77ea2198dd17b3b3af66436f41fa73c0"
Dec 03 00:37:05 crc kubenswrapper[4903]: I1203 00:37:05.889292 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2b00a87520bd40d1c3a536658abff8b77ea2198dd17b3b3af66436f41fa73c0"} err="failed to get container status \"f2b00a87520bd40d1c3a536658abff8b77ea2198dd17b3b3af66436f41fa73c0\": rpc error: code = NotFound desc = could not find container \"f2b00a87520bd40d1c3a536658abff8b77ea2198dd17b3b3af66436f41fa73c0\": container with ID starting with f2b00a87520bd40d1c3a536658abff8b77ea2198dd17b3b3af66436f41fa73c0 not found: ID does not exist"
Dec 03 00:37:06 crc kubenswrapper[4903]: I1203 00:37:06.815208 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hprnp"
Dec 03 00:37:07 crc kubenswrapper[4903]: I1203 00:37:07.273053 4903 scope.go:117] "RemoveContainer" containerID="97c99e0d6a3a0a590da1f76424c91cafbc2e2796bbac40905a17c3667134560f"
Dec 03 00:37:07 crc kubenswrapper[4903]: I1203 00:37:07.309111 4903 scope.go:117] "RemoveContainer" containerID="aaab28de4e0cf496f2cceafc7ef2627a651b224f890172c82bb99e88273c274a"
Dec 03 00:37:07 crc kubenswrapper[4903]: I1203 00:37:07.352960 4903 scope.go:117] "RemoveContainer" containerID="61f2f233e6b2850825ef864e7f07bd28672a80f8199510b4b19b927da80efb09"
Dec 03 00:37:07 crc kubenswrapper[4903]: I1203 00:37:07.623996 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="879abd1d-9fd1-469a-bc17-0718a2434526" path="/var/lib/kubelet/pods/879abd1d-9fd1-469a-bc17-0718a2434526/volumes"
Dec 03 00:37:08 crc kubenswrapper[4903]: I1203 00:37:08.221277 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hprnp"]
Dec 03 00:37:09 crc kubenswrapper[4903]: I1203 00:37:09.806728 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hprnp" podUID="a7d4148d-4936-498a-84c2-de1faf62e6d6" containerName="registry-server" containerID="cri-o://77514a92f19381a3f79b95c8cfda8e1bd8ed28c6acfd1b18ee6abd6267c99cbd" gracePeriod=2
Dec 03 00:37:10 crc kubenswrapper[4903]: I1203 00:37:10.292176 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hprnp"
Dec 03 00:37:10 crc kubenswrapper[4903]: I1203 00:37:10.443528 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g98tt\" (UniqueName: \"kubernetes.io/projected/a7d4148d-4936-498a-84c2-de1faf62e6d6-kube-api-access-g98tt\") pod \"a7d4148d-4936-498a-84c2-de1faf62e6d6\" (UID: \"a7d4148d-4936-498a-84c2-de1faf62e6d6\") "
Dec 03 00:37:10 crc kubenswrapper[4903]: I1203 00:37:10.443611 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7d4148d-4936-498a-84c2-de1faf62e6d6-utilities\") pod \"a7d4148d-4936-498a-84c2-de1faf62e6d6\" (UID: \"a7d4148d-4936-498a-84c2-de1faf62e6d6\") "
Dec 03 00:37:10 crc kubenswrapper[4903]: I1203 00:37:10.443763 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7d4148d-4936-498a-84c2-de1faf62e6d6-catalog-content\") pod \"a7d4148d-4936-498a-84c2-de1faf62e6d6\" (UID: \"a7d4148d-4936-498a-84c2-de1faf62e6d6\") "
Dec 03 00:37:10 crc kubenswrapper[4903]: I1203 00:37:10.444448 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7d4148d-4936-498a-84c2-de1faf62e6d6-utilities" (OuterVolumeSpecName: "utilities") pod "a7d4148d-4936-498a-84c2-de1faf62e6d6" (UID: "a7d4148d-4936-498a-84c2-de1faf62e6d6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 00:37:10 crc kubenswrapper[4903]: I1203 00:37:10.444610 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7d4148d-4936-498a-84c2-de1faf62e6d6-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 00:37:10 crc kubenswrapper[4903]: I1203 00:37:10.452218 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7d4148d-4936-498a-84c2-de1faf62e6d6-kube-api-access-g98tt" (OuterVolumeSpecName: "kube-api-access-g98tt") pod "a7d4148d-4936-498a-84c2-de1faf62e6d6" (UID: "a7d4148d-4936-498a-84c2-de1faf62e6d6"). InnerVolumeSpecName "kube-api-access-g98tt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 00:37:10 crc kubenswrapper[4903]: I1203 00:37:10.465466 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7d4148d-4936-498a-84c2-de1faf62e6d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a7d4148d-4936-498a-84c2-de1faf62e6d6" (UID: "a7d4148d-4936-498a-84c2-de1faf62e6d6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 00:37:10 crc kubenswrapper[4903]: I1203 00:37:10.547139 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g98tt\" (UniqueName: \"kubernetes.io/projected/a7d4148d-4936-498a-84c2-de1faf62e6d6-kube-api-access-g98tt\") on node \"crc\" DevicePath \"\""
Dec 03 00:37:10 crc kubenswrapper[4903]: I1203 00:37:10.547188 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7d4148d-4936-498a-84c2-de1faf62e6d6-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 00:37:10 crc kubenswrapper[4903]: I1203 00:37:10.822288 4903 generic.go:334] "Generic (PLEG): container finished" podID="a7d4148d-4936-498a-84c2-de1faf62e6d6" containerID="77514a92f19381a3f79b95c8cfda8e1bd8ed28c6acfd1b18ee6abd6267c99cbd" exitCode=0
Dec 03 00:37:10 crc kubenswrapper[4903]: I1203 00:37:10.822352 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hprnp"
Dec 03 00:37:10 crc kubenswrapper[4903]: I1203 00:37:10.822373 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hprnp" event={"ID":"a7d4148d-4936-498a-84c2-de1faf62e6d6","Type":"ContainerDied","Data":"77514a92f19381a3f79b95c8cfda8e1bd8ed28c6acfd1b18ee6abd6267c99cbd"}
Dec 03 00:37:10 crc kubenswrapper[4903]: I1203 00:37:10.822786 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hprnp" event={"ID":"a7d4148d-4936-498a-84c2-de1faf62e6d6","Type":"ContainerDied","Data":"3887f6282fe47d5195a7777d0639bb54ddbecbc6e90c7a8ead15baf8a27cabca"}
Dec 03 00:37:10 crc kubenswrapper[4903]: I1203 00:37:10.822820 4903 scope.go:117] "RemoveContainer" containerID="77514a92f19381a3f79b95c8cfda8e1bd8ed28c6acfd1b18ee6abd6267c99cbd"
Dec 03 00:37:10 crc kubenswrapper[4903]: I1203 00:37:10.851481 4903 scope.go:117] "RemoveContainer" containerID="43521ab55eacc167e3500738ad63fb65f6d5a99dfac0f68064ddc11dbf279794"
Dec 03 00:37:10 crc kubenswrapper[4903]: I1203 00:37:10.881065 4903 scope.go:117] "RemoveContainer" containerID="ccebe01365fffef3cfa45bcfacd85d877d896168462bc56caf95c22138d6a7bb"
Dec 03 00:37:10 crc kubenswrapper[4903]: I1203 00:37:10.891509 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hprnp"]
Dec 03 00:37:10 crc kubenswrapper[4903]: I1203 00:37:10.903495 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hprnp"]
Dec 03 00:37:10 crc kubenswrapper[4903]: I1203 00:37:10.943474 4903 scope.go:117] "RemoveContainer" containerID="77514a92f19381a3f79b95c8cfda8e1bd8ed28c6acfd1b18ee6abd6267c99cbd"
Dec 03 00:37:10 crc kubenswrapper[4903]: E1203 00:37:10.943999 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77514a92f19381a3f79b95c8cfda8e1bd8ed28c6acfd1b18ee6abd6267c99cbd\": container with ID starting with 77514a92f19381a3f79b95c8cfda8e1bd8ed28c6acfd1b18ee6abd6267c99cbd not found: ID does not exist" containerID="77514a92f19381a3f79b95c8cfda8e1bd8ed28c6acfd1b18ee6abd6267c99cbd"
Dec 03 00:37:10 crc kubenswrapper[4903]: I1203 00:37:10.944039 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77514a92f19381a3f79b95c8cfda8e1bd8ed28c6acfd1b18ee6abd6267c99cbd"} err="failed to get container status \"77514a92f19381a3f79b95c8cfda8e1bd8ed28c6acfd1b18ee6abd6267c99cbd\": rpc error: code = NotFound desc = could not find container \"77514a92f19381a3f79b95c8cfda8e1bd8ed28c6acfd1b18ee6abd6267c99cbd\": container with ID starting with 77514a92f19381a3f79b95c8cfda8e1bd8ed28c6acfd1b18ee6abd6267c99cbd not found: ID does not exist"
Dec 03 00:37:10 crc kubenswrapper[4903]: I1203 00:37:10.944067 4903 scope.go:117] "RemoveContainer" containerID="43521ab55eacc167e3500738ad63fb65f6d5a99dfac0f68064ddc11dbf279794"
Dec 03 00:37:10 crc kubenswrapper[4903]: E1203 00:37:10.944434 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43521ab55eacc167e3500738ad63fb65f6d5a99dfac0f68064ddc11dbf279794\": container with ID starting with 43521ab55eacc167e3500738ad63fb65f6d5a99dfac0f68064ddc11dbf279794 not found: ID does not exist" containerID="43521ab55eacc167e3500738ad63fb65f6d5a99dfac0f68064ddc11dbf279794"
Dec 03 00:37:10 crc kubenswrapper[4903]: I1203 00:37:10.944466 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43521ab55eacc167e3500738ad63fb65f6d5a99dfac0f68064ddc11dbf279794"} err="failed to get container status \"43521ab55eacc167e3500738ad63fb65f6d5a99dfac0f68064ddc11dbf279794\": rpc error: code = NotFound desc = could not find container \"43521ab55eacc167e3500738ad63fb65f6d5a99dfac0f68064ddc11dbf279794\": container with ID starting with 43521ab55eacc167e3500738ad63fb65f6d5a99dfac0f68064ddc11dbf279794 not found: ID does not exist"
Dec 03 00:37:10 crc kubenswrapper[4903]: I1203 00:37:10.944485 4903 scope.go:117] "RemoveContainer" containerID="ccebe01365fffef3cfa45bcfacd85d877d896168462bc56caf95c22138d6a7bb"
Dec 03 00:37:10 crc kubenswrapper[4903]: E1203 00:37:10.944926 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccebe01365fffef3cfa45bcfacd85d877d896168462bc56caf95c22138d6a7bb\": container with ID starting with ccebe01365fffef3cfa45bcfacd85d877d896168462bc56caf95c22138d6a7bb not found: ID does not exist" containerID="ccebe01365fffef3cfa45bcfacd85d877d896168462bc56caf95c22138d6a7bb"
Dec 03 00:37:10 crc kubenswrapper[4903]: I1203 00:37:10.944975 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccebe01365fffef3cfa45bcfacd85d877d896168462bc56caf95c22138d6a7bb"} err="failed to get container status \"ccebe01365fffef3cfa45bcfacd85d877d896168462bc56caf95c22138d6a7bb\": rpc error: code = NotFound desc = could not find container \"ccebe01365fffef3cfa45bcfacd85d877d896168462bc56caf95c22138d6a7bb\": container with ID starting with ccebe01365fffef3cfa45bcfacd85d877d896168462bc56caf95c22138d6a7bb not found: ID does not exist"
Dec 03 00:37:11 crc kubenswrapper[4903]: I1203 00:37:11.631240 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7d4148d-4936-498a-84c2-de1faf62e6d6" path="/var/lib/kubelet/pods/a7d4148d-4936-498a-84c2-de1faf62e6d6/volumes"
Dec 03 00:37:15 crc kubenswrapper[4903]: I1203 00:37:15.612023 4903 scope.go:117] "RemoveContainer" containerID="6509c99647fa7a0fbc550beba939e42d4c77a4297077ce0b695625e09f2960ff"
Dec 03 00:37:15 crc kubenswrapper[4903]: E1203 00:37:15.612808 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f"
Dec 03 00:37:27 crc kubenswrapper[4903]: I1203 00:37:27.613446 4903 scope.go:117] "RemoveContainer" containerID="6509c99647fa7a0fbc550beba939e42d4c77a4297077ce0b695625e09f2960ff"
Dec 03 00:37:28 crc kubenswrapper[4903]: I1203 00:37:28.036070 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerStarted","Data":"90d1f718cdb853e4957a5cbc97aa94bf8e725212101e845c47286f852d749dfb"}
Dec 03 00:38:07 crc kubenswrapper[4903]: I1203 00:38:07.461978 4903 scope.go:117] "RemoveContainer" containerID="80681280341d4659b0a4cc9a42939f0db03d19f5dddf37871b5a6c9b539e976a"
Dec 03 00:38:34 crc kubenswrapper[4903]: I1203 00:38:34.022675 4903 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 1.036787151s: [/var/lib/containers/storage/overlay/dbbd845828124c588fac6c7ce734e4da1f4e670e828a370ec3ffc04d86508b0f/diff ]; will not log again for this container unless duration exceeds 2s
Dec 03 00:39:53 crc kubenswrapper[4903]: I1203 00:39:53.070223 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 00:39:53 crc kubenswrapper[4903]: I1203 00:39:53.071114 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 00:40:13 crc kubenswrapper[4903]: I1203 00:40:13.532293 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vn4gt/must-gather-9ntbg"]
Dec 03 00:40:13 crc kubenswrapper[4903]: E1203 00:40:13.533439 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7d4148d-4936-498a-84c2-de1faf62e6d6" containerName="registry-server"
Dec 03 00:40:13 crc kubenswrapper[4903]: I1203 00:40:13.533451 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7d4148d-4936-498a-84c2-de1faf62e6d6" containerName="registry-server"
Dec 03 00:40:13 crc kubenswrapper[4903]: E1203 00:40:13.533461 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7d4148d-4936-498a-84c2-de1faf62e6d6" containerName="extract-utilities"
Dec 03 00:40:13 crc kubenswrapper[4903]: I1203 00:40:13.533469 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7d4148d-4936-498a-84c2-de1faf62e6d6" containerName="extract-utilities"
Dec 03 00:40:13 crc kubenswrapper[4903]: E1203 00:40:13.533483 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4abb1095-900f-47b9-a0ae-b494f350a421" containerName="copy"
Dec 03 00:40:13 crc kubenswrapper[4903]: I1203 00:40:13.533489 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="4abb1095-900f-47b9-a0ae-b494f350a421" containerName="copy"
Dec 03 00:40:13 crc kubenswrapper[4903]: E1203 00:40:13.533507 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="879abd1d-9fd1-469a-bc17-0718a2434526" containerName="extract-content"
Dec 03 00:40:13 crc kubenswrapper[4903]: I1203 00:40:13.533513 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="879abd1d-9fd1-469a-bc17-0718a2434526" containerName="extract-content"
Dec 03 00:40:13 crc kubenswrapper[4903]: E1203 00:40:13.533535 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="879abd1d-9fd1-469a-bc17-0718a2434526" containerName="extract-utilities"
Dec 03 00:40:13 crc kubenswrapper[4903]: I1203 00:40:13.533542 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="879abd1d-9fd1-469a-bc17-0718a2434526" containerName="extract-utilities"
Dec 03 00:40:13 crc kubenswrapper[4903]: E1203 00:40:13.533566 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="879abd1d-9fd1-469a-bc17-0718a2434526" containerName="registry-server"
Dec 03 00:40:13 crc kubenswrapper[4903]: I1203 00:40:13.533572 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="879abd1d-9fd1-469a-bc17-0718a2434526" containerName="registry-server"
Dec 03 00:40:13 crc kubenswrapper[4903]: E1203 00:40:13.533581 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7d4148d-4936-498a-84c2-de1faf62e6d6" containerName="extract-content"
Dec 03 00:40:13 crc kubenswrapper[4903]: I1203 00:40:13.533587 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7d4148d-4936-498a-84c2-de1faf62e6d6" containerName="extract-content"
Dec 03 00:40:13 crc kubenswrapper[4903]: E1203 00:40:13.533616 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4abb1095-900f-47b9-a0ae-b494f350a421" containerName="gather"
Dec 03 00:40:13 crc kubenswrapper[4903]: I1203 00:40:13.533622 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="4abb1095-900f-47b9-a0ae-b494f350a421" containerName="gather"
Dec 03 00:40:13 crc kubenswrapper[4903]: I1203 00:40:13.533980 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="4abb1095-900f-47b9-a0ae-b494f350a421" containerName="copy"
Dec 03 00:40:13 crc kubenswrapper[4903]: I1203 00:40:13.534011 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="4abb1095-900f-47b9-a0ae-b494f350a421" containerName="gather"
Dec 03 00:40:13 crc kubenswrapper[4903]: I1203 00:40:13.534027 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7d4148d-4936-498a-84c2-de1faf62e6d6" containerName="registry-server"
Dec 03 00:40:13 crc kubenswrapper[4903]: I1203 00:40:13.534035 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="879abd1d-9fd1-469a-bc17-0718a2434526" containerName="registry-server"
Dec 03 00:40:13 crc kubenswrapper[4903]: I1203 00:40:13.535580 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vn4gt/must-gather-9ntbg"
Dec 03 00:40:13 crc kubenswrapper[4903]: I1203 00:40:13.540537 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vn4gt"/"kube-root-ca.crt"
Dec 03 00:40:13 crc kubenswrapper[4903]: I1203 00:40:13.540911 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vn4gt"/"openshift-service-ca.crt"
Dec 03 00:40:13 crc kubenswrapper[4903]: I1203 00:40:13.573752 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vn4gt/must-gather-9ntbg"]
Dec 03 00:40:13 crc kubenswrapper[4903]: I1203 00:40:13.635110 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d67acc3f-33a9-4d90-9ecc-ae53be48e1dc-must-gather-output\") pod \"must-gather-9ntbg\" (UID: \"d67acc3f-33a9-4d90-9ecc-ae53be48e1dc\") " pod="openshift-must-gather-vn4gt/must-gather-9ntbg"
Dec 03 00:40:13 crc kubenswrapper[4903]: I1203 00:40:13.635172 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb92v\" (UniqueName: \"kubernetes.io/projected/d67acc3f-33a9-4d90-9ecc-ae53be48e1dc-kube-api-access-pb92v\") pod \"must-gather-9ntbg\" (UID: \"d67acc3f-33a9-4d90-9ecc-ae53be48e1dc\") " pod="openshift-must-gather-vn4gt/must-gather-9ntbg"
Dec 03 00:40:13 crc kubenswrapper[4903]: I1203 00:40:13.737743 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d67acc3f-33a9-4d90-9ecc-ae53be48e1dc-must-gather-output\") pod \"must-gather-9ntbg\" (UID: \"d67acc3f-33a9-4d90-9ecc-ae53be48e1dc\") " pod="openshift-must-gather-vn4gt/must-gather-9ntbg"
Dec 03 00:40:13 crc kubenswrapper[4903]: I1203 00:40:13.737805 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb92v\" (UniqueName: \"kubernetes.io/projected/d67acc3f-33a9-4d90-9ecc-ae53be48e1dc-kube-api-access-pb92v\") pod \"must-gather-9ntbg\" (UID: \"d67acc3f-33a9-4d90-9ecc-ae53be48e1dc\") " pod="openshift-must-gather-vn4gt/must-gather-9ntbg"
Dec 03 00:40:13 crc kubenswrapper[4903]: I1203 00:40:13.738843 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d67acc3f-33a9-4d90-9ecc-ae53be48e1dc-must-gather-output\") pod \"must-gather-9ntbg\" (UID: \"d67acc3f-33a9-4d90-9ecc-ae53be48e1dc\") " pod="openshift-must-gather-vn4gt/must-gather-9ntbg"
Dec 03 00:40:13 crc kubenswrapper[4903]: I1203 00:40:13.774913 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb92v\" (UniqueName: \"kubernetes.io/projected/d67acc3f-33a9-4d90-9ecc-ae53be48e1dc-kube-api-access-pb92v\") pod \"must-gather-9ntbg\" (UID: \"d67acc3f-33a9-4d90-9ecc-ae53be48e1dc\") " pod="openshift-must-gather-vn4gt/must-gather-9ntbg"
Dec 03 00:40:13 crc kubenswrapper[4903]: I1203 00:40:13.877608 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vn4gt/must-gather-9ntbg"
Dec 03 00:40:14 crc kubenswrapper[4903]: I1203 00:40:14.350240 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vn4gt/must-gather-9ntbg"]
Dec 03 00:40:15 crc kubenswrapper[4903]: I1203 00:40:15.178961 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vn4gt/must-gather-9ntbg" event={"ID":"d67acc3f-33a9-4d90-9ecc-ae53be48e1dc","Type":"ContainerStarted","Data":"a73602912c79333fb267945174954cabb279b6f6d475891f3e39aa95f36f1401"}
Dec 03 00:40:15 crc kubenswrapper[4903]: I1203 00:40:15.179296 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vn4gt/must-gather-9ntbg" event={"ID":"d67acc3f-33a9-4d90-9ecc-ae53be48e1dc","Type":"ContainerStarted","Data":"bab92c3e220006c2fd364d739e973ee717f16d3c97f5f67c3a376a2fc36ca4ab"}
Dec 03 00:40:15 crc kubenswrapper[4903]: I1203 00:40:15.179310 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vn4gt/must-gather-9ntbg" event={"ID":"d67acc3f-33a9-4d90-9ecc-ae53be48e1dc","Type":"ContainerStarted","Data":"21caa23b4410c45888b865978bb0fa6741b00a951281a7d89ee022bcd6dc1183"}
Dec 03 00:40:15 crc kubenswrapper[4903]: I1203 00:40:15.197846 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vn4gt/must-gather-9ntbg" podStartSLOduration=2.197812386 podStartE2EDuration="2.197812386s" podCreationTimestamp="2025-12-03 00:40:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 00:40:15.195718875 +0000 UTC m=+6153.904273178" watchObservedRunningTime="2025-12-03 00:40:15.197812386 +0000 UTC m=+6153.906366709"
Dec 03 00:40:18 crc kubenswrapper[4903]: I1203 00:40:18.671012 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vn4gt/crc-debug-gxspw"]
Dec 03 00:40:18 crc kubenswrapper[4903]: I1203 00:40:18.672707 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vn4gt/crc-debug-gxspw"
Dec 03 00:40:18 crc kubenswrapper[4903]: I1203 00:40:18.675143 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-vn4gt"/"default-dockercfg-m5bkj"
Dec 03 00:40:18 crc kubenswrapper[4903]: I1203 00:40:18.737770 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/21527967-bfaf-45db-99a4-3adc0972d5d9-host\") pod \"crc-debug-gxspw\" (UID: \"21527967-bfaf-45db-99a4-3adc0972d5d9\") " pod="openshift-must-gather-vn4gt/crc-debug-gxspw"
Dec 03 00:40:18 crc kubenswrapper[4903]: I1203 00:40:18.738061 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vvh5\" (UniqueName: \"kubernetes.io/projected/21527967-bfaf-45db-99a4-3adc0972d5d9-kube-api-access-7vvh5\") pod \"crc-debug-gxspw\" (UID: \"21527967-bfaf-45db-99a4-3adc0972d5d9\") " pod="openshift-must-gather-vn4gt/crc-debug-gxspw"
Dec 03 00:40:18 crc kubenswrapper[4903]: I1203 00:40:18.839671 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vvh5\" (UniqueName: \"kubernetes.io/projected/21527967-bfaf-45db-99a4-3adc0972d5d9-kube-api-access-7vvh5\") pod \"crc-debug-gxspw\" (UID: \"21527967-bfaf-45db-99a4-3adc0972d5d9\") " pod="openshift-must-gather-vn4gt/crc-debug-gxspw"
Dec 03 00:40:18 crc kubenswrapper[4903]: I1203 00:40:18.839754 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/21527967-bfaf-45db-99a4-3adc0972d5d9-host\") pod \"crc-debug-gxspw\" (UID: \"21527967-bfaf-45db-99a4-3adc0972d5d9\") " pod="openshift-must-gather-vn4gt/crc-debug-gxspw"
Dec 03 00:40:18 crc kubenswrapper[4903]: I1203 00:40:18.839867 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/21527967-bfaf-45db-99a4-3adc0972d5d9-host\") pod \"crc-debug-gxspw\" (UID: \"21527967-bfaf-45db-99a4-3adc0972d5d9\") " pod="openshift-must-gather-vn4gt/crc-debug-gxspw"
Dec 03 00:40:18 crc kubenswrapper[4903]: I1203 00:40:18.857699 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vvh5\" (UniqueName: \"kubernetes.io/projected/21527967-bfaf-45db-99a4-3adc0972d5d9-kube-api-access-7vvh5\") pod \"crc-debug-gxspw\" (UID: \"21527967-bfaf-45db-99a4-3adc0972d5d9\") " pod="openshift-must-gather-vn4gt/crc-debug-gxspw"
Dec 03 00:40:18 crc kubenswrapper[4903]: I1203 00:40:18.994054 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vn4gt/crc-debug-gxspw"
Dec 03 00:40:19 crc kubenswrapper[4903]: W1203 00:40:19.026324 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21527967_bfaf_45db_99a4_3adc0972d5d9.slice/crio-9b22be30a58803b0b1ccbdd723ce847aecae016814ed2fb4cf8ac495f05d8a41 WatchSource:0}: Error finding container 9b22be30a58803b0b1ccbdd723ce847aecae016814ed2fb4cf8ac495f05d8a41: Status 404 returned error can't find the container with id 9b22be30a58803b0b1ccbdd723ce847aecae016814ed2fb4cf8ac495f05d8a41
Dec 03 00:40:19 crc kubenswrapper[4903]: I1203 00:40:19.219216 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vn4gt/crc-debug-gxspw" event={"ID":"21527967-bfaf-45db-99a4-3adc0972d5d9","Type":"ContainerStarted","Data":"9b22be30a58803b0b1ccbdd723ce847aecae016814ed2fb4cf8ac495f05d8a41"}
Dec 03 00:40:20 crc kubenswrapper[4903]: I1203 00:40:20.229050 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vn4gt/crc-debug-gxspw" event={"ID":"21527967-bfaf-45db-99a4-3adc0972d5d9","Type":"ContainerStarted","Data":"681f4eb1eea154be6a0ff032a8e793cb6ad05b13cf5003349b09b121a2da46e0"}
Dec 03 00:40:20 crc kubenswrapper[4903]: I1203 00:40:20.249089 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vn4gt/crc-debug-gxspw" podStartSLOduration=2.249070942 podStartE2EDuration="2.249070942s" podCreationTimestamp="2025-12-03 00:40:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 00:40:20.2394615 +0000 UTC m=+6158.948015783" watchObservedRunningTime="2025-12-03 00:40:20.249070942 +0000 UTC m=+6158.957625215"
Dec 03 00:40:23 crc kubenswrapper[4903]: I1203 00:40:23.069576 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 00:40:23 crc kubenswrapper[4903]: I1203 00:40:23.070233 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 00:40:23 crc kubenswrapper[4903]: I1203 00:40:23.444078 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bd97b"]
Dec 03 00:40:23 crc kubenswrapper[4903]: I1203 00:40:23.446188 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bd97b"
Dec 03 00:40:23 crc kubenswrapper[4903]: I1203 00:40:23.464706 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bd97b"]
Dec 03 00:40:23 crc kubenswrapper[4903]: I1203 00:40:23.534411 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl6jn\" (UniqueName: \"kubernetes.io/projected/98965d0c-bd02-47a4-8a64-16e7de035279-kube-api-access-sl6jn\") pod \"certified-operators-bd97b\" (UID: \"98965d0c-bd02-47a4-8a64-16e7de035279\") " pod="openshift-marketplace/certified-operators-bd97b"
Dec 03 00:40:23 crc kubenswrapper[4903]: I1203 00:40:23.534545 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98965d0c-bd02-47a4-8a64-16e7de035279-utilities\") pod \"certified-operators-bd97b\" (UID: \"98965d0c-bd02-47a4-8a64-16e7de035279\") " pod="openshift-marketplace/certified-operators-bd97b"
Dec 03 00:40:23 crc kubenswrapper[4903]: I1203 00:40:23.534568 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98965d0c-bd02-47a4-8a64-16e7de035279-catalog-content\") pod \"certified-operators-bd97b\" (UID: \"98965d0c-bd02-47a4-8a64-16e7de035279\") " pod="openshift-marketplace/certified-operators-bd97b"
Dec 03 00:40:23 crc kubenswrapper[4903]: I1203 00:40:23.641929 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98965d0c-bd02-47a4-8a64-16e7de035279-utilities\") pod \"certified-operators-bd97b\" (UID: \"98965d0c-bd02-47a4-8a64-16e7de035279\") " pod="openshift-marketplace/certified-operators-bd97b"
Dec 03 00:40:23 crc kubenswrapper[4903]: I1203 00:40:23.641972 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98965d0c-bd02-47a4-8a64-16e7de035279-catalog-content\") pod \"certified-operators-bd97b\" (UID: \"98965d0c-bd02-47a4-8a64-16e7de035279\") " pod="openshift-marketplace/certified-operators-bd97b"
Dec 03 00:40:23 crc kubenswrapper[4903]: I1203 00:40:23.642111 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl6jn\" (UniqueName: \"kubernetes.io/projected/98965d0c-bd02-47a4-8a64-16e7de035279-kube-api-access-sl6jn\") pod \"certified-operators-bd97b\" (UID: \"98965d0c-bd02-47a4-8a64-16e7de035279\") " pod="openshift-marketplace/certified-operators-bd97b"
Dec 03 00:40:23 crc kubenswrapper[4903]: I1203 00:40:23.642604 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98965d0c-bd02-47a4-8a64-16e7de035279-utilities\") pod \"certified-operators-bd97b\" (UID: \"98965d0c-bd02-47a4-8a64-16e7de035279\") " pod="openshift-marketplace/certified-operators-bd97b"
Dec 03 00:40:23 crc kubenswrapper[4903]: I1203 00:40:23.643051 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98965d0c-bd02-47a4-8a64-16e7de035279-catalog-content\") pod \"certified-operators-bd97b\" (UID: \"98965d0c-bd02-47a4-8a64-16e7de035279\") " pod="openshift-marketplace/certified-operators-bd97b"
Dec 03 00:40:23 crc kubenswrapper[4903]: I1203 00:40:23.664711 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl6jn\" (UniqueName: \"kubernetes.io/projected/98965d0c-bd02-47a4-8a64-16e7de035279-kube-api-access-sl6jn\") pod \"certified-operators-bd97b\" (UID: \"98965d0c-bd02-47a4-8a64-16e7de035279\") " pod="openshift-marketplace/certified-operators-bd97b"
Dec 03 00:40:23 crc kubenswrapper[4903]: I1203 00:40:23.783564 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bd97b"
Dec 03 00:40:24 crc kubenswrapper[4903]: I1203 00:40:24.448089 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bd97b"]
Dec 03 00:40:25 crc kubenswrapper[4903]: I1203 00:40:25.285382 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bd97b" event={"ID":"98965d0c-bd02-47a4-8a64-16e7de035279","Type":"ContainerStarted","Data":"655b7bb1bd7f7eb31282a015759d85ed6f595ab2bd4ce149fce667c7a70d2e76"}
Dec 03 00:40:26 crc kubenswrapper[4903]: I1203 00:40:26.295292 4903 generic.go:334] "Generic (PLEG): container finished" podID="98965d0c-bd02-47a4-8a64-16e7de035279" containerID="9e5a59e2c8831c48443ac50ed67e8b89d719421a638b40b836ef604d65069b5d" exitCode=0
Dec 03 00:40:26 crc kubenswrapper[4903]: I1203 00:40:26.295330 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bd97b" event={"ID":"98965d0c-bd02-47a4-8a64-16e7de035279","Type":"ContainerDied","Data":"9e5a59e2c8831c48443ac50ed67e8b89d719421a638b40b836ef604d65069b5d"}
Dec 03 00:40:27 crc kubenswrapper[4903]: I1203 00:40:27.238718 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-slb7c"]
Dec 03 00:40:27 crc kubenswrapper[4903]: I1203 00:40:27.241802 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-slb7c"
Dec 03 00:40:27 crc kubenswrapper[4903]: I1203 00:40:27.260064 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f29bfd7f-77aa-497c-b680-4458722cac38-utilities\") pod \"community-operators-slb7c\" (UID: \"f29bfd7f-77aa-497c-b680-4458722cac38\") " pod="openshift-marketplace/community-operators-slb7c"
Dec 03 00:40:27 crc kubenswrapper[4903]: I1203 00:40:27.260182 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f29bfd7f-77aa-497c-b680-4458722cac38-catalog-content\") pod \"community-operators-slb7c\" (UID: \"f29bfd7f-77aa-497c-b680-4458722cac38\") " pod="openshift-marketplace/community-operators-slb7c"
Dec 03 00:40:27 crc kubenswrapper[4903]: I1203 00:40:27.260206 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmk7s\" (UniqueName: \"kubernetes.io/projected/f29bfd7f-77aa-497c-b680-4458722cac38-kube-api-access-kmk7s\") pod \"community-operators-slb7c\" (UID: \"f29bfd7f-77aa-497c-b680-4458722cac38\") " pod="openshift-marketplace/community-operators-slb7c"
Dec 03 00:40:27 crc kubenswrapper[4903]: I1203 00:40:27.264670 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-slb7c"]
Dec 03 00:40:27 crc kubenswrapper[4903]: I1203 00:40:27.320574 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bd97b" event={"ID":"98965d0c-bd02-47a4-8a64-16e7de035279","Type":"ContainerStarted","Data":"ab47e3266b7921f21ca10a1aa0c21517cfa48e92754c78c6892575ecec05f8c2"}
Dec 03 00:40:27 crc kubenswrapper[4903]: I1203 00:40:27.363114 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f29bfd7f-77aa-497c-b680-4458722cac38-utilities\") pod \"community-operators-slb7c\" (UID: \"f29bfd7f-77aa-497c-b680-4458722cac38\") " pod="openshift-marketplace/community-operators-slb7c"
Dec 03 00:40:27 crc kubenswrapper[4903]: I1203 00:40:27.363445 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f29bfd7f-77aa-497c-b680-4458722cac38-catalog-content\") pod \"community-operators-slb7c\" (UID: \"f29bfd7f-77aa-497c-b680-4458722cac38\") " pod="openshift-marketplace/community-operators-slb7c"
Dec 03 00:40:27 crc kubenswrapper[4903]: I1203 00:40:27.363526 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmk7s\" (UniqueName: \"kubernetes.io/projected/f29bfd7f-77aa-497c-b680-4458722cac38-kube-api-access-kmk7s\") pod \"community-operators-slb7c\" (UID: \"f29bfd7f-77aa-497c-b680-4458722cac38\") " pod="openshift-marketplace/community-operators-slb7c"
Dec 03 00:40:27 crc kubenswrapper[4903]: I1203 00:40:27.367872 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f29bfd7f-77aa-497c-b680-4458722cac38-utilities\") pod \"community-operators-slb7c\" (UID: \"f29bfd7f-77aa-497c-b680-4458722cac38\") " pod="openshift-marketplace/community-operators-slb7c"
Dec 03 00:40:27 crc kubenswrapper[4903]: I1203 00:40:27.368011 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f29bfd7f-77aa-497c-b680-4458722cac38-catalog-content\") pod \"community-operators-slb7c\" (UID: \"f29bfd7f-77aa-497c-b680-4458722cac38\") " pod="openshift-marketplace/community-operators-slb7c"
Dec 03 00:40:27 crc kubenswrapper[4903]: I1203 00:40:27.385702 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmk7s\" (UniqueName: \"kubernetes.io/projected/f29bfd7f-77aa-497c-b680-4458722cac38-kube-api-access-kmk7s\") pod \"community-operators-slb7c\" (UID: \"f29bfd7f-77aa-497c-b680-4458722cac38\") " pod="openshift-marketplace/community-operators-slb7c"
Dec 03 00:40:27 crc kubenswrapper[4903]: I1203 00:40:27.567772 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-slb7c"
Dec 03 00:40:28 crc kubenswrapper[4903]: I1203 00:40:28.177260 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-slb7c"]
Dec 03 00:40:28 crc kubenswrapper[4903]: I1203 00:40:28.332437 4903 generic.go:334] "Generic (PLEG): container finished" podID="98965d0c-bd02-47a4-8a64-16e7de035279" containerID="ab47e3266b7921f21ca10a1aa0c21517cfa48e92754c78c6892575ecec05f8c2" exitCode=0
Dec 03 00:40:28 crc kubenswrapper[4903]: I1203 00:40:28.332468 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bd97b" event={"ID":"98965d0c-bd02-47a4-8a64-16e7de035279","Type":"ContainerDied","Data":"ab47e3266b7921f21ca10a1aa0c21517cfa48e92754c78c6892575ecec05f8c2"}
Dec 03 00:40:28 crc kubenswrapper[4903]: W1203 00:40:28.931848 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf29bfd7f_77aa_497c_b680_4458722cac38.slice/crio-02717bb85b44acf2c0c2f61eb0f040a93acbaf8ad0d07adcb212ac7023ab9ae1 WatchSource:0}: Error finding container 02717bb85b44acf2c0c2f61eb0f040a93acbaf8ad0d07adcb212ac7023ab9ae1: Status 404 returned error can't find the container with id 02717bb85b44acf2c0c2f61eb0f040a93acbaf8ad0d07adcb212ac7023ab9ae1
Dec 03 00:40:29 crc kubenswrapper[4903]: I1203 00:40:29.344960 4903 generic.go:334] "Generic (PLEG): container finished" podID="f29bfd7f-77aa-497c-b680-4458722cac38" containerID="3fba698e82a8f82e77bff5e33708cc2da4c1d6000ffcacb2ebd171c60a6c3aea" exitCode=0
Dec 03 00:40:29 crc kubenswrapper[4903]: I1203 00:40:29.345062 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-slb7c" event={"ID":"f29bfd7f-77aa-497c-b680-4458722cac38","Type":"ContainerDied","Data":"3fba698e82a8f82e77bff5e33708cc2da4c1d6000ffcacb2ebd171c60a6c3aea"}
Dec 03 00:40:29 crc kubenswrapper[4903]: I1203 00:40:29.345561 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-slb7c" event={"ID":"f29bfd7f-77aa-497c-b680-4458722cac38","Type":"ContainerStarted","Data":"02717bb85b44acf2c0c2f61eb0f040a93acbaf8ad0d07adcb212ac7023ab9ae1"}
Dec 03 00:40:30 crc kubenswrapper[4903]: I1203 00:40:30.357929 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bd97b" event={"ID":"98965d0c-bd02-47a4-8a64-16e7de035279","Type":"ContainerStarted","Data":"3bb34c0be634d132a67af54301f45facb76edddd68462b81c4460c3e27aace1b"}
Dec 03 00:40:30 crc kubenswrapper[4903]: I1203 00:40:30.360069 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-slb7c" event={"ID":"f29bfd7f-77aa-497c-b680-4458722cac38","Type":"ContainerStarted","Data":"0909e3cf1eae474ec61b1f3d7ded8d0fdbfaf27836f9fbda25c2831234ce3131"}
Dec 03 00:40:30 crc kubenswrapper[4903]: I1203 00:40:30.378243 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bd97b" podStartSLOduration=4.308323786 podStartE2EDuration="7.378221675s" podCreationTimestamp="2025-12-03 00:40:23 +0000 UTC" firstStartedPulling="2025-12-03 00:40:26.297078612 +0000 UTC m=+6165.005632895" lastFinishedPulling="2025-12-03 00:40:29.366976501 +0000 UTC m=+6168.075530784" observedRunningTime="2025-12-03 00:40:30.374958617 +0000 UTC m=+6169.083512910" watchObservedRunningTime="2025-12-03 00:40:30.378221675 +0000 UTC m=+6169.086775958"
Dec 03 00:40:31 crc kubenswrapper[4903]: I1203 00:40:31.372367 4903 generic.go:334] "Generic (PLEG): container finished" podID="f29bfd7f-77aa-497c-b680-4458722cac38" containerID="0909e3cf1eae474ec61b1f3d7ded8d0fdbfaf27836f9fbda25c2831234ce3131" exitCode=0
Dec 03 00:40:31 crc kubenswrapper[4903]: I1203 00:40:31.372442 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-slb7c" event={"ID":"f29bfd7f-77aa-497c-b680-4458722cac38","Type":"ContainerDied","Data":"0909e3cf1eae474ec61b1f3d7ded8d0fdbfaf27836f9fbda25c2831234ce3131"}
Dec 03 00:40:33 crc kubenswrapper[4903]: I1203 00:40:33.399393 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-slb7c" event={"ID":"f29bfd7f-77aa-497c-b680-4458722cac38","Type":"ContainerStarted","Data":"97d0fc63998e4ae367314351dec38cb4b0c58dd4b260cc4ff8cc1a806d9b04d1"}
Dec 03 00:40:33 crc kubenswrapper[4903]: I1203 00:40:33.418845 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-slb7c" podStartSLOduration=3.556822876 podStartE2EDuration="6.418827977s" podCreationTimestamp="2025-12-03 00:40:27 +0000 UTC" firstStartedPulling="2025-12-03 00:40:29.346903575 +0000 UTC m=+6168.055457858" lastFinishedPulling="2025-12-03 00:40:32.208908676 +0000 UTC m=+6170.917462959" observedRunningTime="2025-12-03 00:40:33.416613413 +0000 UTC m=+6172.125167706" watchObservedRunningTime="2025-12-03 00:40:33.418827977 +0000 UTC m=+6172.127382260"
Dec 03 00:40:33 crc kubenswrapper[4903]: I1203 00:40:33.783973 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bd97b"
Dec 03 00:40:33 crc kubenswrapper[4903]: I1203 00:40:33.784289 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bd97b"
Dec 03 00:40:33 crc kubenswrapper[4903]: I1203 00:40:33.832837 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bd97b"
Dec 03 00:40:34 crc kubenswrapper[4903]: I1203 00:40:34.460869 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bd97b"
Dec 03 00:40:36 crc kubenswrapper[4903]: I1203 00:40:36.604559 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bd97b"]
Dec 03 00:40:36 crc kubenswrapper[4903]: I1203 00:40:36.605095 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bd97b" podUID="98965d0c-bd02-47a4-8a64-16e7de035279" containerName="registry-server" containerID="cri-o://3bb34c0be634d132a67af54301f45facb76edddd68462b81c4460c3e27aace1b" gracePeriod=2
Dec 03 00:40:37 crc kubenswrapper[4903]: I1203 00:40:37.111109 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bd97b"
Dec 03 00:40:37 crc kubenswrapper[4903]: I1203 00:40:37.277419 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98965d0c-bd02-47a4-8a64-16e7de035279-utilities\") pod \"98965d0c-bd02-47a4-8a64-16e7de035279\" (UID: \"98965d0c-bd02-47a4-8a64-16e7de035279\") "
Dec 03 00:40:37 crc kubenswrapper[4903]: I1203 00:40:37.277632 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98965d0c-bd02-47a4-8a64-16e7de035279-catalog-content\") pod \"98965d0c-bd02-47a4-8a64-16e7de035279\" (UID: \"98965d0c-bd02-47a4-8a64-16e7de035279\") "
Dec 03 00:40:37 crc kubenswrapper[4903]: I1203 00:40:37.277738 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sl6jn\" (UniqueName: \"kubernetes.io/projected/98965d0c-bd02-47a4-8a64-16e7de035279-kube-api-access-sl6jn\") pod \"98965d0c-bd02-47a4-8a64-16e7de035279\" (UID: \"98965d0c-bd02-47a4-8a64-16e7de035279\") "
Dec 03 00:40:37 crc kubenswrapper[4903]: I1203 00:40:37.280779 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98965d0c-bd02-47a4-8a64-16e7de035279-utilities" (OuterVolumeSpecName: "utilities") pod "98965d0c-bd02-47a4-8a64-16e7de035279" (UID: "98965d0c-bd02-47a4-8a64-16e7de035279"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 00:40:37 crc kubenswrapper[4903]: I1203 00:40:37.296813 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98965d0c-bd02-47a4-8a64-16e7de035279-kube-api-access-sl6jn" (OuterVolumeSpecName: "kube-api-access-sl6jn") pod "98965d0c-bd02-47a4-8a64-16e7de035279" (UID: "98965d0c-bd02-47a4-8a64-16e7de035279"). InnerVolumeSpecName "kube-api-access-sl6jn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 00:40:37 crc kubenswrapper[4903]: I1203 00:40:37.333985 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98965d0c-bd02-47a4-8a64-16e7de035279-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "98965d0c-bd02-47a4-8a64-16e7de035279" (UID: "98965d0c-bd02-47a4-8a64-16e7de035279"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 00:40:37 crc kubenswrapper[4903]: I1203 00:40:37.379697 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sl6jn\" (UniqueName: \"kubernetes.io/projected/98965d0c-bd02-47a4-8a64-16e7de035279-kube-api-access-sl6jn\") on node \"crc\" DevicePath \"\""
Dec 03 00:40:37 crc kubenswrapper[4903]: I1203 00:40:37.379747 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98965d0c-bd02-47a4-8a64-16e7de035279-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 00:40:37 crc kubenswrapper[4903]: I1203 00:40:37.379761 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98965d0c-bd02-47a4-8a64-16e7de035279-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 00:40:37 crc kubenswrapper[4903]: I1203 00:40:37.441598 4903 generic.go:334] "Generic (PLEG): container finished" podID="98965d0c-bd02-47a4-8a64-16e7de035279" containerID="3bb34c0be634d132a67af54301f45facb76edddd68462b81c4460c3e27aace1b" exitCode=0
Dec 03 00:40:37 crc kubenswrapper[4903]: I1203 00:40:37.441668 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bd97b" event={"ID":"98965d0c-bd02-47a4-8a64-16e7de035279","Type":"ContainerDied","Data":"3bb34c0be634d132a67af54301f45facb76edddd68462b81c4460c3e27aace1b"}
Dec 03 00:40:37 crc kubenswrapper[4903]: I1203 00:40:37.441691 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bd97b"
Dec 03 00:40:37 crc kubenswrapper[4903]: I1203 00:40:37.441713 4903 scope.go:117] "RemoveContainer" containerID="3bb34c0be634d132a67af54301f45facb76edddd68462b81c4460c3e27aace1b"
Dec 03 00:40:37 crc kubenswrapper[4903]: I1203 00:40:37.441700 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bd97b" event={"ID":"98965d0c-bd02-47a4-8a64-16e7de035279","Type":"ContainerDied","Data":"655b7bb1bd7f7eb31282a015759d85ed6f595ab2bd4ce149fce667c7a70d2e76"}
Dec 03 00:40:37 crc kubenswrapper[4903]: I1203 00:40:37.469412 4903 scope.go:117] "RemoveContainer" containerID="ab47e3266b7921f21ca10a1aa0c21517cfa48e92754c78c6892575ecec05f8c2"
Dec 03 00:40:37 crc kubenswrapper[4903]: I1203 00:40:37.478707 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bd97b"]
Dec 03 00:40:37 crc kubenswrapper[4903]: I1203 00:40:37.488919 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bd97b"]
Dec 03 00:40:37 crc kubenswrapper[4903]: I1203 00:40:37.497412 4903 scope.go:117] "RemoveContainer" containerID="9e5a59e2c8831c48443ac50ed67e8b89d719421a638b40b836ef604d65069b5d"
Dec 03 00:40:37 crc kubenswrapper[4903]: I1203 00:40:37.568252 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-slb7c"
Dec 03 00:40:37 crc kubenswrapper[4903]: I1203 00:40:37.568496 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-slb7c"
Dec 03 00:40:37 crc kubenswrapper[4903]: I1203 00:40:37.579982 4903 scope.go:117] "RemoveContainer" containerID="3bb34c0be634d132a67af54301f45facb76edddd68462b81c4460c3e27aace1b"
Dec 03 00:40:37 crc kubenswrapper[4903]: E1203 00:40:37.580498 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code =
NotFound desc = could not find container \"3bb34c0be634d132a67af54301f45facb76edddd68462b81c4460c3e27aace1b\": container with ID starting with 3bb34c0be634d132a67af54301f45facb76edddd68462b81c4460c3e27aace1b not found: ID does not exist" containerID="3bb34c0be634d132a67af54301f45facb76edddd68462b81c4460c3e27aace1b" Dec 03 00:40:37 crc kubenswrapper[4903]: I1203 00:40:37.580557 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bb34c0be634d132a67af54301f45facb76edddd68462b81c4460c3e27aace1b"} err="failed to get container status \"3bb34c0be634d132a67af54301f45facb76edddd68462b81c4460c3e27aace1b\": rpc error: code = NotFound desc = could not find container \"3bb34c0be634d132a67af54301f45facb76edddd68462b81c4460c3e27aace1b\": container with ID starting with 3bb34c0be634d132a67af54301f45facb76edddd68462b81c4460c3e27aace1b not found: ID does not exist" Dec 03 00:40:37 crc kubenswrapper[4903]: I1203 00:40:37.580610 4903 scope.go:117] "RemoveContainer" containerID="ab47e3266b7921f21ca10a1aa0c21517cfa48e92754c78c6892575ecec05f8c2" Dec 03 00:40:37 crc kubenswrapper[4903]: E1203 00:40:37.580947 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab47e3266b7921f21ca10a1aa0c21517cfa48e92754c78c6892575ecec05f8c2\": container with ID starting with ab47e3266b7921f21ca10a1aa0c21517cfa48e92754c78c6892575ecec05f8c2 not found: ID does not exist" containerID="ab47e3266b7921f21ca10a1aa0c21517cfa48e92754c78c6892575ecec05f8c2" Dec 03 00:40:37 crc kubenswrapper[4903]: I1203 00:40:37.580989 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab47e3266b7921f21ca10a1aa0c21517cfa48e92754c78c6892575ecec05f8c2"} err="failed to get container status \"ab47e3266b7921f21ca10a1aa0c21517cfa48e92754c78c6892575ecec05f8c2\": rpc error: code = NotFound desc = could not find container \"ab47e3266b7921f21ca10a1aa0c21517cfa48e92754c78c6892575ecec05f8c2\": container with ID starting with ab47e3266b7921f21ca10a1aa0c21517cfa48e92754c78c6892575ecec05f8c2 not found: ID does not exist" Dec 03 00:40:37 crc kubenswrapper[4903]: I1203 00:40:37.581015 4903 scope.go:117] "RemoveContainer" containerID="9e5a59e2c8831c48443ac50ed67e8b89d719421a638b40b836ef604d65069b5d" Dec 03 00:40:37 crc kubenswrapper[4903]: E1203 00:40:37.581275 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e5a59e2c8831c48443ac50ed67e8b89d719421a638b40b836ef604d65069b5d\": container with ID starting with 9e5a59e2c8831c48443ac50ed67e8b89d719421a638b40b836ef604d65069b5d not found: ID does not exist" containerID="9e5a59e2c8831c48443ac50ed67e8b89d719421a638b40b836ef604d65069b5d" Dec 03 00:40:37 crc kubenswrapper[4903]: I1203 00:40:37.581302 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e5a59e2c8831c48443ac50ed67e8b89d719421a638b40b836ef604d65069b5d"} err="failed to get container status \"9e5a59e2c8831c48443ac50ed67e8b89d719421a638b40b836ef604d65069b5d\": rpc error: code = NotFound desc = could not find container \"9e5a59e2c8831c48443ac50ed67e8b89d719421a638b40b836ef604d65069b5d\": container with ID starting with 9e5a59e2c8831c48443ac50ed67e8b89d719421a638b40b836ef604d65069b5d not found: ID does not exist" Dec 03 00:40:37 crc kubenswrapper[4903]: I1203 00:40:37.637024 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="98965d0c-bd02-47a4-8a64-16e7de035279" path="/var/lib/kubelet/pods/98965d0c-bd02-47a4-8a64-16e7de035279/volumes" Dec 03 00:40:37 crc kubenswrapper[4903]: I1203 00:40:37.638113 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-slb7c" Dec 03 00:40:38 crc kubenswrapper[4903]: I1203 00:40:38.510480 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-slb7c" Dec 03 00:40:40 crc kubenswrapper[4903]: I1203 00:40:40.012332 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-slb7c"] Dec 03 00:40:41 crc kubenswrapper[4903]: I1203 00:40:41.478246 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-slb7c" podUID="f29bfd7f-77aa-497c-b680-4458722cac38" containerName="registry-server" containerID="cri-o://97d0fc63998e4ae367314351dec38cb4b0c58dd4b260cc4ff8cc1a806d9b04d1" gracePeriod=2 Dec 03 00:40:42 crc kubenswrapper[4903]: I1203 00:40:42.491335 4903 generic.go:334] "Generic (PLEG): container finished" podID="f29bfd7f-77aa-497c-b680-4458722cac38" containerID="97d0fc63998e4ae367314351dec38cb4b0c58dd4b260cc4ff8cc1a806d9b04d1" exitCode=0 Dec 03 00:40:42 crc kubenswrapper[4903]: I1203 00:40:42.491412 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-slb7c" event={"ID":"f29bfd7f-77aa-497c-b680-4458722cac38","Type":"ContainerDied","Data":"97d0fc63998e4ae367314351dec38cb4b0c58dd4b260cc4ff8cc1a806d9b04d1"} Dec 03 00:40:42 crc kubenswrapper[4903]: I1203 00:40:42.491674 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-slb7c" event={"ID":"f29bfd7f-77aa-497c-b680-4458722cac38","Type":"ContainerDied","Data":"02717bb85b44acf2c0c2f61eb0f040a93acbaf8ad0d07adcb212ac7023ab9ae1"} Dec 03 00:40:42 crc kubenswrapper[4903]: I1203 00:40:42.491689 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02717bb85b44acf2c0c2f61eb0f040a93acbaf8ad0d07adcb212ac7023ab9ae1" Dec 03 00:40:42 crc kubenswrapper[4903]: I1203 00:40:42.528909 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-slb7c" Dec 03 00:40:42 crc kubenswrapper[4903]: E1203 00:40:42.571885 4903 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf29bfd7f_77aa_497c_b680_4458722cac38.slice/crio-97d0fc63998e4ae367314351dec38cb4b0c58dd4b260cc4ff8cc1a806d9b04d1.scope\": RecentStats: unable to find data in memory cache]" Dec 03 00:40:42 crc kubenswrapper[4903]: I1203 00:40:42.719547 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f29bfd7f-77aa-497c-b680-4458722cac38-utilities\") pod \"f29bfd7f-77aa-497c-b680-4458722cac38\" (UID: \"f29bfd7f-77aa-497c-b680-4458722cac38\") " Dec 03 00:40:42 crc kubenswrapper[4903]: I1203 00:40:42.719913 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmk7s\" (UniqueName: \"kubernetes.io/projected/f29bfd7f-77aa-497c-b680-4458722cac38-kube-api-access-kmk7s\") pod \"f29bfd7f-77aa-497c-b680-4458722cac38\" (UID: \"f29bfd7f-77aa-497c-b680-4458722cac38\") " Dec 03 00:40:42 crc kubenswrapper[4903]: I1203 00:40:42.719995 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f29bfd7f-77aa-497c-b680-4458722cac38-catalog-content\") pod \"f29bfd7f-77aa-497c-b680-4458722cac38\" (UID: \"f29bfd7f-77aa-497c-b680-4458722cac38\") " Dec 03 00:40:42 crc kubenswrapper[4903]: I1203 00:40:42.720807 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f29bfd7f-77aa-497c-b680-4458722cac38-utilities" (OuterVolumeSpecName: "utilities") pod "f29bfd7f-77aa-497c-b680-4458722cac38" (UID: "f29bfd7f-77aa-497c-b680-4458722cac38"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:40:42 crc kubenswrapper[4903]: I1203 00:40:42.727847 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f29bfd7f-77aa-497c-b680-4458722cac38-kube-api-access-kmk7s" (OuterVolumeSpecName: "kube-api-access-kmk7s") pod "f29bfd7f-77aa-497c-b680-4458722cac38" (UID: "f29bfd7f-77aa-497c-b680-4458722cac38"). InnerVolumeSpecName "kube-api-access-kmk7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:40:42 crc kubenswrapper[4903]: I1203 00:40:42.769935 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f29bfd7f-77aa-497c-b680-4458722cac38-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f29bfd7f-77aa-497c-b680-4458722cac38" (UID: "f29bfd7f-77aa-497c-b680-4458722cac38"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:40:42 crc kubenswrapper[4903]: I1203 00:40:42.822860 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f29bfd7f-77aa-497c-b680-4458722cac38-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 00:40:42 crc kubenswrapper[4903]: I1203 00:40:42.822896 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmk7s\" (UniqueName: \"kubernetes.io/projected/f29bfd7f-77aa-497c-b680-4458722cac38-kube-api-access-kmk7s\") on node \"crc\" DevicePath \"\"" Dec 03 00:40:42 crc kubenswrapper[4903]: I1203 00:40:42.822905 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f29bfd7f-77aa-497c-b680-4458722cac38-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 00:40:43 crc kubenswrapper[4903]: I1203 00:40:43.500962 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-slb7c" Dec 03 00:40:43 crc kubenswrapper[4903]: I1203 00:40:43.530357 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-slb7c"] Dec 03 00:40:43 crc kubenswrapper[4903]: I1203 00:40:43.546881 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-slb7c"] Dec 03 00:40:43 crc kubenswrapper[4903]: I1203 00:40:43.623871 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f29bfd7f-77aa-497c-b680-4458722cac38" path="/var/lib/kubelet/pods/f29bfd7f-77aa-497c-b680-4458722cac38/volumes" Dec 03 00:40:52 crc kubenswrapper[4903]: E1203 00:40:52.915285 4903 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf29bfd7f_77aa_497c_b680_4458722cac38.slice/crio-97d0fc63998e4ae367314351dec38cb4b0c58dd4b260cc4ff8cc1a806d9b04d1.scope\": RecentStats: unable to find data in memory cache]" Dec 03 00:40:53 crc kubenswrapper[4903]: I1203 00:40:53.071807 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 00:40:53 crc kubenswrapper[4903]: I1203 00:40:53.071903 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 00:40:53 crc kubenswrapper[4903]: I1203 00:40:53.072169 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" Dec 03 00:40:53 crc kubenswrapper[4903]: I1203 00:40:53.073419 4903 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"90d1f718cdb853e4957a5cbc97aa94bf8e725212101e845c47286f852d749dfb"} pod="openshift-machine-config-operator/machine-config-daemon-snl4q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 00:40:53 crc kubenswrapper[4903]: I1203 00:40:53.073490 4903 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" containerID="cri-o://90d1f718cdb853e4957a5cbc97aa94bf8e725212101e845c47286f852d749dfb" gracePeriod=600 Dec 03 00:40:53 crc kubenswrapper[4903]: I1203 00:40:53.622732 4903 generic.go:334] "Generic (PLEG): container finished" podID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerID="90d1f718cdb853e4957a5cbc97aa94bf8e725212101e845c47286f852d749dfb" exitCode=0 Dec 03 00:40:53 crc kubenswrapper[4903]: I1203 00:40:53.623022 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerDied","Data":"90d1f718cdb853e4957a5cbc97aa94bf8e725212101e845c47286f852d749dfb"} Dec 03 00:40:53 crc kubenswrapper[4903]: I1203 00:40:53.623044 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerStarted","Data":"b09207d5035946cdfbdbe4c7a2f34bdcfece73162faa7631afa02b7b1a1f24e3"} Dec 03 00:40:53 crc kubenswrapper[4903]: I1203 00:40:53.623061 4903 scope.go:117] "RemoveContainer" containerID="6509c99647fa7a0fbc550beba939e42d4c77a4297077ce0b695625e09f2960ff" Dec 03 00:41:00 crc kubenswrapper[4903]: I1203 00:41:00.698510 4903 generic.go:334] "Generic (PLEG): container finished" podID="21527967-bfaf-45db-99a4-3adc0972d5d9" containerID="681f4eb1eea154be6a0ff032a8e793cb6ad05b13cf5003349b09b121a2da46e0" exitCode=0 Dec 03 00:41:00 crc kubenswrapper[4903]: I1203 00:41:00.698957 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vn4gt/crc-debug-gxspw" event={"ID":"21527967-bfaf-45db-99a4-3adc0972d5d9","Type":"ContainerDied","Data":"681f4eb1eea154be6a0ff032a8e793cb6ad05b13cf5003349b09b121a2da46e0"} Dec 03 00:41:01 crc kubenswrapper[4903]: I1203 00:41:01.831920 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vn4gt/crc-debug-gxspw" Dec 03 00:41:01 crc kubenswrapper[4903]: I1203 00:41:01.859957 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vvh5\" (UniqueName: \"kubernetes.io/projected/21527967-bfaf-45db-99a4-3adc0972d5d9-kube-api-access-7vvh5\") pod \"21527967-bfaf-45db-99a4-3adc0972d5d9\" (UID: \"21527967-bfaf-45db-99a4-3adc0972d5d9\") " Dec 03 00:41:01 crc kubenswrapper[4903]: I1203 00:41:01.860015 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/21527967-bfaf-45db-99a4-3adc0972d5d9-host\") pod \"21527967-bfaf-45db-99a4-3adc0972d5d9\" (UID: \"21527967-bfaf-45db-99a4-3adc0972d5d9\") " Dec 03 00:41:01 crc kubenswrapper[4903]: I1203 00:41:01.860537 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21527967-bfaf-45db-99a4-3adc0972d5d9-host" (OuterVolumeSpecName: "host") pod "21527967-bfaf-45db-99a4-3adc0972d5d9" (UID: "21527967-bfaf-45db-99a4-3adc0972d5d9"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 00:41:01 crc kubenswrapper[4903]: I1203 00:41:01.867462 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21527967-bfaf-45db-99a4-3adc0972d5d9-kube-api-access-7vvh5" (OuterVolumeSpecName: "kube-api-access-7vvh5") pod "21527967-bfaf-45db-99a4-3adc0972d5d9" (UID: "21527967-bfaf-45db-99a4-3adc0972d5d9"). InnerVolumeSpecName "kube-api-access-7vvh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:41:01 crc kubenswrapper[4903]: I1203 00:41:01.874160 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vn4gt/crc-debug-gxspw"] Dec 03 00:41:01 crc kubenswrapper[4903]: I1203 00:41:01.885375 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vn4gt/crc-debug-gxspw"] Dec 03 00:41:01 crc kubenswrapper[4903]: I1203 00:41:01.961627 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vvh5\" (UniqueName: \"kubernetes.io/projected/21527967-bfaf-45db-99a4-3adc0972d5d9-kube-api-access-7vvh5\") on node \"crc\" DevicePath \"\"" Dec 03 00:41:01 crc kubenswrapper[4903]: I1203 00:41:01.961682 4903 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/21527967-bfaf-45db-99a4-3adc0972d5d9-host\") on node \"crc\" DevicePath \"\"" Dec 03 00:41:02 crc kubenswrapper[4903]: I1203 00:41:02.721331 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b22be30a58803b0b1ccbdd723ce847aecae016814ed2fb4cf8ac495f05d8a41" Dec 03 00:41:02 crc kubenswrapper[4903]: I1203 00:41:02.721381 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vn4gt/crc-debug-gxspw" Dec 03 00:41:03 crc kubenswrapper[4903]: I1203 00:41:03.089280 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vn4gt/crc-debug-qlxwl"] Dec 03 00:41:03 crc kubenswrapper[4903]: E1203 00:41:03.089700 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21527967-bfaf-45db-99a4-3adc0972d5d9" containerName="container-00" Dec 03 00:41:03 crc kubenswrapper[4903]: I1203 00:41:03.089712 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="21527967-bfaf-45db-99a4-3adc0972d5d9" containerName="container-00" Dec 03 00:41:03 crc kubenswrapper[4903]: E1203 00:41:03.089728 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98965d0c-bd02-47a4-8a64-16e7de035279" containerName="registry-server" Dec 03 00:41:03 crc kubenswrapper[4903]: I1203 00:41:03.089734 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="98965d0c-bd02-47a4-8a64-16e7de035279" containerName="registry-server" Dec 03 00:41:03 crc kubenswrapper[4903]: E1203 00:41:03.089753 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f29bfd7f-77aa-497c-b680-4458722cac38" containerName="extract-utilities" Dec 03 00:41:03 crc kubenswrapper[4903]: I1203 00:41:03.089759 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="f29bfd7f-77aa-497c-b680-4458722cac38" containerName="extract-utilities" Dec 03 00:41:03 crc kubenswrapper[4903]: E1203 00:41:03.089770 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98965d0c-bd02-47a4-8a64-16e7de035279" containerName="extract-content" Dec 03 00:41:03 crc kubenswrapper[4903]: I1203 00:41:03.089776 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="98965d0c-bd02-47a4-8a64-16e7de035279" containerName="extract-content" 
Dec 03 00:41:03 crc kubenswrapper[4903]: E1203 00:41:03.089792 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f29bfd7f-77aa-497c-b680-4458722cac38" containerName="registry-server"
Dec 03 00:41:03 crc kubenswrapper[4903]: I1203 00:41:03.089798 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="f29bfd7f-77aa-497c-b680-4458722cac38" containerName="registry-server"
Dec 03 00:41:03 crc kubenswrapper[4903]: E1203 00:41:03.089811 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f29bfd7f-77aa-497c-b680-4458722cac38" containerName="extract-content"
Dec 03 00:41:03 crc kubenswrapper[4903]: I1203 00:41:03.089816 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="f29bfd7f-77aa-497c-b680-4458722cac38" containerName="extract-content"
Dec 03 00:41:03 crc kubenswrapper[4903]: E1203 00:41:03.089842 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98965d0c-bd02-47a4-8a64-16e7de035279" containerName="extract-utilities"
Dec 03 00:41:03 crc kubenswrapper[4903]: I1203 00:41:03.089848 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="98965d0c-bd02-47a4-8a64-16e7de035279" containerName="extract-utilities"
Dec 03 00:41:03 crc kubenswrapper[4903]: I1203 00:41:03.090073 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="f29bfd7f-77aa-497c-b680-4458722cac38" containerName="registry-server"
Dec 03 00:41:03 crc kubenswrapper[4903]: I1203 00:41:03.090089 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="98965d0c-bd02-47a4-8a64-16e7de035279" containerName="registry-server"
Dec 03 00:41:03 crc kubenswrapper[4903]: I1203 00:41:03.090105 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="21527967-bfaf-45db-99a4-3adc0972d5d9" containerName="container-00"
Dec 03 00:41:03 crc kubenswrapper[4903]: I1203 00:41:03.090749 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vn4gt/crc-debug-qlxwl"
Dec 03 00:41:03 crc kubenswrapper[4903]: I1203 00:41:03.092603 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-vn4gt"/"default-dockercfg-m5bkj"
Dec 03 00:41:03 crc kubenswrapper[4903]: E1203 00:41:03.209489 4903 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf29bfd7f_77aa_497c_b680_4458722cac38.slice/crio-97d0fc63998e4ae367314351dec38cb4b0c58dd4b260cc4ff8cc1a806d9b04d1.scope\": RecentStats: unable to find data in memory cache]"
Dec 03 00:41:03 crc kubenswrapper[4903]: I1203 00:41:03.285363 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddrpx\" (UniqueName: \"kubernetes.io/projected/05a43b3e-ede8-4d5b-b4c4-ffff9cb2527e-kube-api-access-ddrpx\") pod \"crc-debug-qlxwl\" (UID: \"05a43b3e-ede8-4d5b-b4c4-ffff9cb2527e\") " pod="openshift-must-gather-vn4gt/crc-debug-qlxwl"
Dec 03 00:41:03 crc kubenswrapper[4903]: I1203 00:41:03.285912 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/05a43b3e-ede8-4d5b-b4c4-ffff9cb2527e-host\") pod \"crc-debug-qlxwl\" (UID: \"05a43b3e-ede8-4d5b-b4c4-ffff9cb2527e\") " pod="openshift-must-gather-vn4gt/crc-debug-qlxwl"
Dec 03 00:41:03 crc kubenswrapper[4903]: I1203 00:41:03.387495 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddrpx\" (UniqueName: \"kubernetes.io/projected/05a43b3e-ede8-4d5b-b4c4-ffff9cb2527e-kube-api-access-ddrpx\") pod \"crc-debug-qlxwl\" (UID: \"05a43b3e-ede8-4d5b-b4c4-ffff9cb2527e\") " pod="openshift-must-gather-vn4gt/crc-debug-qlxwl"
Dec 03 00:41:03 crc kubenswrapper[4903]: I1203 00:41:03.387613 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/05a43b3e-ede8-4d5b-b4c4-ffff9cb2527e-host\") pod \"crc-debug-qlxwl\" (UID: \"05a43b3e-ede8-4d5b-b4c4-ffff9cb2527e\") " pod="openshift-must-gather-vn4gt/crc-debug-qlxwl"
Dec 03 00:41:03 crc kubenswrapper[4903]: I1203 00:41:03.387712 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/05a43b3e-ede8-4d5b-b4c4-ffff9cb2527e-host\") pod \"crc-debug-qlxwl\" (UID: \"05a43b3e-ede8-4d5b-b4c4-ffff9cb2527e\") " pod="openshift-must-gather-vn4gt/crc-debug-qlxwl"
Dec 03 00:41:03 crc kubenswrapper[4903]: I1203 00:41:03.414253 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddrpx\" (UniqueName: \"kubernetes.io/projected/05a43b3e-ede8-4d5b-b4c4-ffff9cb2527e-kube-api-access-ddrpx\") pod \"crc-debug-qlxwl\" (UID: \"05a43b3e-ede8-4d5b-b4c4-ffff9cb2527e\") " pod="openshift-must-gather-vn4gt/crc-debug-qlxwl"
Dec 03 00:41:03 crc kubenswrapper[4903]: I1203 00:41:03.623385 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21527967-bfaf-45db-99a4-3adc0972d5d9" path="/var/lib/kubelet/pods/21527967-bfaf-45db-99a4-3adc0972d5d9/volumes"
Dec 03 00:41:03 crc kubenswrapper[4903]: I1203 00:41:03.707858 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vn4gt/crc-debug-qlxwl"
Dec 03 00:41:04 crc kubenswrapper[4903]: I1203 00:41:04.746772 4903 generic.go:334] "Generic (PLEG): container finished" podID="05a43b3e-ede8-4d5b-b4c4-ffff9cb2527e" containerID="307e5df14806344fa932b63f225e6e76a7b50f6f80823c3a8373398108a666db" exitCode=0
Dec 03 00:41:04 crc kubenswrapper[4903]: I1203 00:41:04.746818 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vn4gt/crc-debug-qlxwl" event={"ID":"05a43b3e-ede8-4d5b-b4c4-ffff9cb2527e","Type":"ContainerDied","Data":"307e5df14806344fa932b63f225e6e76a7b50f6f80823c3a8373398108a666db"}
Dec 03 00:41:04 crc kubenswrapper[4903]: I1203 00:41:04.747473 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vn4gt/crc-debug-qlxwl" event={"ID":"05a43b3e-ede8-4d5b-b4c4-ffff9cb2527e","Type":"ContainerStarted","Data":"db75c642cc027890e8dcde2a08eb2ef5384630149437a04e9680920fabf4b0c1"}
Dec 03 00:41:05 crc kubenswrapper[4903]: I1203 00:41:05.888160 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vn4gt/crc-debug-qlxwl"
Dec 03 00:41:06 crc kubenswrapper[4903]: I1203 00:41:06.036529 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/05a43b3e-ede8-4d5b-b4c4-ffff9cb2527e-host\") pod \"05a43b3e-ede8-4d5b-b4c4-ffff9cb2527e\" (UID: \"05a43b3e-ede8-4d5b-b4c4-ffff9cb2527e\") "
Dec 03 00:41:06 crc kubenswrapper[4903]: I1203 00:41:06.036605 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/05a43b3e-ede8-4d5b-b4c4-ffff9cb2527e-host" (OuterVolumeSpecName: "host") pod "05a43b3e-ede8-4d5b-b4c4-ffff9cb2527e" (UID: "05a43b3e-ede8-4d5b-b4c4-ffff9cb2527e"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 00:41:06 crc kubenswrapper[4903]: I1203 00:41:06.036620 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddrpx\" (UniqueName: \"kubernetes.io/projected/05a43b3e-ede8-4d5b-b4c4-ffff9cb2527e-kube-api-access-ddrpx\") pod \"05a43b3e-ede8-4d5b-b4c4-ffff9cb2527e\" (UID: \"05a43b3e-ede8-4d5b-b4c4-ffff9cb2527e\") "
Dec 03 00:41:06 crc kubenswrapper[4903]: I1203 00:41:06.037143 4903 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/05a43b3e-ede8-4d5b-b4c4-ffff9cb2527e-host\") on node \"crc\" DevicePath \"\""
Dec 03 00:41:06 crc kubenswrapper[4903]: I1203 00:41:06.045142 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05a43b3e-ede8-4d5b-b4c4-ffff9cb2527e-kube-api-access-ddrpx" (OuterVolumeSpecName: "kube-api-access-ddrpx") pod "05a43b3e-ede8-4d5b-b4c4-ffff9cb2527e" (UID: "05a43b3e-ede8-4d5b-b4c4-ffff9cb2527e"). InnerVolumeSpecName "kube-api-access-ddrpx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 00:41:06 crc kubenswrapper[4903]: I1203 00:41:06.138601 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddrpx\" (UniqueName: \"kubernetes.io/projected/05a43b3e-ede8-4d5b-b4c4-ffff9cb2527e-kube-api-access-ddrpx\") on node \"crc\" DevicePath \"\""
Dec 03 00:41:06 crc kubenswrapper[4903]: I1203 00:41:06.766428 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vn4gt/crc-debug-qlxwl" event={"ID":"05a43b3e-ede8-4d5b-b4c4-ffff9cb2527e","Type":"ContainerDied","Data":"db75c642cc027890e8dcde2a08eb2ef5384630149437a04e9680920fabf4b0c1"}
Dec 03 00:41:06 crc kubenswrapper[4903]: I1203 00:41:06.766732 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db75c642cc027890e8dcde2a08eb2ef5384630149437a04e9680920fabf4b0c1"
Dec 03 00:41:06 crc kubenswrapper[4903]: I1203 00:41:06.766478 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vn4gt/crc-debug-qlxwl"
Dec 03 00:41:06 crc kubenswrapper[4903]: I1203 00:41:06.964816 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vn4gt/crc-debug-qlxwl"]
Dec 03 00:41:06 crc kubenswrapper[4903]: I1203 00:41:06.976613 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vn4gt/crc-debug-qlxwl"]
Dec 03 00:41:07 crc kubenswrapper[4903]: I1203 00:41:07.622377 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05a43b3e-ede8-4d5b-b4c4-ffff9cb2527e" path="/var/lib/kubelet/pods/05a43b3e-ede8-4d5b-b4c4-ffff9cb2527e/volumes"
Dec 03 00:41:08 crc kubenswrapper[4903]: I1203 00:41:08.212459 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vn4gt/crc-debug-4rjlh"]
Dec 03 00:41:08 crc kubenswrapper[4903]: E1203 00:41:08.212911 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05a43b3e-ede8-4d5b-b4c4-ffff9cb2527e" containerName="container-00"
Dec 03 00:41:08 crc kubenswrapper[4903]: I1203 00:41:08.212923 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="05a43b3e-ede8-4d5b-b4c4-ffff9cb2527e" containerName="container-00"
Dec 03 00:41:08 crc kubenswrapper[4903]: I1203 00:41:08.213296 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="05a43b3e-ede8-4d5b-b4c4-ffff9cb2527e" containerName="container-00"
Dec 03 00:41:08 crc kubenswrapper[4903]: I1203 00:41:08.214065 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vn4gt/crc-debug-4rjlh"
Dec 03 00:41:08 crc kubenswrapper[4903]: I1203 00:41:08.218284 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-vn4gt"/"default-dockercfg-m5bkj"
Dec 03 00:41:08 crc kubenswrapper[4903]: I1203 00:41:08.290318 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e1e6d5fd-32fa-4711-acf0-c70da688cac4-host\") pod \"crc-debug-4rjlh\" (UID: \"e1e6d5fd-32fa-4711-acf0-c70da688cac4\") " pod="openshift-must-gather-vn4gt/crc-debug-4rjlh"
Dec 03 00:41:08 crc kubenswrapper[4903]: I1203 00:41:08.290434 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfbpd\" (UniqueName: \"kubernetes.io/projected/e1e6d5fd-32fa-4711-acf0-c70da688cac4-kube-api-access-jfbpd\") pod \"crc-debug-4rjlh\" (UID: \"e1e6d5fd-32fa-4711-acf0-c70da688cac4\") " pod="openshift-must-gather-vn4gt/crc-debug-4rjlh"
Dec 03 00:41:08 crc kubenswrapper[4903]: I1203 00:41:08.393266 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e1e6d5fd-32fa-4711-acf0-c70da688cac4-host\") pod \"crc-debug-4rjlh\" (UID: \"e1e6d5fd-32fa-4711-acf0-c70da688cac4\") " pod="openshift-must-gather-vn4gt/crc-debug-4rjlh"
Dec 03 00:41:08 crc kubenswrapper[4903]: I1203 00:41:08.393397 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfbpd\" (UniqueName: \"kubernetes.io/projected/e1e6d5fd-32fa-4711-acf0-c70da688cac4-kube-api-access-jfbpd\") pod \"crc-debug-4rjlh\" (UID: \"e1e6d5fd-32fa-4711-acf0-c70da688cac4\") " pod="openshift-must-gather-vn4gt/crc-debug-4rjlh"
Dec 03 00:41:08 crc kubenswrapper[4903]: I1203 00:41:08.393440 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e1e6d5fd-32fa-4711-acf0-c70da688cac4-host\") pod \"crc-debug-4rjlh\" (UID: \"e1e6d5fd-32fa-4711-acf0-c70da688cac4\") " pod="openshift-must-gather-vn4gt/crc-debug-4rjlh"
Dec 03 00:41:08 crc kubenswrapper[4903]: I1203 00:41:08.418488 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfbpd\" (UniqueName: \"kubernetes.io/projected/e1e6d5fd-32fa-4711-acf0-c70da688cac4-kube-api-access-jfbpd\") pod \"crc-debug-4rjlh\" (UID: \"e1e6d5fd-32fa-4711-acf0-c70da688cac4\") " pod="openshift-must-gather-vn4gt/crc-debug-4rjlh"
Dec 03 00:41:08 crc kubenswrapper[4903]: I1203 00:41:08.536709 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vn4gt/crc-debug-4rjlh"
Dec 03 00:41:08 crc kubenswrapper[4903]: W1203 00:41:08.578264 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1e6d5fd_32fa_4711_acf0_c70da688cac4.slice/crio-94ce03914a4d0ab35eecc9f9959f1df29f3886d98fa8909aefae87af9ad0e003 WatchSource:0}: Error finding container 94ce03914a4d0ab35eecc9f9959f1df29f3886d98fa8909aefae87af9ad0e003: Status 404 returned error can't find the container with id 94ce03914a4d0ab35eecc9f9959f1df29f3886d98fa8909aefae87af9ad0e003
Dec 03 00:41:08 crc kubenswrapper[4903]: I1203 00:41:08.789990 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vn4gt/crc-debug-4rjlh" event={"ID":"e1e6d5fd-32fa-4711-acf0-c70da688cac4","Type":"ContainerStarted","Data":"94ce03914a4d0ab35eecc9f9959f1df29f3886d98fa8909aefae87af9ad0e003"}
Dec 03 00:41:09 crc kubenswrapper[4903]: I1203 00:41:09.801738 4903 generic.go:334] "Generic (PLEG): container finished" podID="e1e6d5fd-32fa-4711-acf0-c70da688cac4" containerID="ff740985815559d481e9e4dbd8d20afffc5d502c624f98952e087acd166e0bdc" exitCode=0
Dec 03 00:41:09 crc kubenswrapper[4903]: I1203 00:41:09.801789 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vn4gt/crc-debug-4rjlh" event={"ID":"e1e6d5fd-32fa-4711-acf0-c70da688cac4","Type":"ContainerDied","Data":"ff740985815559d481e9e4dbd8d20afffc5d502c624f98952e087acd166e0bdc"}
Dec 03 00:41:09 crc kubenswrapper[4903]: I1203 00:41:09.855607 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vn4gt/crc-debug-4rjlh"]
Dec 03 00:41:09 crc kubenswrapper[4903]: I1203 00:41:09.867451 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vn4gt/crc-debug-4rjlh"]
Dec 03 00:41:10 crc kubenswrapper[4903]: I1203 00:41:10.918975 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vn4gt/crc-debug-4rjlh"
Dec 03 00:41:10 crc kubenswrapper[4903]: I1203 00:41:10.970310 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e1e6d5fd-32fa-4711-acf0-c70da688cac4-host\") pod \"e1e6d5fd-32fa-4711-acf0-c70da688cac4\" (UID: \"e1e6d5fd-32fa-4711-acf0-c70da688cac4\") "
Dec 03 00:41:10 crc kubenswrapper[4903]: I1203 00:41:10.970613 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1e6d5fd-32fa-4711-acf0-c70da688cac4-host" (OuterVolumeSpecName: "host") pod "e1e6d5fd-32fa-4711-acf0-c70da688cac4" (UID: "e1e6d5fd-32fa-4711-acf0-c70da688cac4"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 00:41:10 crc kubenswrapper[4903]: I1203 00:41:10.970976 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfbpd\" (UniqueName: \"kubernetes.io/projected/e1e6d5fd-32fa-4711-acf0-c70da688cac4-kube-api-access-jfbpd\") pod \"e1e6d5fd-32fa-4711-acf0-c70da688cac4\" (UID: \"e1e6d5fd-32fa-4711-acf0-c70da688cac4\") "
Dec 03 00:41:10 crc kubenswrapper[4903]: I1203 00:41:10.973205 4903 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e1e6d5fd-32fa-4711-acf0-c70da688cac4-host\") on node \"crc\" DevicePath \"\""
Dec 03 00:41:10 crc kubenswrapper[4903]: I1203 00:41:10.978981 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1e6d5fd-32fa-4711-acf0-c70da688cac4-kube-api-access-jfbpd" (OuterVolumeSpecName: "kube-api-access-jfbpd") pod "e1e6d5fd-32fa-4711-acf0-c70da688cac4" (UID: "e1e6d5fd-32fa-4711-acf0-c70da688cac4"). InnerVolumeSpecName "kube-api-access-jfbpd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 00:41:11 crc kubenswrapper[4903]: I1203 00:41:11.075051 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfbpd\" (UniqueName: \"kubernetes.io/projected/e1e6d5fd-32fa-4711-acf0-c70da688cac4-kube-api-access-jfbpd\") on node \"crc\" DevicePath \"\""
Dec 03 00:41:11 crc kubenswrapper[4903]: I1203 00:41:11.626457 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1e6d5fd-32fa-4711-acf0-c70da688cac4" path="/var/lib/kubelet/pods/e1e6d5fd-32fa-4711-acf0-c70da688cac4/volumes"
Dec 03 00:41:11 crc kubenswrapper[4903]: I1203 00:41:11.822690 4903 scope.go:117] "RemoveContainer" containerID="ff740985815559d481e9e4dbd8d20afffc5d502c624f98952e087acd166e0bdc"
Dec 03 00:41:11 crc kubenswrapper[4903]: I1203 00:41:11.822853 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vn4gt/crc-debug-4rjlh"
Dec 03 00:41:13 crc kubenswrapper[4903]: E1203 00:41:13.471557 4903 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf29bfd7f_77aa_497c_b680_4458722cac38.slice/crio-97d0fc63998e4ae367314351dec38cb4b0c58dd4b260cc4ff8cc1a806d9b04d1.scope\": RecentStats: unable to find data in memory cache]"
Dec 03 00:41:23 crc kubenswrapper[4903]: E1203 00:41:23.740612 4903 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf29bfd7f_77aa_497c_b680_4458722cac38.slice/crio-97d0fc63998e4ae367314351dec38cb4b0c58dd4b260cc4ff8cc1a806d9b04d1.scope\": RecentStats: unable to find data in memory cache]"
Dec 03 00:41:34 crc kubenswrapper[4903]: E1203 00:41:34.020947 4903 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf29bfd7f_77aa_497c_b680_4458722cac38.slice/crio-97d0fc63998e4ae367314351dec38cb4b0c58dd4b260cc4ff8cc1a806d9b04d1.scope\": RecentStats: unable to find data in memory cache]"
Dec 03 00:41:41 crc kubenswrapper[4903]: E1203 00:41:41.706518 4903 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: , extraDiskErr: could not stat "/var/log/pods/openshift-marketplace_community-operators-slb7c_f29bfd7f-77aa-497c-b680-4458722cac38/registry-server/0.log" to get inode usage: stat /var/log/pods/openshift-marketplace_community-operators-slb7c_f29bfd7f-77aa-497c-b680-4458722cac38/registry-server/0.log: no such file or directory
Dec 03 00:41:46 crc kubenswrapper[4903]: I1203 00:41:46.977204 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-78cfc5fdf8-p9576_1a80c66a-4cfd-44a2-a5e4-5a9297e63f29/barbican-api/0.log"
Dec 03 00:41:47 crc kubenswrapper[4903]: I1203 00:41:47.070230 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-78cfc5fdf8-p9576_1a80c66a-4cfd-44a2-a5e4-5a9297e63f29/barbican-api-log/0.log"
Dec 03 00:41:47 crc kubenswrapper[4903]: I1203 00:41:47.208744 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-9559fbfd6-k4fwk_c180d7c5-ad61-4190-b709-6efe6a9a2434/barbican-keystone-listener/0.log"
Dec 03 00:41:47 crc kubenswrapper[4903]: I1203 00:41:47.269430 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-9559fbfd6-k4fwk_c180d7c5-ad61-4190-b709-6efe6a9a2434/barbican-keystone-listener-log/0.log"
Dec 03 00:41:47 crc kubenswrapper[4903]: I1203 00:41:47.679985 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-65cf4c8457-6ff7v_6ea83627-fed8-458c-a39b-f73e682799d3/barbican-worker/0.log"
Dec 03 00:41:47 crc kubenswrapper[4903]: I1203 00:41:47.760923 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-65cf4c8457-6ff7v_6ea83627-fed8-458c-a39b-f73e682799d3/barbican-worker-log/0.log"
Dec 03 00:41:47 crc kubenswrapper[4903]: I1203 00:41:47.809836 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-lhvgc_270d4936-772f-40a2-8da3-f2651a216d6b/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 00:41:48 crc kubenswrapper[4903]: I1203 00:41:48.045859 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab/ceilometer-notification-agent/0.log"
Dec 03 00:41:48 crc kubenswrapper[4903]: I1203 00:41:48.050562 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab/ceilometer-central-agent/0.log"
Dec 03 00:41:48 crc kubenswrapper[4903]: I1203 00:41:48.056439 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab/proxy-httpd/0.log"
Dec 03 00:41:48 crc kubenswrapper[4903]: I1203 00:41:48.194304 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ca41ddbc-b2e6-40d6-9ce6-c82bb6eba8ab/sg-core/0.log"
Dec 03 00:41:48 crc kubenswrapper[4903]: I1203 00:41:48.338669 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_f98dfcd8-1365-42c3-b939-c34ad3325a09/cinder-api-log/0.log"
Dec 03 00:41:48 crc kubenswrapper[4903]: I1203 00:41:48.695482 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_f98dfcd8-1365-42c3-b939-c34ad3325a09/cinder-api/0.log"
Dec 03 00:41:48 crc kubenswrapper[4903]: I1203 00:41:48.894899 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-db-purge-29412001-pbz9p_334ce527-c86f-4991-bb5a-bb31f27acee1/cinder-db-purge/0.log"
Dec 03 00:41:49 crc kubenswrapper[4903]: I1203 00:41:49.000319 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_864e9292-f08c-493e-8110-5ec88083fde2/probe/0.log"
Dec 03 00:41:49 crc kubenswrapper[4903]: I1203 00:41:49.104506 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_864e9292-f08c-493e-8110-5ec88083fde2/cinder-backup/0.log"
Dec 03 00:41:49 crc kubenswrapper[4903]: I1203 00:41:49.312807 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_abeb15a2-9a82-49c1-bfdc-bc65cd1920f0/cinder-scheduler/0.log"
Dec 03 00:41:49 crc kubenswrapper[4903]: I1203 00:41:49.316411 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_abeb15a2-9a82-49c1-bfdc-bc65cd1920f0/probe/0.log"
Dec 03 00:41:49 crc kubenswrapper[4903]: I1203 00:41:49.557962 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8/probe/0.log"
Dec 03 00:41:49 crc kubenswrapper[4903]: I1203 00:41:49.637778 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_55f68daa-fcb3-4c4f-8ae4-84af8ac8b5a8/cinder-volume/0.log"
Dec 03 00:41:49 crc kubenswrapper[4903]: I1203 00:41:49.763831 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_ce4112ef-fcb6-4722-acd0-45bf409867a7/cinder-volume/0.log"
Dec 03 00:41:49 crc kubenswrapper[4903]: I1203 00:41:49.825178 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_ce4112ef-fcb6-4722-acd0-45bf409867a7/probe/0.log"
Dec 03 00:41:49 crc kubenswrapper[4903]: I1203 00:41:49.889937 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-9k7lq_c6f7512f-83fe-4921-9ccf-17a76752819f/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 00:41:50 crc kubenswrapper[4903]: I1203 00:41:50.041212 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-n7zlv_d6700daa-2dac-4779-a463-6aea7ae0d54a/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 00:41:50 crc kubenswrapper[4903]: I1203 00:41:50.124744 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7bf48746b9-mb6br_609de84d-e5af-4d50-8852-655e6bbb30b9/init/0.log"
Dec 03 00:41:50 crc kubenswrapper[4903]: I1203 00:41:50.402930 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-2qpf7_8dff062c-2479-4ea6-994e-fea352cdf518/download-cache-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 00:41:50 crc kubenswrapper[4903]: I1203 00:41:50.466251 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7bf48746b9-mb6br_609de84d-e5af-4d50-8852-655e6bbb30b9/init/0.log"
Dec 03 00:41:50 crc kubenswrapper[4903]: I1203 00:41:50.577148 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7bf48746b9-mb6br_609de84d-e5af-4d50-8852-655e6bbb30b9/dnsmasq-dns/0.log"
Dec 03 00:41:50 crc kubenswrapper[4903]: I1203 00:41:50.723155 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-db-purge-29412001-9nk2t_ce1d9817-bff6-40a4-bc9b-fcbd1510739c/glance-dbpurge/0.log"
Dec 03 00:41:50 crc kubenswrapper[4903]: I1203 00:41:50.881889 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ddc0e105-7645-48dc-9450-661c4ca40b01/glance-httpd/0.log"
Dec 03 00:41:50 crc kubenswrapper[4903]: I1203 00:41:50.940026 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ddc0e105-7645-48dc-9450-661c4ca40b01/glance-log/0.log"
Dec 03 00:41:51 crc kubenswrapper[4903]: I1203 00:41:51.087494 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_3b2ebc8f-392e-4650-a033-a23cbe91436e/glance-httpd/0.log"
Dec 03 00:41:51 crc kubenswrapper[4903]: I1203 00:41:51.118962 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_3b2ebc8f-392e-4650-a033-a23cbe91436e/glance-log/0.log"
Dec 03 00:41:51 crc kubenswrapper[4903]: I1203 00:41:51.310327 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7857f5d94d-4lclz_c5d26e7e-b21c-4e31-984f-768ef66e0772/horizon/0.log"
Dec 03 00:41:51 crc kubenswrapper[4903]: I1203 00:41:51.399390 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-pc895_b8a8af95-c502-4b50-a90e-682b039c6e58/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 00:41:51 crc kubenswrapper[4903]: I1203 00:41:51.614830 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-p8wmx_346ac594-16d6-478e-9ce4-4d4acb116a99/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 00:41:52 crc kubenswrapper[4903]: I1203 00:41:52.075751 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7857f5d94d-4lclz_c5d26e7e-b21c-4e31-984f-768ef66e0772/horizon-log/0.log"
Dec 03 00:41:52 crc kubenswrapper[4903]: I1203 00:41:52.130987 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29412001-6287z_c445dbad-15ca-4171-ac03-0fd37dbdd474/keystone-cron/0.log"
Dec 03 00:41:52 crc kubenswrapper[4903]: I1203 00:41:52.178927 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-74c5f59c6f-5gx9d_d147d6c4-c17d-4e73-b8a3-efd87eb47f76/keystone-api/0.log"
Dec 03 00:41:52 crc kubenswrapper[4903]: I1203 00:41:52.340398 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_8a0962e8-541d-4a75-b629-613d6d19f47e/kube-state-metrics/0.log"
Dec 03 00:41:52 crc kubenswrapper[4903]: I1203 00:41:52.423710 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-6s9st_c9427e93-561b-4f09-bcec-00c7001f2541/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 00:41:52 crc kubenswrapper[4903]: I1203 00:41:52.952462 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-56f9c8dcd5-hbd9l_c7517345-0440-461c-a78d-a29ef04ecf9c/neutron-httpd/0.log"
Dec 03 00:41:52 crc kubenswrapper[4903]: I1203 00:41:52.955706 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-56f9c8dcd5-hbd9l_c7517345-0440-461c-a78d-a29ef04ecf9c/neutron-api/0.log"
Dec 03 00:41:52 crc kubenswrapper[4903]: I1203 00:41:52.958101 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-jxqcz_c0b03ee1-07d8-4d8e-b047-480a4dd369f0/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 00:41:53 crc kubenswrapper[4903]: I1203 00:41:53.649297 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_de5b4dd8-9abd-423d-af40-fed7d5fc1de0/nova-cell0-conductor-conductor/0.log"
Dec 03 00:41:53 crc kubenswrapper[4903]: I1203 00:41:53.759828 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-db-purge-29412000-xgfnd_f8499f90-daef-4c46-90ef-36aba9557136/nova-manage/0.log"
Dec 03 00:41:54 crc kubenswrapper[4903]: I1203 00:41:54.257786 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_66f413c7-2056-4a28-bf9f-9606dcaa5f78/nova-cell1-conductor-conductor/0.log"
Dec 03 00:41:54 crc kubenswrapper[4903]: I1203 00:41:54.359952 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-db-purge-29412000-lw5h5_2131c673-5399-4093-92fd-c63b4ce2a8a5/nova-manage/0.log"
Dec 03 00:41:54 crc kubenswrapper[4903]: I1203 00:41:54.820669 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_59776a3d-ba94-467b-9b25-2391269821e3/nova-cell1-novncproxy-novncproxy/0.log"
Dec 03 00:41:54 crc kubenswrapper[4903]: I1203 00:41:54.889761 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_7235be43-b81b-4894-a75b-4c8444482eba/nova-api-log/0.log"
Dec 03 00:41:55 crc kubenswrapper[4903]: I1203 00:41:55.017505 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-m6rln_013ce0d7-062b-47a7-8831-912380a94a37/nova-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 00:41:55 crc kubenswrapper[4903]: I1203 00:41:55.171169 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a0018a95-dc74-4511-ade4-c77e4846f0a0/nova-metadata-log/0.log"
Dec 03 00:41:55 crc kubenswrapper[4903]: I1203 00:41:55.256767 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_7235be43-b81b-4894-a75b-4c8444482eba/nova-api-api/0.log"
Dec 03 00:41:55 crc kubenswrapper[4903]: I1203 00:41:55.738659 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a3fa7901-a49c-433f-942c-a875c9ecd2ab/mysql-bootstrap/0.log"
Dec 03 00:41:55 crc kubenswrapper[4903]: I1203 00:41:55.994933 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a3fa7901-a49c-433f-942c-a875c9ecd2ab/mysql-bootstrap/0.log"
Dec 03 00:41:56 crc kubenswrapper[4903]: I1203 00:41:56.033643 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a3fa7901-a49c-433f-942c-a875c9ecd2ab/galera/0.log"
Dec 03 00:41:56 crc kubenswrapper[4903]: I1203 00:41:56.117409 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_dfe7a458-659b-465b-8ab9-712e3a865820/nova-scheduler-scheduler/0.log"
Dec 03 00:41:56 crc kubenswrapper[4903]: I1203 00:41:56.290448 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_6eaac3fd-8033-42cd-90c3-5dfac716ae66/mysql-bootstrap/0.log"
Dec 03 00:41:56 crc kubenswrapper[4903]: I1203 00:41:56.494039 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_6eaac3fd-8033-42cd-90c3-5dfac716ae66/galera/0.log"
Dec 03 00:41:56 crc kubenswrapper[4903]: I1203 00:41:56.553615 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_6eaac3fd-8033-42cd-90c3-5dfac716ae66/mysql-bootstrap/0.log"
Dec 03 00:41:56 crc kubenswrapper[4903]: I1203 00:41:56.714588 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_01e0132f-dfe4-4d3a-9a72-b38b77521ada/openstackclient/0.log"
Dec 03 00:41:56 crc kubenswrapper[4903]: I1203 00:41:56.775466 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-lkt78_d72fba58-af32-4b1a-a883-4e76ec6dc3f4/ovn-controller/0.log"
Dec 03 00:41:57 crc kubenswrapper[4903]: I1203 00:41:57.009864 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-9fmcf_d7a25811-66de-4b62-ad27-f01f63f539a1/openstack-network-exporter/0.log"
Dec 03 00:41:57 crc kubenswrapper[4903]: I1203 00:41:57.165521 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cs6mk_f72d79d2-cc88-4d82-abb4-c24c823532cb/ovsdb-server-init/0.log"
Dec 03 00:41:57 crc kubenswrapper[4903]: I1203 00:41:57.257618 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_4fdb728a-100d-425d-b83c-245c770afa4b/memcached/0.log"
Dec 03 00:41:57 crc kubenswrapper[4903]: I1203 00:41:57.536968 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cs6mk_f72d79d2-cc88-4d82-abb4-c24c823532cb/ovsdb-server-init/0.log"
Dec 03 00:41:57 crc kubenswrapper[4903]: I1203 00:41:57.556672 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cs6mk_f72d79d2-cc88-4d82-abb4-c24c823532cb/ovsdb-server/0.log"
Dec 03 00:41:57 crc kubenswrapper[4903]: I1203 00:41:57.687417 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a0018a95-dc74-4511-ade4-c77e4846f0a0/nova-metadata-metadata/0.log"
Dec 03 00:41:57 crc kubenswrapper[4903]: I1203 00:41:57.745273 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_adcf8345-41bb-495c-a006-573f6afe5af9/openstack-network-exporter/0.log"
Dec 03 00:41:57 crc kubenswrapper[4903]: I1203 00:41:57.759229 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-hbltm_08be3078-8019-4472-8260-d24032d74b39/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 00:41:57 crc kubenswrapper[4903]: I1203 00:41:57.771926 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cs6mk_f72d79d2-cc88-4d82-abb4-c24c823532cb/ovs-vswitchd/0.log"
Dec 03 00:41:57 crc kubenswrapper[4903]: I1203 00:41:57.898321 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_adcf8345-41bb-495c-a006-573f6afe5af9/ovn-northd/0.log"
Dec 03 00:41:57 crc kubenswrapper[4903]: I1203 00:41:57.953829 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_635dddd5-1a09-4f9e-b82f-e45eee76b412/openstack-network-exporter/0.log"
Dec 03 00:41:57 crc kubenswrapper[4903]: I1203 00:41:57.970736 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_635dddd5-1a09-4f9e-b82f-e45eee76b412/ovsdbserver-nb/0.log"
Dec 03 00:41:58 crc kubenswrapper[4903]: I1203 00:41:58.136376 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4ab22df5-5c0a-42c6-a881-4529dd331e5f/openstack-network-exporter/0.log"
Dec 03 00:41:58 crc kubenswrapper[4903]: I1203 00:41:58.146499 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4ab22df5-5c0a-42c6-a881-4529dd331e5f/ovsdbserver-sb/0.log"
Dec 03 00:41:58 crc kubenswrapper[4903]: I1203 00:41:58.369255 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_61ac9e39-e707-4da2-881e-d9412cf9c136/init-config-reloader/0.log"
Dec 03 00:41:58 crc kubenswrapper[4903]: I1203 00:41:58.372206 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-665fcbdbd4-lvt55_4b492cef-e99c-4d41-a42b-7377908b5eed/placement-api/0.log"
Dec 03 00:41:58 crc kubenswrapper[4903]: I1203 00:41:58.508354 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-665fcbdbd4-lvt55_4b492cef-e99c-4d41-a42b-7377908b5eed/placement-log/0.log"
Dec 03 00:41:58 crc kubenswrapper[4903]: I1203 00:41:58.622261 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_61ac9e39-e707-4da2-881e-d9412cf9c136/init-config-reloader/0.log"
Dec 03 00:41:58 crc kubenswrapper[4903]: I1203 00:41:58.634454 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_61ac9e39-e707-4da2-881e-d9412cf9c136/prometheus/0.log"
Dec 03 00:41:58 crc kubenswrapper[4903]: I1203 00:41:58.670563 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_61ac9e39-e707-4da2-881e-d9412cf9c136/config-reloader/0.log"
Dec 03 00:41:58 crc kubenswrapper[4903]: I1203 00:41:58.681762 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_61ac9e39-e707-4da2-881e-d9412cf9c136/thanos-sidecar/0.log"
Dec 03 00:41:58 crc kubenswrapper[4903]: I1203 00:41:58.996702 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_46968896-fe5c-4bf2-a304-51f818ae9cc5/setup-container/0.log"
Dec 03 00:41:59 crc kubenswrapper[4903]: I1203 00:41:59.094272 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_46968896-fe5c-4bf2-a304-51f818ae9cc5/setup-container/0.log"
Dec 03 00:41:59 crc kubenswrapper[4903]: I1203 00:41:59.155231 4903 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_adbb82a2-c30f-4e59-be9c-9274739caf25/setup-container/0.log" Dec 03 00:41:59 crc kubenswrapper[4903]: I1203 00:41:59.222544 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_46968896-fe5c-4bf2-a304-51f818ae9cc5/rabbitmq/0.log" Dec 03 00:41:59 crc kubenswrapper[4903]: I1203 00:41:59.367894 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_adbb82a2-c30f-4e59-be9c-9274739caf25/setup-container/0.log" Dec 03 00:41:59 crc kubenswrapper[4903]: I1203 00:41:59.399515 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_adbb82a2-c30f-4e59-be9c-9274739caf25/rabbitmq/0.log" Dec 03 00:41:59 crc kubenswrapper[4903]: I1203 00:41:59.422471 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_96f2e452-05fe-45c6-940b-5a53959af002/setup-container/0.log" Dec 03 00:41:59 crc kubenswrapper[4903]: I1203 00:41:59.647697 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_96f2e452-05fe-45c6-940b-5a53959af002/setup-container/0.log" Dec 03 00:41:59 crc kubenswrapper[4903]: I1203 00:41:59.669860 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-vw6h8_88d8ef39-c7d5-45d4-bd56-fbb4a23d0678/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:41:59 crc kubenswrapper[4903]: I1203 00:41:59.693298 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_96f2e452-05fe-45c6-940b-5a53959af002/rabbitmq/0.log" Dec 03 00:41:59 crc kubenswrapper[4903]: I1203 00:41:59.872152 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-crjgz_b5c4ae7e-90d7-4090-9357-77e09a38d4f6/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:41:59 crc kubenswrapper[4903]: I1203 00:41:59.892386 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-v5ddf_d30d0995-2d8f-4cd1-aa29-2b3b6975f3c3/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:41:59 crc kubenswrapper[4903]: I1203 00:41:59.978847 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-jhx2v_0b801338-6fdb-42ad-b3f8-67b296c04efd/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:42:00 crc kubenswrapper[4903]: I1203 00:42:00.115595 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-5s77w_70d917fc-dbd8-499d-bcae-b5f324de77cb/ssh-known-hosts-edpm-deployment/0.log" Dec 03 00:42:00 crc kubenswrapper[4903]: I1203 00:42:00.264289 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5489fffdb5-zmhmz_0679a7f8-6bae-4619-b633-ae583358eda7/proxy-server/0.log" Dec 03 00:42:00 crc kubenswrapper[4903]: I1203 00:42:00.364032 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5489fffdb5-zmhmz_0679a7f8-6bae-4619-b633-ae583358eda7/proxy-httpd/0.log" Dec 03 00:42:00 crc kubenswrapper[4903]: I1203 00:42:00.387032 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-gkjrc_f16a381a-80d3-4a60-be1b-e782dab1c73c/swift-ring-rebalance/0.log" Dec 03 00:42:00 crc kubenswrapper[4903]: I1203 00:42:00.500012 4903 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_55e0eb4b-69b7-4845-84aa-77dae4384f32/account-auditor/0.log" Dec 03 00:42:00 crc kubenswrapper[4903]: I1203 00:42:00.551742 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_55e0eb4b-69b7-4845-84aa-77dae4384f32/account-reaper/0.log" Dec 03 00:42:00 crc kubenswrapper[4903]: I1203 00:42:00.574135 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_55e0eb4b-69b7-4845-84aa-77dae4384f32/account-server/0.log" Dec 03 00:42:00 crc kubenswrapper[4903]: I1203 00:42:00.628704 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_55e0eb4b-69b7-4845-84aa-77dae4384f32/account-replicator/0.log" Dec 03 00:42:00 crc kubenswrapper[4903]: I1203 00:42:00.703237 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_55e0eb4b-69b7-4845-84aa-77dae4384f32/container-auditor/0.log" Dec 03 00:42:00 crc kubenswrapper[4903]: I1203 00:42:00.756549 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_55e0eb4b-69b7-4845-84aa-77dae4384f32/container-updater/0.log" Dec 03 00:42:00 crc kubenswrapper[4903]: I1203 00:42:00.772474 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_55e0eb4b-69b7-4845-84aa-77dae4384f32/container-replicator/0.log" Dec 03 00:42:00 crc kubenswrapper[4903]: I1203 00:42:00.782235 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_55e0eb4b-69b7-4845-84aa-77dae4384f32/container-server/0.log" Dec 03 00:42:00 crc kubenswrapper[4903]: I1203 00:42:00.856943 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_55e0eb4b-69b7-4845-84aa-77dae4384f32/object-auditor/0.log" Dec 03 00:42:00 crc kubenswrapper[4903]: I1203 00:42:00.897461 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_55e0eb4b-69b7-4845-84aa-77dae4384f32/object-expirer/0.log" Dec 03 00:42:00 crc kubenswrapper[4903]: I1203 00:42:00.939183 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_55e0eb4b-69b7-4845-84aa-77dae4384f32/object-server/0.log" Dec 03 00:42:00 crc kubenswrapper[4903]: I1203 00:42:00.968553 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_55e0eb4b-69b7-4845-84aa-77dae4384f32/object-replicator/0.log" Dec 03 00:42:01 crc kubenswrapper[4903]: I1203 00:42:01.038462 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_55e0eb4b-69b7-4845-84aa-77dae4384f32/object-updater/0.log" Dec 03 00:42:01 crc kubenswrapper[4903]: I1203 00:42:01.092530 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_55e0eb4b-69b7-4845-84aa-77dae4384f32/swift-recon-cron/0.log" Dec 03 00:42:01 crc kubenswrapper[4903]: I1203 00:42:01.094720 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_55e0eb4b-69b7-4845-84aa-77dae4384f32/rsync/0.log" Dec 03 00:42:01 crc kubenswrapper[4903]: I1203 00:42:01.208423 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-fkks7_2a2b87ac-e673-475f-9ebc-d3387b0e26f2/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:42:01 crc kubenswrapper[4903]: I1203 00:42:01.342512 4903 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_tempest-tests-tempest_0a6ff673-e552-4ffc-94a5-5b780fa219c0/tempest-tests-tempest-tests-runner/0.log" Dec 03 00:42:01 crc kubenswrapper[4903]: I1203 00:42:01.393965 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_950ed1bd-b32c-4e34-b973-3fdb5b2c0383/test-operator-logs-container/0.log" Dec 03 00:42:01 crc kubenswrapper[4903]: I1203 00:42:01.469773 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-xf62w_d727ee19-e1d6-4421-9be6-94f429f93494/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 00:42:02 crc kubenswrapper[4903]: I1203 00:42:02.311721 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_6b000454-0ec3-4f51-ba7a-767530eaf03c/watcher-applier/0.log" Dec 03 00:42:02 crc kubenswrapper[4903]: I1203 00:42:02.767589 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_57a9a701-de78-4dc2-b8a7-365cd41a5693/watcher-api-log/0.log" Dec 03 00:42:05 crc kubenswrapper[4903]: I1203 00:42:05.030970 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_d92eb92f-06d0-4676-9c0f-9f3e427ae019/watcher-decision-engine/0.log" Dec 03 00:42:06 crc kubenswrapper[4903]: I1203 00:42:06.028271 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_57a9a701-de78-4dc2-b8a7-365cd41a5693/watcher-api/0.log" Dec 03 00:42:27 crc kubenswrapper[4903]: I1203 00:42:27.927203 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-tsj9r_58ddb811-8791-4420-ae35-b3521289b565/kube-rbac-proxy/0.log" Dec 03 00:42:28 crc kubenswrapper[4903]: I1203 00:42:28.093507 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-tsj9r_58ddb811-8791-4420-ae35-b3521289b565/manager/0.log" Dec 03 00:42:28 crc kubenswrapper[4903]: I1203 00:42:28.135207 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-j2zhw_35bd5361-6683-4c7d-b26c-3cac8e7a5bf4/kube-rbac-proxy/0.log" Dec 03 00:42:28 crc kubenswrapper[4903]: I1203 00:42:28.236371 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-j2zhw_35bd5361-6683-4c7d-b26c-3cac8e7a5bf4/manager/0.log" Dec 03 00:42:28 crc kubenswrapper[4903]: I1203 00:42:28.349988 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb_0515602c-3cfc-4fe8-99ec-8a3d18e8f88d/util/0.log" Dec 03 00:42:28 crc kubenswrapper[4903]: I1203 00:42:28.698228 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb_0515602c-3cfc-4fe8-99ec-8a3d18e8f88d/pull/0.log" Dec 03 00:42:28 crc kubenswrapper[4903]: I1203 00:42:28.702852 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb_0515602c-3cfc-4fe8-99ec-8a3d18e8f88d/util/0.log" Dec 03 00:42:28 crc kubenswrapper[4903]: I1203 00:42:28.720180 4903 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb_0515602c-3cfc-4fe8-99ec-8a3d18e8f88d/pull/0.log" Dec 03 00:42:28 crc kubenswrapper[4903]: I1203 00:42:28.831131 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb_0515602c-3cfc-4fe8-99ec-8a3d18e8f88d/util/0.log" Dec 03 00:42:28 crc kubenswrapper[4903]: I1203 00:42:28.862580 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb_0515602c-3cfc-4fe8-99ec-8a3d18e8f88d/pull/0.log" Dec 03 00:42:28 crc kubenswrapper[4903]: I1203 00:42:28.897715 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d660994b277bbf54419412dd92322a982f81de04afa8bcc0ad29cf309cpnscb_0515602c-3cfc-4fe8-99ec-8a3d18e8f88d/extract/0.log" Dec 03 00:42:29 crc kubenswrapper[4903]: I1203 00:42:29.008935 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-ddkpk_8f5feda5-281a-4c4f-be95-7b96ecc273f9/kube-rbac-proxy/0.log" Dec 03 00:42:29 crc kubenswrapper[4903]: I1203 00:42:29.042281 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-ddkpk_8f5feda5-281a-4c4f-be95-7b96ecc273f9/manager/0.log" Dec 03 00:42:29 crc kubenswrapper[4903]: I1203 00:42:29.107293 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-78vhb_d0be2ea9-978d-4c79-a623-3b752547d546/kube-rbac-proxy/0.log" Dec 03 00:42:29 crc kubenswrapper[4903]: I1203 00:42:29.263049 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-78vhb_d0be2ea9-978d-4c79-a623-3b752547d546/manager/0.log" Dec 03 00:42:29 crc kubenswrapper[4903]: I1203 00:42:29.334209 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-6dsjr_5046b326-aad3-4aa9-ad84-96b3943a6147/kube-rbac-proxy/0.log" Dec 03 00:42:29 crc kubenswrapper[4903]: I1203 00:42:29.337422 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-6dsjr_5046b326-aad3-4aa9-ad84-96b3943a6147/manager/0.log" Dec 03 00:42:29 crc kubenswrapper[4903]: I1203 00:42:29.494805 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-lwgx2_d3c55b89-b070-410d-8436-a101b0f313cf/kube-rbac-proxy/0.log" Dec 03 00:42:29 crc kubenswrapper[4903]: I1203 00:42:29.516974 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-lwgx2_d3c55b89-b070-410d-8436-a101b0f313cf/manager/0.log" Dec 03 00:42:29 crc kubenswrapper[4903]: I1203 00:42:29.671918 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-dw6n2_e4de4a7c-49fd-48bc-8d5b-75727e7388de/kube-rbac-proxy/0.log" Dec 03 00:42:29 crc kubenswrapper[4903]: I1203 00:42:29.798390 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-hj5mh_e3082dc8-ebbf-4a01-9120-5f1081af7801/kube-rbac-proxy/0.log" Dec 03 00:42:29 crc kubenswrapper[4903]: I1203 
00:42:29.859710 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-hj5mh_e3082dc8-ebbf-4a01-9120-5f1081af7801/manager/0.log" Dec 03 00:42:29 crc kubenswrapper[4903]: I1203 00:42:29.911453 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-dw6n2_e4de4a7c-49fd-48bc-8d5b-75727e7388de/manager/0.log" Dec 03 00:42:29 crc kubenswrapper[4903]: I1203 00:42:29.997837 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-s97rj_5c4ccdc6-6205-4108-9146-75a7a963732e/kube-rbac-proxy/0.log" Dec 03 00:42:30 crc kubenswrapper[4903]: I1203 00:42:30.080773 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-s97rj_5c4ccdc6-6205-4108-9146-75a7a963732e/manager/0.log" Dec 03 00:42:30 crc kubenswrapper[4903]: I1203 00:42:30.143740 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-xhm6n_7c596dd6-5f26-4bb7-a771-8c1d57129209/kube-rbac-proxy/0.log" Dec 03 00:42:30 crc kubenswrapper[4903]: I1203 00:42:30.193881 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-xhm6n_7c596dd6-5f26-4bb7-a771-8c1d57129209/manager/0.log" Dec 03 00:42:30 crc kubenswrapper[4903]: I1203 00:42:30.318073 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-6fhjd_7367c4a1-c098-4811-80ba-455509d27216/manager/0.log" Dec 03 00:42:30 crc kubenswrapper[4903]: I1203 00:42:30.320162 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-6fhjd_7367c4a1-c098-4811-80ba-455509d27216/kube-rbac-proxy/0.log" Dec 03 00:42:30 crc kubenswrapper[4903]: I1203 00:42:30.476764 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-gm6wm_723460ec-3116-468b-a628-1b03f5fd4239/kube-rbac-proxy/0.log" Dec 03 00:42:30 crc kubenswrapper[4903]: I1203 00:42:30.563140 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-gm6wm_723460ec-3116-468b-a628-1b03f5fd4239/manager/0.log" Dec 03 00:42:30 crc kubenswrapper[4903]: I1203 00:42:30.600836 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-phs84_d2216dc0-19da-4872-8e82-579f6bd60513/kube-rbac-proxy/0.log" Dec 03 00:42:30 crc kubenswrapper[4903]: I1203 00:42:30.786260 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-phs84_d2216dc0-19da-4872-8e82-579f6bd60513/manager/0.log" Dec 03 00:42:30 crc kubenswrapper[4903]: I1203 00:42:30.798766 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-2rx8r_fc491fc5-9e88-4e1d-9848-ea8846acd82b/kube-rbac-proxy/0.log" Dec 03 00:42:30 crc kubenswrapper[4903]: I1203 00:42:30.816907 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-2rx8r_fc491fc5-9e88-4e1d-9848-ea8846acd82b/manager/0.log" Dec 03 00:42:30 crc 
kubenswrapper[4903]: I1203 00:42:30.978546 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr_5a01f2d2-8c90-4ccc-bf47-a4f973276988/manager/0.log" Dec 03 00:42:31 crc kubenswrapper[4903]: I1203 00:42:31.033645 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4rx9jr_5a01f2d2-8c90-4ccc-bf47-a4f973276988/kube-rbac-proxy/0.log" Dec 03 00:42:31 crc kubenswrapper[4903]: I1203 00:42:31.381696 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-756b77799f-tcscw_de3babfe-054a-424f-8b40-e4e43d5f3e5b/operator/0.log" Dec 03 00:42:31 crc kubenswrapper[4903]: I1203 00:42:31.418512 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-scz9b_1e4a5768-36c7-4a71-8bf1-57f9ff69b940/registry-server/0.log" Dec 03 00:42:31 crc kubenswrapper[4903]: I1203 00:42:31.598380 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-htwmh_926767ef-1626-42a1-bd04-6d3f06d89f08/kube-rbac-proxy/0.log" Dec 03 00:42:31 crc kubenswrapper[4903]: I1203 00:42:31.694415 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-htwmh_926767ef-1626-42a1-bd04-6d3f06d89f08/manager/0.log" Dec 03 00:42:31 crc kubenswrapper[4903]: I1203 00:42:31.782892 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-j5jc6_246fe719-e899-408b-a962-702c5db22bfc/kube-rbac-proxy/0.log" Dec 03 00:42:31 crc kubenswrapper[4903]: I1203 00:42:31.868801 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-j5jc6_246fe719-e899-408b-a962-702c5db22bfc/manager/0.log" Dec 03 00:42:32 crc kubenswrapper[4903]: I1203 00:42:32.014364 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-wccsg_9cc67e14-1cb4-497f-b0f8-010c2e6d5717/operator/0.log" Dec 03 00:42:32 crc kubenswrapper[4903]: I1203 00:42:32.113551 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-5pdxv_057a4ce0-614e-436a-aaf5-300d5ce6661c/kube-rbac-proxy/0.log" Dec 03 00:42:32 crc kubenswrapper[4903]: I1203 00:42:32.184436 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-5pdxv_057a4ce0-614e-436a-aaf5-300d5ce6661c/manager/0.log" Dec 03 00:42:32 crc kubenswrapper[4903]: I1203 00:42:32.247626 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-k7zsl_5430813d-ed61-496d-86b6-c9cc1d48aa1f/kube-rbac-proxy/0.log" Dec 03 00:42:32 crc kubenswrapper[4903]: I1203 00:42:32.425612 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5b9bc7567f-6prdr_61b2d273-f604-4fa0-baba-27dfbab9a350/manager/0.log" Dec 03 00:42:32 crc kubenswrapper[4903]: I1203 00:42:32.445548 4903 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-wp9kf_e6b63e17-4749-429b-8214-92fa7eecfd3c/kube-rbac-proxy/0.log" Dec 03 00:42:32 crc kubenswrapper[4903]: I1203 00:42:32.458488 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-wp9kf_e6b63e17-4749-429b-8214-92fa7eecfd3c/manager/0.log" Dec 03 00:42:32 crc kubenswrapper[4903]: I1203 00:42:32.632814 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-k7zsl_5430813d-ed61-496d-86b6-c9cc1d48aa1f/manager/0.log" Dec 03 00:42:32 crc kubenswrapper[4903]: I1203 00:42:32.655230 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-c95d55f7c-jb8p7_dbed5f2e-6049-4adc-a31c-bad1f30c7058/kube-rbac-proxy/0.log" Dec 03 00:42:32 crc kubenswrapper[4903]: I1203 00:42:32.778820 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-c95d55f7c-jb8p7_dbed5f2e-6049-4adc-a31c-bad1f30c7058/manager/0.log" Dec 03 00:42:51 crc kubenswrapper[4903]: I1203 00:42:51.268258 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-gdr6w_ea93257f-fe2e-4062-9b5d-3cd6f53f6fdc/control-plane-machine-set-operator/0.log" Dec 03 00:42:51 crc kubenswrapper[4903]: I1203 00:42:51.442361 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lvnkb_ec790a4a-c562-4035-ba10-9ac0c8baf6c6/kube-rbac-proxy/0.log" Dec 03 00:42:51 crc kubenswrapper[4903]: I1203 00:42:51.475672 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lvnkb_ec790a4a-c562-4035-ba10-9ac0c8baf6c6/machine-api-operator/0.log" Dec 03 00:42:53 crc kubenswrapper[4903]: I1203 00:42:53.069475 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 00:42:53 crc kubenswrapper[4903]: I1203 00:42:53.069973 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 00:43:04 crc kubenswrapper[4903]: I1203 00:43:04.468198 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-98zmz_245517d6-a256-48d4-8140-bd54f1794279/cert-manager-controller/0.log" Dec 03 00:43:04 crc kubenswrapper[4903]: I1203 00:43:04.628663 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-qk25x_b4744872-fc92-4dc7-b64f-dbdc3c32c890/cert-manager-cainjector/0.log" Dec 03 00:43:04 crc kubenswrapper[4903]: I1203 00:43:04.684501 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-tq5jc_62990e39-3700-4b20-9668-d90e0074a402/cert-manager-webhook/0.log" Dec 03 00:43:17 crc kubenswrapper[4903]: I1203 00:43:17.377213 4903 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-wwvz6_ef8f8d87-b435-4583-aa22-2e43892ce34b/nmstate-console-plugin/0.log" Dec 03 00:43:17 crc kubenswrapper[4903]: I1203 00:43:17.569478 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-smkbq_e904a523-8784-443d-b994-bb1aa11e45f4/nmstate-handler/0.log" Dec 03 00:43:17 crc kubenswrapper[4903]: I1203 00:43:17.612335 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-8qx6t_09e08c1d-fdea-4255-accb-8c957d34cfa3/kube-rbac-proxy/0.log" Dec 03 00:43:17 crc kubenswrapper[4903]: I1203 00:43:17.628716 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-8qx6t_09e08c1d-fdea-4255-accb-8c957d34cfa3/nmstate-metrics/0.log" Dec 03 00:43:17 crc kubenswrapper[4903]: I1203 00:43:17.819498 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-5lv4b_74c7c4ac-2173-497a-b630-d905326c4749/nmstate-webhook/0.log" Dec 03 00:43:17 crc kubenswrapper[4903]: I1203 00:43:17.833116 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-gf4p6_524a5581-af2a-48b9-abd3-2f7c2d046b83/nmstate-operator/0.log" Dec 03 00:43:23 crc kubenswrapper[4903]: I1203 00:43:23.069618 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 00:43:23 crc kubenswrapper[4903]: I1203 00:43:23.070035 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 00:43:33 crc kubenswrapper[4903]: I1203 00:43:33.484800 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-b7sxc_3503b383-bf2b-4c83-8a43-3323f7330880/kube-rbac-proxy/0.log" Dec 03 00:43:33 crc kubenswrapper[4903]: I1203 00:43:33.629632 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-b7sxc_3503b383-bf2b-4c83-8a43-3323f7330880/controller/0.log" Dec 03 00:43:33 crc kubenswrapper[4903]: I1203 00:43:33.695947 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gh95g_93baaa1e-7108-463e-82c3-71abc3a34678/cp-frr-files/0.log" Dec 03 00:43:33 crc kubenswrapper[4903]: I1203 00:43:33.935615 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gh95g_93baaa1e-7108-463e-82c3-71abc3a34678/cp-reloader/0.log" Dec 03 00:43:33 crc kubenswrapper[4903]: I1203 00:43:33.937970 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gh95g_93baaa1e-7108-463e-82c3-71abc3a34678/cp-frr-files/0.log" Dec 03 00:43:33 crc kubenswrapper[4903]: I1203 00:43:33.984625 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gh95g_93baaa1e-7108-463e-82c3-71abc3a34678/cp-reloader/0.log" Dec 03 00:43:33 crc kubenswrapper[4903]: I1203 00:43:33.988407 4903 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-gh95g_93baaa1e-7108-463e-82c3-71abc3a34678/cp-metrics/0.log" Dec 03 00:43:34 crc kubenswrapper[4903]: I1203 00:43:34.134261 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gh95g_93baaa1e-7108-463e-82c3-71abc3a34678/cp-frr-files/0.log" Dec 03 00:43:34 crc kubenswrapper[4903]: I1203 00:43:34.170770 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gh95g_93baaa1e-7108-463e-82c3-71abc3a34678/cp-reloader/0.log" Dec 03 00:43:34 crc kubenswrapper[4903]: I1203 00:43:34.194257 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gh95g_93baaa1e-7108-463e-82c3-71abc3a34678/cp-metrics/0.log" Dec 03 00:43:34 crc kubenswrapper[4903]: I1203 00:43:34.226060 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gh95g_93baaa1e-7108-463e-82c3-71abc3a34678/cp-metrics/0.log" Dec 03 00:43:34 crc kubenswrapper[4903]: I1203 00:43:34.392035 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gh95g_93baaa1e-7108-463e-82c3-71abc3a34678/cp-metrics/0.log" Dec 03 00:43:34 crc kubenswrapper[4903]: I1203 00:43:34.421708 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gh95g_93baaa1e-7108-463e-82c3-71abc3a34678/cp-reloader/0.log" Dec 03 00:43:34 crc kubenswrapper[4903]: I1203 00:43:34.430317 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gh95g_93baaa1e-7108-463e-82c3-71abc3a34678/cp-frr-files/0.log" Dec 03 00:43:34 crc kubenswrapper[4903]: I1203 00:43:34.444022 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gh95g_93baaa1e-7108-463e-82c3-71abc3a34678/controller/0.log" Dec 03 00:43:34 crc kubenswrapper[4903]: I1203 00:43:34.615523 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gh95g_93baaa1e-7108-463e-82c3-71abc3a34678/frr-metrics/0.log" Dec 03 00:43:34 crc kubenswrapper[4903]: I1203 00:43:34.618853 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gh95g_93baaa1e-7108-463e-82c3-71abc3a34678/kube-rbac-proxy/0.log" Dec 03 00:43:34 crc kubenswrapper[4903]: I1203 00:43:34.686381 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gh95g_93baaa1e-7108-463e-82c3-71abc3a34678/kube-rbac-proxy-frr/0.log" Dec 03 00:43:34 crc kubenswrapper[4903]: I1203 00:43:34.831705 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gh95g_93baaa1e-7108-463e-82c3-71abc3a34678/reloader/0.log" Dec 03 00:43:34 crc kubenswrapper[4903]: I1203 00:43:34.925186 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-6zx67_b8787de7-b1d1-41fc-bda7-628c8916c8c7/frr-k8s-webhook-server/0.log" Dec 03 00:43:35 crc kubenswrapper[4903]: I1203 00:43:35.051337 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7b49745895-c8xsg_1548db98-cd62-4c58-88ac-4f4de9512edb/manager/0.log" Dec 03 00:43:35 crc kubenswrapper[4903]: I1203 00:43:35.254934 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-64ddb78498-frglc_2f9aa142-f989-4748-946c-7629a225d6a4/webhook-server/0.log" Dec 03 00:43:35 crc kubenswrapper[4903]: I1203 00:43:35.341865 4903 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_speaker-pm26w_c48c624c-4ecb-47d7-affb-bf5527eec659/kube-rbac-proxy/0.log" Dec 03 00:43:35 crc kubenswrapper[4903]: I1203 00:43:35.907934 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-pm26w_c48c624c-4ecb-47d7-affb-bf5527eec659/speaker/0.log" Dec 03 00:43:36 crc kubenswrapper[4903]: I1203 00:43:36.308148 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gh95g_93baaa1e-7108-463e-82c3-71abc3a34678/frr/0.log" Dec 03 00:43:50 crc kubenswrapper[4903]: I1203 00:43:50.555335 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn_7901882b-863d-4308-8099-8a199965bdbe/util/0.log" Dec 03 00:43:50 crc kubenswrapper[4903]: I1203 00:43:50.670342 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn_7901882b-863d-4308-8099-8a199965bdbe/util/0.log" Dec 03 00:43:50 crc kubenswrapper[4903]: I1203 00:43:50.707754 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn_7901882b-863d-4308-8099-8a199965bdbe/pull/0.log" Dec 03 00:43:50 crc kubenswrapper[4903]: I1203 00:43:50.735359 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn_7901882b-863d-4308-8099-8a199965bdbe/pull/0.log" Dec 03 00:43:50 crc kubenswrapper[4903]: I1203 00:43:50.920998 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn_7901882b-863d-4308-8099-8a199965bdbe/util/0.log" Dec 03 00:43:50 crc kubenswrapper[4903]: I1203 00:43:50.924303 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn_7901882b-863d-4308-8099-8a199965bdbe/pull/0.log" Dec 03 00:43:50 crc kubenswrapper[4903]: I1203 00:43:50.947614 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4jpgn_7901882b-863d-4308-8099-8a199965bdbe/extract/0.log" Dec 03 00:43:51 crc kubenswrapper[4903]: I1203 00:43:51.122507 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl_bd27baf0-b0a0-4ffd-a85f-0557c98a4996/util/0.log" Dec 03 00:43:51 crc kubenswrapper[4903]: I1203 00:43:51.265156 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl_bd27baf0-b0a0-4ffd-a85f-0557c98a4996/util/0.log" Dec 03 00:43:51 crc kubenswrapper[4903]: I1203 00:43:51.295769 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl_bd27baf0-b0a0-4ffd-a85f-0557c98a4996/pull/0.log" Dec 03 00:43:51 crc kubenswrapper[4903]: I1203 00:43:51.310762 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl_bd27baf0-b0a0-4ffd-a85f-0557c98a4996/pull/0.log" Dec 03 00:43:51 crc kubenswrapper[4903]: I1203 00:43:51.531172 4903 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl_bd27baf0-b0a0-4ffd-a85f-0557c98a4996/util/0.log" Dec 03 00:43:51 crc kubenswrapper[4903]: I1203 00:43:51.535399 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl_bd27baf0-b0a0-4ffd-a85f-0557c98a4996/extract/0.log" Dec 03 00:43:51 crc kubenswrapper[4903]: I1203 00:43:51.551453 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210m7pbl_bd27baf0-b0a0-4ffd-a85f-0557c98a4996/pull/0.log" Dec 03 00:43:51 crc kubenswrapper[4903]: I1203 00:43:51.700339 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss_6fb28505-94cf-4bc3-add6-f11756acc2b6/util/0.log" Dec 03 00:43:51 crc kubenswrapper[4903]: I1203 00:43:51.887063 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss_6fb28505-94cf-4bc3-add6-f11756acc2b6/pull/0.log" Dec 03 00:43:51 crc kubenswrapper[4903]: I1203 00:43:51.924624 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss_6fb28505-94cf-4bc3-add6-f11756acc2b6/util/0.log" Dec 03 00:43:51 crc kubenswrapper[4903]: I1203 00:43:51.963445 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss_6fb28505-94cf-4bc3-add6-f11756acc2b6/pull/0.log" Dec 03 00:43:52 crc kubenswrapper[4903]: I1203 00:43:52.113241 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss_6fb28505-94cf-4bc3-add6-f11756acc2b6/extract/0.log" Dec 03 00:43:52 crc kubenswrapper[4903]: I1203 00:43:52.117614 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss_6fb28505-94cf-4bc3-add6-f11756acc2b6/pull/0.log" Dec 03 00:43:52 crc kubenswrapper[4903]: I1203 00:43:52.125844 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83tskss_6fb28505-94cf-4bc3-add6-f11756acc2b6/util/0.log" Dec 03 00:43:52 crc kubenswrapper[4903]: I1203 00:43:52.279307 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dtsgk_716d2470-b915-4dc5-b728-8b5c047e4df6/extract-utilities/0.log" Dec 03 00:43:52 crc kubenswrapper[4903]: I1203 00:43:52.468435 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dtsgk_716d2470-b915-4dc5-b728-8b5c047e4df6/extract-content/0.log" Dec 03 00:43:52 crc kubenswrapper[4903]: I1203 00:43:52.473720 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dtsgk_716d2470-b915-4dc5-b728-8b5c047e4df6/extract-content/0.log" Dec 03 00:43:52 crc kubenswrapper[4903]: I1203 00:43:52.500124 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dtsgk_716d2470-b915-4dc5-b728-8b5c047e4df6/extract-utilities/0.log" Dec 03 00:43:52 crc kubenswrapper[4903]: I1203 00:43:52.688685 4903 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dtsgk_716d2470-b915-4dc5-b728-8b5c047e4df6/extract-content/0.log" Dec 03 00:43:52 crc kubenswrapper[4903]: I1203 00:43:52.704890 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dtsgk_716d2470-b915-4dc5-b728-8b5c047e4df6/extract-utilities/0.log" Dec 03 00:43:52 crc kubenswrapper[4903]: I1203 00:43:52.962989 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cz9pq_4232cb32-9d6c-400d-9deb-8fb5a18f20f8/extract-utilities/0.log" Dec 03 00:43:53 crc kubenswrapper[4903]: I1203 00:43:53.069558 4903 patch_prober.go:28] interesting pod/machine-config-daemon-snl4q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 00:43:53 crc kubenswrapper[4903]: I1203 00:43:53.069617 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 00:43:53 crc kubenswrapper[4903]: I1203 00:43:53.069688 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" Dec 03 00:43:53 crc kubenswrapper[4903]: I1203 00:43:53.070537 4903 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b09207d5035946cdfbdbe4c7a2f34bdcfece73162faa7631afa02b7b1a1f24e3"} pod="openshift-machine-config-operator/machine-config-daemon-snl4q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 00:43:53 crc kubenswrapper[4903]: I1203 00:43:53.070599 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerName="machine-config-daemon" containerID="cri-o://b09207d5035946cdfbdbe4c7a2f34bdcfece73162faa7631afa02b7b1a1f24e3" gracePeriod=600 Dec 03 00:43:53 crc kubenswrapper[4903]: E1203 00:43:53.203471 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:43:53 crc kubenswrapper[4903]: I1203 00:43:53.215719 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cz9pq_4232cb32-9d6c-400d-9deb-8fb5a18f20f8/extract-content/0.log" Dec 03 00:43:53 crc kubenswrapper[4903]: I1203 00:43:53.251745 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cz9pq_4232cb32-9d6c-400d-9deb-8fb5a18f20f8/extract-content/0.log" Dec 03 00:43:53 crc kubenswrapper[4903]: I1203 00:43:53.271628 4903 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-cz9pq_4232cb32-9d6c-400d-9deb-8fb5a18f20f8/extract-utilities/0.log" Dec 03 00:43:53 crc kubenswrapper[4903]: I1203 00:43:53.404298 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dtsgk_716d2470-b915-4dc5-b728-8b5c047e4df6/registry-server/0.log" Dec 03 00:43:53 crc kubenswrapper[4903]: I1203 00:43:53.406567 4903 generic.go:334] "Generic (PLEG): container finished" podID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" containerID="b09207d5035946cdfbdbe4c7a2f34bdcfece73162faa7631afa02b7b1a1f24e3" exitCode=0 Dec 03 00:43:53 crc kubenswrapper[4903]: I1203 00:43:53.406608 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerDied","Data":"b09207d5035946cdfbdbe4c7a2f34bdcfece73162faa7631afa02b7b1a1f24e3"} Dec 03 00:43:53 crc kubenswrapper[4903]: I1203 00:43:53.406692 4903 scope.go:117] "RemoveContainer" containerID="90d1f718cdb853e4957a5cbc97aa94bf8e725212101e845c47286f852d749dfb" Dec 03 00:43:53 crc kubenswrapper[4903]: I1203 00:43:53.407446 4903 scope.go:117] "RemoveContainer" containerID="b09207d5035946cdfbdbe4c7a2f34bdcfece73162faa7631afa02b7b1a1f24e3" Dec 03 00:43:53 crc kubenswrapper[4903]: E1203 00:43:53.407800 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:43:53 crc kubenswrapper[4903]: I1203 00:43:53.502910 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cz9pq_4232cb32-9d6c-400d-9deb-8fb5a18f20f8/extract-content/0.log" Dec 03 00:43:53 crc kubenswrapper[4903]: I1203 00:43:53.507422 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cz9pq_4232cb32-9d6c-400d-9deb-8fb5a18f20f8/extract-utilities/0.log" Dec 03 00:43:53 crc kubenswrapper[4903]: I1203 00:43:53.775930 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-cp8sg_b6ef141a-9183-423b-85e6-e7a02cc32267/marketplace-operator/0.log" Dec 03 00:43:53 crc kubenswrapper[4903]: I1203 00:43:53.963433 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bxtgc_68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa/extract-utilities/0.log" Dec 03 00:43:54 crc kubenswrapper[4903]: I1203 00:43:54.179552 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bxtgc_68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa/extract-utilities/0.log" Dec 03 00:43:54 crc kubenswrapper[4903]: I1203 00:43:54.179621 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bxtgc_68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa/extract-content/0.log" Dec 03 00:43:54 crc kubenswrapper[4903]: I1203 00:43:54.184138 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bxtgc_68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa/extract-content/0.log" Dec 03 00:43:54 crc kubenswrapper[4903]: I1203 00:43:54.515427 4903 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bxtgc_68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa/extract-utilities/0.log" Dec 03 00:43:54 crc kubenswrapper[4903]: I1203 00:43:54.523141 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bxtgc_68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa/extract-content/0.log" Dec 03 00:43:54 crc kubenswrapper[4903]: I1203 00:43:54.565984 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cz9pq_4232cb32-9d6c-400d-9deb-8fb5a18f20f8/registry-server/0.log" Dec 03 00:43:54 crc kubenswrapper[4903]: I1203 00:43:54.669761 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bxtgc_68d3d00d-e9e1-4fd4-a290-2c49ec9aebaa/registry-server/0.log" Dec 03 00:43:54 crc kubenswrapper[4903]: I1203 00:43:54.685196 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v87tp_2708b032-25bc-4098-9b51-71a186f0ac30/extract-utilities/0.log" Dec 03 00:43:55 crc kubenswrapper[4903]: I1203 00:43:55.021962 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v87tp_2708b032-25bc-4098-9b51-71a186f0ac30/extract-utilities/0.log" Dec 03 00:43:55 crc kubenswrapper[4903]: I1203 00:43:55.051863 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v87tp_2708b032-25bc-4098-9b51-71a186f0ac30/extract-content/0.log" Dec 03 00:43:55 crc kubenswrapper[4903]: I1203 00:43:55.052005 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v87tp_2708b032-25bc-4098-9b51-71a186f0ac30/extract-content/0.log" Dec 03 00:43:55 crc kubenswrapper[4903]: I1203 00:43:55.201882 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v87tp_2708b032-25bc-4098-9b51-71a186f0ac30/extract-utilities/0.log" Dec 03 00:43:55 crc kubenswrapper[4903]: I1203 00:43:55.238703 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v87tp_2708b032-25bc-4098-9b51-71a186f0ac30/extract-content/0.log" Dec 03 00:43:55 crc kubenswrapper[4903]: I1203 00:43:55.888866 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v87tp_2708b032-25bc-4098-9b51-71a186f0ac30/registry-server/0.log" Dec 03 00:44:06 crc kubenswrapper[4903]: I1203 00:44:06.613924 4903 scope.go:117] "RemoveContainer" containerID="b09207d5035946cdfbdbe4c7a2f34bdcfece73162faa7631afa02b7b1a1f24e3" Dec 03 00:44:06 crc kubenswrapper[4903]: E1203 00:44:06.615382 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:44:08 crc kubenswrapper[4903]: I1203 00:44:08.978723 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-bk4xc_7dec0455-1e61-4cbc-893d-600ca1526f90/prometheus-operator/0.log" Dec 03 00:44:09 crc kubenswrapper[4903]: I1203 00:44:09.140269 4903 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-d68788f74-2nxbp_03107dd2-f5e5-4314-87ff-89c1f03811b2/prometheus-operator-admission-webhook/0.log" Dec 03 00:44:09 crc kubenswrapper[4903]: I1203 00:44:09.206501 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-d68788f74-trl6v_9de21c8e-1da1-4105-83c9-c3a0d3fef062/prometheus-operator-admission-webhook/0.log" Dec 03 00:44:09 crc kubenswrapper[4903]: I1203 00:44:09.335950 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-4cm4x_6cc0aefd-91b2-432d-8564-ab955a89620a/operator/0.log" Dec 03 00:44:09 crc kubenswrapper[4903]: I1203 00:44:09.403823 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-4rrnv_c1730406-adf5-4f90-badf-6f40bec034eb/perses-operator/0.log" Dec 03 00:44:19 crc kubenswrapper[4903]: I1203 00:44:19.613138 4903 scope.go:117] "RemoveContainer" containerID="b09207d5035946cdfbdbe4c7a2f34bdcfece73162faa7631afa02b7b1a1f24e3" Dec 03 00:44:19 crc kubenswrapper[4903]: E1203 00:44:19.613960 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:44:28 crc kubenswrapper[4903]: E1203 00:44:28.509907 4903 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.39:49486->38.102.83.39:43931: write tcp 38.102.83.39:49486->38.102.83.39:43931: write: broken pipe Dec 03 00:44:33 crc kubenswrapper[4903]: I1203 00:44:33.612913 4903 scope.go:117] "RemoveContainer" containerID="b09207d5035946cdfbdbe4c7a2f34bdcfece73162faa7631afa02b7b1a1f24e3" Dec 03 00:44:33 crc kubenswrapper[4903]: E1203 00:44:33.613717 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:44:47 crc kubenswrapper[4903]: I1203 00:44:47.612913 4903 scope.go:117] "RemoveContainer" containerID="b09207d5035946cdfbdbe4c7a2f34bdcfece73162faa7631afa02b7b1a1f24e3" Dec 03 00:44:47 crc kubenswrapper[4903]: E1203 00:44:47.614259 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:45:00 crc kubenswrapper[4903]: I1203 00:45:00.162082 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412045-nshmr"] Dec 03 00:45:00 crc kubenswrapper[4903]: E1203 00:45:00.163054 4903 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e1e6d5fd-32fa-4711-acf0-c70da688cac4" containerName="container-00" Dec 03 00:45:00 crc kubenswrapper[4903]: I1203 00:45:00.163066 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1e6d5fd-32fa-4711-acf0-c70da688cac4" containerName="container-00" Dec 03 00:45:00 crc kubenswrapper[4903]: I1203 00:45:00.163284 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1e6d5fd-32fa-4711-acf0-c70da688cac4" containerName="container-00" Dec 03 00:45:00 crc kubenswrapper[4903]: I1203 00:45:00.164002 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412045-nshmr" Dec 03 00:45:00 crc kubenswrapper[4903]: I1203 00:45:00.166946 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 00:45:00 crc kubenswrapper[4903]: I1203 00:45:00.167267 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 00:45:00 crc kubenswrapper[4903]: I1203 00:45:00.184522 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412045-nshmr"] Dec 03 00:45:00 crc kubenswrapper[4903]: I1203 00:45:00.307100 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1cc39127-e79d-4793-b9a6-982d97769597-config-volume\") pod \"collect-profiles-29412045-nshmr\" (UID: \"1cc39127-e79d-4793-b9a6-982d97769597\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412045-nshmr" Dec 03 00:45:00 crc kubenswrapper[4903]: I1203 00:45:00.307221 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj42g\" (UniqueName: \"kubernetes.io/projected/1cc39127-e79d-4793-b9a6-982d97769597-kube-api-access-kj42g\") pod \"collect-profiles-29412045-nshmr\" (UID: \"1cc39127-e79d-4793-b9a6-982d97769597\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412045-nshmr" Dec 03 00:45:00 crc kubenswrapper[4903]: I1203 00:45:00.307254 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1cc39127-e79d-4793-b9a6-982d97769597-secret-volume\") pod \"collect-profiles-29412045-nshmr\" (UID: \"1cc39127-e79d-4793-b9a6-982d97769597\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412045-nshmr" Dec 03 00:45:00 crc kubenswrapper[4903]: I1203 00:45:00.408974 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj42g\" (UniqueName: \"kubernetes.io/projected/1cc39127-e79d-4793-b9a6-982d97769597-kube-api-access-kj42g\") pod \"collect-profiles-29412045-nshmr\" (UID: \"1cc39127-e79d-4793-b9a6-982d97769597\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412045-nshmr" Dec 03 00:45:00 crc kubenswrapper[4903]: I1203 00:45:00.409032 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1cc39127-e79d-4793-b9a6-982d97769597-secret-volume\") pod \"collect-profiles-29412045-nshmr\" (UID: \"1cc39127-e79d-4793-b9a6-982d97769597\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412045-nshmr" Dec 03 00:45:00 crc kubenswrapper[4903]: I1203 00:45:00.409117 4903 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1cc39127-e79d-4793-b9a6-982d97769597-config-volume\") pod \"collect-profiles-29412045-nshmr\" (UID: \"1cc39127-e79d-4793-b9a6-982d97769597\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412045-nshmr" Dec 03 00:45:00 crc kubenswrapper[4903]: I1203 00:45:00.409952 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1cc39127-e79d-4793-b9a6-982d97769597-config-volume\") pod \"collect-profiles-29412045-nshmr\" (UID: \"1cc39127-e79d-4793-b9a6-982d97769597\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412045-nshmr" Dec 03 00:45:00 crc kubenswrapper[4903]: I1203 00:45:00.419566 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1cc39127-e79d-4793-b9a6-982d97769597-secret-volume\") pod \"collect-profiles-29412045-nshmr\" (UID: \"1cc39127-e79d-4793-b9a6-982d97769597\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412045-nshmr" Dec 03 00:45:00 crc kubenswrapper[4903]: I1203 00:45:00.445110 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj42g\" (UniqueName: \"kubernetes.io/projected/1cc39127-e79d-4793-b9a6-982d97769597-kube-api-access-kj42g\") pod \"collect-profiles-29412045-nshmr\" (UID: \"1cc39127-e79d-4793-b9a6-982d97769597\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412045-nshmr" Dec 03 00:45:00 crc kubenswrapper[4903]: I1203 00:45:00.497463 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412045-nshmr" Dec 03 00:45:00 crc kubenswrapper[4903]: I1203 00:45:00.981774 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412045-nshmr"] Dec 03 00:45:00 crc kubenswrapper[4903]: W1203 00:45:00.993522 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cc39127_e79d_4793_b9a6_982d97769597.slice/crio-42fc87b9622e069357bbcc45a5e3dcd8dffae678e26dbf1a4b315962cddd5625 WatchSource:0}: Error finding container 42fc87b9622e069357bbcc45a5e3dcd8dffae678e26dbf1a4b315962cddd5625: Status 404 returned error can't find the container with id 42fc87b9622e069357bbcc45a5e3dcd8dffae678e26dbf1a4b315962cddd5625 Dec 03 00:45:01 crc kubenswrapper[4903]: I1203 00:45:01.185075 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412045-nshmr" event={"ID":"1cc39127-e79d-4793-b9a6-982d97769597","Type":"ContainerStarted","Data":"5edf27acacb68c56464587ed98228d8c5fe448bc7648c792e4ae7a80a2841fbe"} Dec 03 00:45:01 crc kubenswrapper[4903]: I1203 00:45:01.185132 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412045-nshmr" event={"ID":"1cc39127-e79d-4793-b9a6-982d97769597","Type":"ContainerStarted","Data":"42fc87b9622e069357bbcc45a5e3dcd8dffae678e26dbf1a4b315962cddd5625"} Dec 03 00:45:01 crc kubenswrapper[4903]: I1203 00:45:01.208680 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29412045-nshmr" podStartSLOduration=1.208632932 podStartE2EDuration="1.208632932s" podCreationTimestamp="2025-12-03 00:45:00 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 00:45:01.202960665 +0000 UTC m=+6439.911514958" watchObservedRunningTime="2025-12-03 00:45:01.208632932 +0000 UTC m=+6439.917187225" Dec 03 00:45:01 crc kubenswrapper[4903]: I1203 00:45:01.621493 4903 scope.go:117] "RemoveContainer" containerID="b09207d5035946cdfbdbe4c7a2f34bdcfece73162faa7631afa02b7b1a1f24e3" Dec 03 00:45:01 crc kubenswrapper[4903]: E1203 00:45:01.622117 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:45:02 crc kubenswrapper[4903]: I1203 00:45:02.197628 4903 generic.go:334] "Generic (PLEG): container finished" podID="1cc39127-e79d-4793-b9a6-982d97769597" containerID="5edf27acacb68c56464587ed98228d8c5fe448bc7648c792e4ae7a80a2841fbe" exitCode=0 Dec 03 00:45:02 crc kubenswrapper[4903]: I1203 00:45:02.198030 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412045-nshmr" event={"ID":"1cc39127-e79d-4793-b9a6-982d97769597","Type":"ContainerDied","Data":"5edf27acacb68c56464587ed98228d8c5fe448bc7648c792e4ae7a80a2841fbe"} Dec 03 00:45:03 crc kubenswrapper[4903]: I1203 00:45:03.584001 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412045-nshmr" Dec 03 00:45:03 crc kubenswrapper[4903]: I1203 00:45:03.678053 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1cc39127-e79d-4793-b9a6-982d97769597-secret-volume\") pod \"1cc39127-e79d-4793-b9a6-982d97769597\" (UID: \"1cc39127-e79d-4793-b9a6-982d97769597\") " Dec 03 00:45:03 crc kubenswrapper[4903]: I1203 00:45:03.678141 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1cc39127-e79d-4793-b9a6-982d97769597-config-volume\") pod \"1cc39127-e79d-4793-b9a6-982d97769597\" (UID: \"1cc39127-e79d-4793-b9a6-982d97769597\") " Dec 03 00:45:03 crc kubenswrapper[4903]: I1203 00:45:03.678205 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj42g\" (UniqueName: \"kubernetes.io/projected/1cc39127-e79d-4793-b9a6-982d97769597-kube-api-access-kj42g\") pod \"1cc39127-e79d-4793-b9a6-982d97769597\" (UID: \"1cc39127-e79d-4793-b9a6-982d97769597\") " Dec 03 00:45:03 crc kubenswrapper[4903]: I1203 00:45:03.680956 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cc39127-e79d-4793-b9a6-982d97769597-config-volume" (OuterVolumeSpecName: "config-volume") pod "1cc39127-e79d-4793-b9a6-982d97769597" (UID: "1cc39127-e79d-4793-b9a6-982d97769597"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 00:45:03 crc kubenswrapper[4903]: I1203 00:45:03.701416 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cc39127-e79d-4793-b9a6-982d97769597-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1cc39127-e79d-4793-b9a6-982d97769597" (UID: "1cc39127-e79d-4793-b9a6-982d97769597"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 00:45:03 crc kubenswrapper[4903]: I1203 00:45:03.702518 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cc39127-e79d-4793-b9a6-982d97769597-kube-api-access-kj42g" (OuterVolumeSpecName: "kube-api-access-kj42g") pod "1cc39127-e79d-4793-b9a6-982d97769597" (UID: "1cc39127-e79d-4793-b9a6-982d97769597"). InnerVolumeSpecName "kube-api-access-kj42g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:45:03 crc kubenswrapper[4903]: I1203 00:45:03.780497 4903 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1cc39127-e79d-4793-b9a6-982d97769597-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 00:45:03 crc kubenswrapper[4903]: I1203 00:45:03.780540 4903 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1cc39127-e79d-4793-b9a6-982d97769597-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 00:45:03 crc kubenswrapper[4903]: I1203 00:45:03.780553 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kj42g\" (UniqueName: \"kubernetes.io/projected/1cc39127-e79d-4793-b9a6-982d97769597-kube-api-access-kj42g\") on node \"crc\" DevicePath \"\"" Dec 03 00:45:04 crc kubenswrapper[4903]: I1203 00:45:04.224251 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412045-nshmr" event={"ID":"1cc39127-e79d-4793-b9a6-982d97769597","Type":"ContainerDied","Data":"42fc87b9622e069357bbcc45a5e3dcd8dffae678e26dbf1a4b315962cddd5625"} Dec 03 00:45:04 crc kubenswrapper[4903]: I1203 00:45:04.224567 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42fc87b9622e069357bbcc45a5e3dcd8dffae678e26dbf1a4b315962cddd5625" Dec 03 00:45:04 crc kubenswrapper[4903]: I1203 00:45:04.224504 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412045-nshmr" Dec 03 00:45:04 crc kubenswrapper[4903]: I1203 00:45:04.299172 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412000-9tfc2"] Dec 03 00:45:04 crc kubenswrapper[4903]: I1203 00:45:04.316349 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412000-9tfc2"] Dec 03 00:45:05 crc kubenswrapper[4903]: I1203 00:45:05.633094 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cb38d1f-26ce-448f-8e8e-d694f3e98edd" path="/var/lib/kubelet/pods/7cb38d1f-26ce-448f-8e8e-d694f3e98edd/volumes" Dec 03 00:45:07 crc kubenswrapper[4903]: I1203 00:45:07.680471 4903 scope.go:117] "RemoveContainer" containerID="73effbfb9985c1b1d0d0c81a1e44b9e6ab68d1edee0be538bcc7f6e8809a59b9" Dec 03 00:45:14 crc kubenswrapper[4903]: I1203 00:45:14.614133 4903 scope.go:117] "RemoveContainer" containerID="b09207d5035946cdfbdbe4c7a2f34bdcfece73162faa7631afa02b7b1a1f24e3" Dec 03 00:45:14 crc kubenswrapper[4903]: E1203 00:45:14.614810 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:45:27 crc kubenswrapper[4903]: I1203 00:45:27.613295 4903 scope.go:117] "RemoveContainer" containerID="b09207d5035946cdfbdbe4c7a2f34bdcfece73162faa7631afa02b7b1a1f24e3" Dec 03 00:45:27 crc kubenswrapper[4903]: E1203 00:45:27.614161 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:45:39 crc kubenswrapper[4903]: I1203 00:45:39.618140 4903 scope.go:117] "RemoveContainer" containerID="b09207d5035946cdfbdbe4c7a2f34bdcfece73162faa7631afa02b7b1a1f24e3" Dec 03 00:45:39 crc kubenswrapper[4903]: E1203 00:45:39.619442 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:45:52 crc kubenswrapper[4903]: I1203 00:45:52.612070 4903 scope.go:117] "RemoveContainer" containerID="b09207d5035946cdfbdbe4c7a2f34bdcfece73162faa7631afa02b7b1a1f24e3" Dec 03 00:45:52 crc kubenswrapper[4903]: E1203 00:45:52.613841 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:46:07 crc kubenswrapper[4903]: I1203 00:46:07.613752 4903 scope.go:117] "RemoveContainer" containerID="b09207d5035946cdfbdbe4c7a2f34bdcfece73162faa7631afa02b7b1a1f24e3" Dec 03 00:46:07 crc kubenswrapper[4903]: E1203 00:46:07.637603 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:46:08 crc kubenswrapper[4903]: I1203 00:46:08.006626 4903 generic.go:334] "Generic (PLEG): container finished" podID="d67acc3f-33a9-4d90-9ecc-ae53be48e1dc" containerID="bab92c3e220006c2fd364d739e973ee717f16d3c97f5f67c3a376a2fc36ca4ab" exitCode=0 Dec 03 00:46:08 crc kubenswrapper[4903]: I1203 00:46:08.006740 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vn4gt/must-gather-9ntbg" event={"ID":"d67acc3f-33a9-4d90-9ecc-ae53be48e1dc","Type":"ContainerDied","Data":"bab92c3e220006c2fd364d739e973ee717f16d3c97f5f67c3a376a2fc36ca4ab"} Dec 03 00:46:08 crc kubenswrapper[4903]: I1203 00:46:08.008025 4903 scope.go:117] "RemoveContainer" containerID="bab92c3e220006c2fd364d739e973ee717f16d3c97f5f67c3a376a2fc36ca4ab" Dec 03 00:46:08 crc kubenswrapper[4903]: I1203 00:46:08.890750 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vn4gt_must-gather-9ntbg_d67acc3f-33a9-4d90-9ecc-ae53be48e1dc/gather/0.log" Dec 03 00:46:21 crc kubenswrapper[4903]: I1203 00:46:21.564726 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vn4gt/must-gather-9ntbg"] Dec 03 00:46:21 crc kubenswrapper[4903]: I1203 00:46:21.565566 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-vn4gt/must-gather-9ntbg" podUID="d67acc3f-33a9-4d90-9ecc-ae53be48e1dc" containerName="copy" containerID="cri-o://a73602912c79333fb267945174954cabb279b6f6d475891f3e39aa95f36f1401" gracePeriod=2 Dec 03 00:46:21 crc kubenswrapper[4903]: I1203 00:46:21.575064 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vn4gt/must-gather-9ntbg"] Dec 03 00:46:22 crc kubenswrapper[4903]: I1203 00:46:22.002110 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vn4gt_must-gather-9ntbg_d67acc3f-33a9-4d90-9ecc-ae53be48e1dc/copy/0.log" Dec 03 00:46:22 crc kubenswrapper[4903]: I1203 00:46:22.003305 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vn4gt/must-gather-9ntbg" Dec 03 00:46:22 crc kubenswrapper[4903]: I1203 00:46:22.120112 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pb92v\" (UniqueName: \"kubernetes.io/projected/d67acc3f-33a9-4d90-9ecc-ae53be48e1dc-kube-api-access-pb92v\") pod \"d67acc3f-33a9-4d90-9ecc-ae53be48e1dc\" (UID: \"d67acc3f-33a9-4d90-9ecc-ae53be48e1dc\") " Dec 03 00:46:22 crc kubenswrapper[4903]: I1203 00:46:22.120267 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d67acc3f-33a9-4d90-9ecc-ae53be48e1dc-must-gather-output\") pod \"d67acc3f-33a9-4d90-9ecc-ae53be48e1dc\" (UID: \"d67acc3f-33a9-4d90-9ecc-ae53be48e1dc\") " Dec 03 00:46:22 crc kubenswrapper[4903]: I1203 00:46:22.153728 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d67acc3f-33a9-4d90-9ecc-ae53be48e1dc-kube-api-access-pb92v" (OuterVolumeSpecName: "kube-api-access-pb92v") pod "d67acc3f-33a9-4d90-9ecc-ae53be48e1dc" (UID: "d67acc3f-33a9-4d90-9ecc-ae53be48e1dc"). InnerVolumeSpecName "kube-api-access-pb92v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:46:22 crc kubenswrapper[4903]: I1203 00:46:22.226783 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pb92v\" (UniqueName: \"kubernetes.io/projected/d67acc3f-33a9-4d90-9ecc-ae53be48e1dc-kube-api-access-pb92v\") on node \"crc\" DevicePath \"\"" Dec 03 00:46:22 crc kubenswrapper[4903]: I1203 00:46:22.228094 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vn4gt_must-gather-9ntbg_d67acc3f-33a9-4d90-9ecc-ae53be48e1dc/copy/0.log" Dec 03 00:46:22 crc kubenswrapper[4903]: I1203 00:46:22.228753 4903 generic.go:334] "Generic (PLEG): container finished" podID="d67acc3f-33a9-4d90-9ecc-ae53be48e1dc" containerID="a73602912c79333fb267945174954cabb279b6f6d475891f3e39aa95f36f1401" exitCode=143 Dec 03 00:46:22 crc kubenswrapper[4903]: I1203 00:46:22.228812 4903 scope.go:117] "RemoveContainer" containerID="a73602912c79333fb267945174954cabb279b6f6d475891f3e39aa95f36f1401" Dec 03 00:46:22 crc kubenswrapper[4903]: I1203 00:46:22.228895 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vn4gt/must-gather-9ntbg" Dec 03 00:46:22 crc kubenswrapper[4903]: I1203 00:46:22.265065 4903 scope.go:117] "RemoveContainer" containerID="bab92c3e220006c2fd364d739e973ee717f16d3c97f5f67c3a376a2fc36ca4ab" Dec 03 00:46:22 crc kubenswrapper[4903]: I1203 00:46:22.304086 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d67acc3f-33a9-4d90-9ecc-ae53be48e1dc-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "d67acc3f-33a9-4d90-9ecc-ae53be48e1dc" (UID: "d67acc3f-33a9-4d90-9ecc-ae53be48e1dc"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:46:22 crc kubenswrapper[4903]: I1203 00:46:22.321263 4903 scope.go:117] "RemoveContainer" containerID="a73602912c79333fb267945174954cabb279b6f6d475891f3e39aa95f36f1401" Dec 03 00:46:22 crc kubenswrapper[4903]: E1203 00:46:22.322015 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a73602912c79333fb267945174954cabb279b6f6d475891f3e39aa95f36f1401\": container with ID starting with a73602912c79333fb267945174954cabb279b6f6d475891f3e39aa95f36f1401 not found: ID does not exist" containerID="a73602912c79333fb267945174954cabb279b6f6d475891f3e39aa95f36f1401" Dec 03 00:46:22 crc kubenswrapper[4903]: I1203 00:46:22.322048 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a73602912c79333fb267945174954cabb279b6f6d475891f3e39aa95f36f1401"} err="failed to get container status \"a73602912c79333fb267945174954cabb279b6f6d475891f3e39aa95f36f1401\": rpc error: code = NotFound desc = could not find container \"a73602912c79333fb267945174954cabb279b6f6d475891f3e39aa95f36f1401\": container with ID starting with a73602912c79333fb267945174954cabb279b6f6d475891f3e39aa95f36f1401 not found: ID does not exist" Dec 03 00:46:22 crc kubenswrapper[4903]: I1203 00:46:22.322067 4903 scope.go:117] "RemoveContainer" containerID="bab92c3e220006c2fd364d739e973ee717f16d3c97f5f67c3a376a2fc36ca4ab" Dec 03 00:46:22 crc kubenswrapper[4903]: E1203 00:46:22.322392 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bab92c3e220006c2fd364d739e973ee717f16d3c97f5f67c3a376a2fc36ca4ab\": container with ID starting with bab92c3e220006c2fd364d739e973ee717f16d3c97f5f67c3a376a2fc36ca4ab not found: ID does not exist" containerID="bab92c3e220006c2fd364d739e973ee717f16d3c97f5f67c3a376a2fc36ca4ab" Dec 03 00:46:22 crc kubenswrapper[4903]: I1203 00:46:22.322443 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bab92c3e220006c2fd364d739e973ee717f16d3c97f5f67c3a376a2fc36ca4ab"} err="failed to get container status \"bab92c3e220006c2fd364d739e973ee717f16d3c97f5f67c3a376a2fc36ca4ab\": rpc error: code = NotFound desc = could not find container \"bab92c3e220006c2fd364d739e973ee717f16d3c97f5f67c3a376a2fc36ca4ab\": container with ID starting with bab92c3e220006c2fd364d739e973ee717f16d3c97f5f67c3a376a2fc36ca4ab not found: ID does not exist" Dec 03 00:46:22 crc kubenswrapper[4903]: I1203 00:46:22.329043 4903 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d67acc3f-33a9-4d90-9ecc-ae53be48e1dc-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 03 00:46:22 crc kubenswrapper[4903]: I1203 00:46:22.613265 4903 scope.go:117] "RemoveContainer" containerID="b09207d5035946cdfbdbe4c7a2f34bdcfece73162faa7631afa02b7b1a1f24e3" Dec 03 00:46:22 crc kubenswrapper[4903]: E1203 00:46:22.613696 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:46:23 crc kubenswrapper[4903]: I1203 00:46:23.622901 4903 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d67acc3f-33a9-4d90-9ecc-ae53be48e1dc" path="/var/lib/kubelet/pods/d67acc3f-33a9-4d90-9ecc-ae53be48e1dc/volumes" Dec 03 00:46:33 crc kubenswrapper[4903]: I1203 00:46:33.612394 4903 scope.go:117] "RemoveContainer" containerID="b09207d5035946cdfbdbe4c7a2f34bdcfece73162faa7631afa02b7b1a1f24e3" Dec 03 00:46:33 crc kubenswrapper[4903]: E1203 00:46:33.613446 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:46:47 crc kubenswrapper[4903]: I1203 00:46:47.612217 4903 scope.go:117] "RemoveContainer" containerID="b09207d5035946cdfbdbe4c7a2f34bdcfece73162faa7631afa02b7b1a1f24e3" Dec 03 00:46:47 crc kubenswrapper[4903]: E1203 00:46:47.612961 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:47:00 crc kubenswrapper[4903]: I1203 00:47:00.118014 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w49l4"] Dec 03 00:47:00 crc kubenswrapper[4903]: E1203 00:47:00.119069 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d67acc3f-33a9-4d90-9ecc-ae53be48e1dc" containerName="gather" Dec 03 00:47:00 crc kubenswrapper[4903]: I1203 00:47:00.119085 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="d67acc3f-33a9-4d90-9ecc-ae53be48e1dc" containerName="gather" Dec 03 00:47:00 crc kubenswrapper[4903]: E1203 00:47:00.119108 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cc39127-e79d-4793-b9a6-982d97769597" containerName="collect-profiles" Dec 03 00:47:00 crc kubenswrapper[4903]: I1203 00:47:00.119114 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cc39127-e79d-4793-b9a6-982d97769597" containerName="collect-profiles" Dec 03 00:47:00 crc kubenswrapper[4903]: E1203 00:47:00.119145 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d67acc3f-33a9-4d90-9ecc-ae53be48e1dc" containerName="copy" Dec 03 00:47:00 crc kubenswrapper[4903]: I1203 00:47:00.119151 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="d67acc3f-33a9-4d90-9ecc-ae53be48e1dc" containerName="copy" Dec 03 00:47:00 crc kubenswrapper[4903]: I1203 00:47:00.119333 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="d67acc3f-33a9-4d90-9ecc-ae53be48e1dc" containerName="copy" Dec 03 00:47:00 crc kubenswrapper[4903]: I1203 00:47:00.119360 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cc39127-e79d-4793-b9a6-982d97769597" containerName="collect-profiles" Dec 03 00:47:00 crc kubenswrapper[4903]: I1203 00:47:00.119372 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="d67acc3f-33a9-4d90-9ecc-ae53be48e1dc" containerName="gather" Dec 03 00:47:00 crc kubenswrapper[4903]: I1203 00:47:00.122754 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w49l4" Dec 03 00:47:00 crc kubenswrapper[4903]: I1203 00:47:00.134310 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w49l4"] Dec 03 00:47:00 crc kubenswrapper[4903]: I1203 00:47:00.273607 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/639bd82b-de37-46da-a99c-396f9a5d5b2e-utilities\") pod \"redhat-operators-w49l4\" (UID: \"639bd82b-de37-46da-a99c-396f9a5d5b2e\") " pod="openshift-marketplace/redhat-operators-w49l4" Dec 03 00:47:00 crc kubenswrapper[4903]: I1203 00:47:00.274155 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/639bd82b-de37-46da-a99c-396f9a5d5b2e-catalog-content\") pod \"redhat-operators-w49l4\" (UID: \"639bd82b-de37-46da-a99c-396f9a5d5b2e\") " pod="openshift-marketplace/redhat-operators-w49l4" Dec 03 00:47:00 crc kubenswrapper[4903]: I1203 00:47:00.274380 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drblv\" (UniqueName: \"kubernetes.io/projected/639bd82b-de37-46da-a99c-396f9a5d5b2e-kube-api-access-drblv\") pod \"redhat-operators-w49l4\" (UID: \"639bd82b-de37-46da-a99c-396f9a5d5b2e\") " pod="openshift-marketplace/redhat-operators-w49l4" Dec 03 00:47:00 crc kubenswrapper[4903]: I1203 00:47:00.376110 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/639bd82b-de37-46da-a99c-396f9a5d5b2e-catalog-content\") pod \"redhat-operators-w49l4\" (UID: \"639bd82b-de37-46da-a99c-396f9a5d5b2e\") " pod="openshift-marketplace/redhat-operators-w49l4" Dec 03 00:47:00 crc kubenswrapper[4903]: I1203 00:47:00.377531 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drblv\" (UniqueName: \"kubernetes.io/projected/639bd82b-de37-46da-a99c-396f9a5d5b2e-kube-api-access-drblv\") pod \"redhat-operators-w49l4\" (UID: \"639bd82b-de37-46da-a99c-396f9a5d5b2e\") " pod="openshift-marketplace/redhat-operators-w49l4" Dec 03 00:47:00 crc kubenswrapper[4903]: I1203 00:47:00.378136 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/639bd82b-de37-46da-a99c-396f9a5d5b2e-utilities\") pod \"redhat-operators-w49l4\" (UID: \"639bd82b-de37-46da-a99c-396f9a5d5b2e\") " pod="openshift-marketplace/redhat-operators-w49l4" Dec 03 00:47:00 crc kubenswrapper[4903]: I1203 00:47:00.376563 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/639bd82b-de37-46da-a99c-396f9a5d5b2e-catalog-content\") pod \"redhat-operators-w49l4\" (UID: \"639bd82b-de37-46da-a99c-396f9a5d5b2e\") " pod="openshift-marketplace/redhat-operators-w49l4" Dec 03 00:47:00 crc kubenswrapper[4903]: I1203 00:47:00.378478 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/639bd82b-de37-46da-a99c-396f9a5d5b2e-utilities\") pod \"redhat-operators-w49l4\" (UID: \"639bd82b-de37-46da-a99c-396f9a5d5b2e\") " pod="openshift-marketplace/redhat-operators-w49l4" Dec 03 00:47:00 crc kubenswrapper[4903]: I1203 00:47:00.402998 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-drblv\" (UniqueName: \"kubernetes.io/projected/639bd82b-de37-46da-a99c-396f9a5d5b2e-kube-api-access-drblv\") pod \"redhat-operators-w49l4\" (UID: \"639bd82b-de37-46da-a99c-396f9a5d5b2e\") " pod="openshift-marketplace/redhat-operators-w49l4" Dec 03 00:47:00 crc kubenswrapper[4903]: I1203 00:47:00.445306 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w49l4" Dec 03 00:47:00 crc kubenswrapper[4903]: I1203 00:47:00.819857 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w49l4"] Dec 03 00:47:01 crc kubenswrapper[4903]: I1203 00:47:01.621616 4903 scope.go:117] "RemoveContainer" containerID="b09207d5035946cdfbdbe4c7a2f34bdcfece73162faa7631afa02b7b1a1f24e3" Dec 03 00:47:01 crc kubenswrapper[4903]: E1203 00:47:01.621993 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:47:01 crc kubenswrapper[4903]: I1203 00:47:01.646753 4903 generic.go:334] "Generic (PLEG): container finished" podID="639bd82b-de37-46da-a99c-396f9a5d5b2e" containerID="189a51100b48a300fd023e86d583b5391630ac62100059f7fce002d928d121bf" exitCode=0 Dec 03 00:47:01 crc kubenswrapper[4903]: I1203 00:47:01.646793 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w49l4" event={"ID":"639bd82b-de37-46da-a99c-396f9a5d5b2e","Type":"ContainerDied","Data":"189a51100b48a300fd023e86d583b5391630ac62100059f7fce002d928d121bf"} Dec 03 00:47:01 crc kubenswrapper[4903]: I1203 00:47:01.646818 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w49l4" event={"ID":"639bd82b-de37-46da-a99c-396f9a5d5b2e","Type":"ContainerStarted","Data":"fd13451eded4231b137d68ed5fbaa81d239a509e67c47e539574de724137c434"} Dec 03 00:47:01 crc kubenswrapper[4903]: I1203 00:47:01.648975 4903 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 00:47:03 crc kubenswrapper[4903]: I1203 00:47:03.670700 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w49l4" event={"ID":"639bd82b-de37-46da-a99c-396f9a5d5b2e","Type":"ContainerStarted","Data":"3a6b6119af56f858a254179de69de9f519456d67fcff8972714e9814a96cfdd6"} Dec 03 00:47:04 crc kubenswrapper[4903]: I1203 00:47:04.682445 4903 generic.go:334] "Generic (PLEG): container finished" podID="639bd82b-de37-46da-a99c-396f9a5d5b2e" containerID="3a6b6119af56f858a254179de69de9f519456d67fcff8972714e9814a96cfdd6" exitCode=0 Dec 03 00:47:04 crc kubenswrapper[4903]: I1203 00:47:04.682478 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w49l4" event={"ID":"639bd82b-de37-46da-a99c-396f9a5d5b2e","Type":"ContainerDied","Data":"3a6b6119af56f858a254179de69de9f519456d67fcff8972714e9814a96cfdd6"} Dec 03 00:47:05 crc kubenswrapper[4903]: I1203 00:47:05.697994 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w49l4" 
event={"ID":"639bd82b-de37-46da-a99c-396f9a5d5b2e","Type":"ContainerStarted","Data":"4c1770a0515f30e30f563af48f9150dec2a321ea0f0be67cd2c892e08aec6d91"} Dec 03 00:47:05 crc kubenswrapper[4903]: I1203 00:47:05.731368 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w49l4" podStartSLOduration=2.2230356430000002 podStartE2EDuration="5.731341847s" podCreationTimestamp="2025-12-03 00:47:00 +0000 UTC" firstStartedPulling="2025-12-03 00:47:01.648780944 +0000 UTC m=+6560.357335227" lastFinishedPulling="2025-12-03 00:47:05.157087148 +0000 UTC m=+6563.865641431" observedRunningTime="2025-12-03 00:47:05.727641247 +0000 UTC m=+6564.436195530" watchObservedRunningTime="2025-12-03 00:47:05.731341847 +0000 UTC m=+6564.439896160" Dec 03 00:47:07 crc kubenswrapper[4903]: I1203 00:47:07.774601 4903 scope.go:117] "RemoveContainer" containerID="3fba698e82a8f82e77bff5e33708cc2da4c1d6000ffcacb2ebd171c60a6c3aea" Dec 03 00:47:07 crc kubenswrapper[4903]: I1203 00:47:07.803982 4903 scope.go:117] "RemoveContainer" containerID="97d0fc63998e4ae367314351dec38cb4b0c58dd4b260cc4ff8cc1a806d9b04d1" Dec 03 00:47:07 crc kubenswrapper[4903]: I1203 00:47:07.852022 4903 scope.go:117] "RemoveContainer" containerID="681f4eb1eea154be6a0ff032a8e793cb6ad05b13cf5003349b09b121a2da46e0" Dec 03 00:47:07 crc kubenswrapper[4903]: I1203 00:47:07.869566 4903 scope.go:117] "RemoveContainer" containerID="0909e3cf1eae474ec61b1f3d7ded8d0fdbfaf27836f9fbda25c2831234ce3131" Dec 03 00:47:07 crc kubenswrapper[4903]: I1203 00:47:07.921227 4903 scope.go:117] "RemoveContainer" containerID="307e5df14806344fa932b63f225e6e76a7b50f6f80823c3a8373398108a666db" Dec 03 00:47:10 crc kubenswrapper[4903]: I1203 00:47:10.445847 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w49l4" Dec 03 00:47:10 crc kubenswrapper[4903]: I1203 00:47:10.446312 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w49l4" Dec 03 00:47:11 crc kubenswrapper[4903]: I1203 00:47:11.536977 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w49l4" podUID="639bd82b-de37-46da-a99c-396f9a5d5b2e" containerName="registry-server" probeResult="failure" output=< Dec 03 00:47:11 crc kubenswrapper[4903]: timeout: failed to connect service ":50051" within 1s Dec 03 00:47:11 crc kubenswrapper[4903]: > Dec 03 00:47:15 crc kubenswrapper[4903]: I1203 00:47:15.613638 4903 scope.go:117] "RemoveContainer" containerID="b09207d5035946cdfbdbe4c7a2f34bdcfece73162faa7631afa02b7b1a1f24e3" Dec 03 00:47:15 crc kubenswrapper[4903]: E1203 00:47:15.616419 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:47:20 crc kubenswrapper[4903]: I1203 00:47:20.525435 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w49l4" Dec 03 00:47:20 crc kubenswrapper[4903]: I1203 00:47:20.617892 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w49l4" Dec 03 00:47:20 crc 
kubenswrapper[4903]: I1203 00:47:20.783285 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w49l4"] Dec 03 00:47:21 crc kubenswrapper[4903]: I1203 00:47:21.902130 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w49l4" podUID="639bd82b-de37-46da-a99c-396f9a5d5b2e" containerName="registry-server" containerID="cri-o://4c1770a0515f30e30f563af48f9150dec2a321ea0f0be67cd2c892e08aec6d91" gracePeriod=2 Dec 03 00:47:22 crc kubenswrapper[4903]: I1203 00:47:22.575764 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w49l4" Dec 03 00:47:22 crc kubenswrapper[4903]: I1203 00:47:22.687596 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/639bd82b-de37-46da-a99c-396f9a5d5b2e-utilities\") pod \"639bd82b-de37-46da-a99c-396f9a5d5b2e\" (UID: \"639bd82b-de37-46da-a99c-396f9a5d5b2e\") " Dec 03 00:47:22 crc kubenswrapper[4903]: I1203 00:47:22.687861 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drblv\" (UniqueName: \"kubernetes.io/projected/639bd82b-de37-46da-a99c-396f9a5d5b2e-kube-api-access-drblv\") pod \"639bd82b-de37-46da-a99c-396f9a5d5b2e\" (UID: \"639bd82b-de37-46da-a99c-396f9a5d5b2e\") " Dec 03 00:47:22 crc kubenswrapper[4903]: I1203 00:47:22.687889 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/639bd82b-de37-46da-a99c-396f9a5d5b2e-catalog-content\") pod \"639bd82b-de37-46da-a99c-396f9a5d5b2e\" (UID: \"639bd82b-de37-46da-a99c-396f9a5d5b2e\") " Dec 03 00:47:22 crc kubenswrapper[4903]: I1203 00:47:22.688566 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/639bd82b-de37-46da-a99c-396f9a5d5b2e-utilities" (OuterVolumeSpecName: "utilities") pod "639bd82b-de37-46da-a99c-396f9a5d5b2e" (UID: "639bd82b-de37-46da-a99c-396f9a5d5b2e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:47:22 crc kubenswrapper[4903]: I1203 00:47:22.693098 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/639bd82b-de37-46da-a99c-396f9a5d5b2e-kube-api-access-drblv" (OuterVolumeSpecName: "kube-api-access-drblv") pod "639bd82b-de37-46da-a99c-396f9a5d5b2e" (UID: "639bd82b-de37-46da-a99c-396f9a5d5b2e"). InnerVolumeSpecName "kube-api-access-drblv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:47:22 crc kubenswrapper[4903]: I1203 00:47:22.790045 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drblv\" (UniqueName: \"kubernetes.io/projected/639bd82b-de37-46da-a99c-396f9a5d5b2e-kube-api-access-drblv\") on node \"crc\" DevicePath \"\"" Dec 03 00:47:22 crc kubenswrapper[4903]: I1203 00:47:22.790073 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/639bd82b-de37-46da-a99c-396f9a5d5b2e-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 00:47:22 crc kubenswrapper[4903]: I1203 00:47:22.803101 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/639bd82b-de37-46da-a99c-396f9a5d5b2e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "639bd82b-de37-46da-a99c-396f9a5d5b2e" (UID: "639bd82b-de37-46da-a99c-396f9a5d5b2e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:47:22 crc kubenswrapper[4903]: I1203 00:47:22.892245 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/639bd82b-de37-46da-a99c-396f9a5d5b2e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 00:47:22 crc kubenswrapper[4903]: I1203 00:47:22.916579 4903 generic.go:334] "Generic (PLEG): container finished" podID="639bd82b-de37-46da-a99c-396f9a5d5b2e" containerID="4c1770a0515f30e30f563af48f9150dec2a321ea0f0be67cd2c892e08aec6d91" exitCode=0 Dec 03 00:47:22 crc kubenswrapper[4903]: I1203 00:47:22.916713 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w49l4" Dec 03 00:47:22 crc kubenswrapper[4903]: I1203 00:47:22.916687 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w49l4" event={"ID":"639bd82b-de37-46da-a99c-396f9a5d5b2e","Type":"ContainerDied","Data":"4c1770a0515f30e30f563af48f9150dec2a321ea0f0be67cd2c892e08aec6d91"} Dec 03 00:47:22 crc kubenswrapper[4903]: I1203 00:47:22.916897 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w49l4" event={"ID":"639bd82b-de37-46da-a99c-396f9a5d5b2e","Type":"ContainerDied","Data":"fd13451eded4231b137d68ed5fbaa81d239a509e67c47e539574de724137c434"} Dec 03 00:47:22 crc kubenswrapper[4903]: I1203 00:47:22.916917 4903 scope.go:117] "RemoveContainer" containerID="4c1770a0515f30e30f563af48f9150dec2a321ea0f0be67cd2c892e08aec6d91" Dec 03 00:47:22 crc kubenswrapper[4903]: I1203 00:47:22.961102 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w49l4"] Dec 03 00:47:22 crc kubenswrapper[4903]: I1203 00:47:22.964142 4903 scope.go:117] "RemoveContainer" containerID="3a6b6119af56f858a254179de69de9f519456d67fcff8972714e9814a96cfdd6" Dec 03 00:47:22 crc kubenswrapper[4903]: I1203 00:47:22.974593 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w49l4"] Dec 03 00:47:22 crc kubenswrapper[4903]: I1203 00:47:22.997211 4903 scope.go:117] "RemoveContainer" containerID="189a51100b48a300fd023e86d583b5391630ac62100059f7fce002d928d121bf" Dec 03 00:47:23 crc kubenswrapper[4903]: I1203 00:47:23.060106 4903 scope.go:117] "RemoveContainer" containerID="4c1770a0515f30e30f563af48f9150dec2a321ea0f0be67cd2c892e08aec6d91" Dec 03 00:47:23 crc kubenswrapper[4903]: E1203 00:47:23.062575 4903 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c1770a0515f30e30f563af48f9150dec2a321ea0f0be67cd2c892e08aec6d91\": container with ID starting with 4c1770a0515f30e30f563af48f9150dec2a321ea0f0be67cd2c892e08aec6d91 not found: ID does not exist" containerID="4c1770a0515f30e30f563af48f9150dec2a321ea0f0be67cd2c892e08aec6d91" Dec 03 00:47:23 crc kubenswrapper[4903]: I1203 00:47:23.062644 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c1770a0515f30e30f563af48f9150dec2a321ea0f0be67cd2c892e08aec6d91"} err="failed to get container status \"4c1770a0515f30e30f563af48f9150dec2a321ea0f0be67cd2c892e08aec6d91\": rpc error: code = NotFound desc = could not find container \"4c1770a0515f30e30f563af48f9150dec2a321ea0f0be67cd2c892e08aec6d91\": container with ID starting with 4c1770a0515f30e30f563af48f9150dec2a321ea0f0be67cd2c892e08aec6d91 not found: ID does not exist" Dec 03 00:47:23 crc kubenswrapper[4903]: I1203 00:47:23.062717 4903 scope.go:117] "RemoveContainer" containerID="3a6b6119af56f858a254179de69de9f519456d67fcff8972714e9814a96cfdd6" Dec 03 00:47:23 crc kubenswrapper[4903]: E1203 00:47:23.063517 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a6b6119af56f858a254179de69de9f519456d67fcff8972714e9814a96cfdd6\": container with ID starting with 3a6b6119af56f858a254179de69de9f519456d67fcff8972714e9814a96cfdd6 not found: ID does not exist" containerID="3a6b6119af56f858a254179de69de9f519456d67fcff8972714e9814a96cfdd6" Dec 03 00:47:23 crc kubenswrapper[4903]: I1203 00:47:23.063565 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a6b6119af56f858a254179de69de9f519456d67fcff8972714e9814a96cfdd6"} err="failed to get container status \"3a6b6119af56f858a254179de69de9f519456d67fcff8972714e9814a96cfdd6\": rpc error: code = NotFound desc = could not find container \"3a6b6119af56f858a254179de69de9f519456d67fcff8972714e9814a96cfdd6\": container with ID starting with 3a6b6119af56f858a254179de69de9f519456d67fcff8972714e9814a96cfdd6 not found: ID does not exist" Dec 03 00:47:23 crc kubenswrapper[4903]: I1203 00:47:23.063591 4903 scope.go:117] "RemoveContainer" containerID="189a51100b48a300fd023e86d583b5391630ac62100059f7fce002d928d121bf" Dec 03 00:47:23 crc kubenswrapper[4903]: E1203 00:47:23.064197 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"189a51100b48a300fd023e86d583b5391630ac62100059f7fce002d928d121bf\": container with ID starting with 189a51100b48a300fd023e86d583b5391630ac62100059f7fce002d928d121bf not found: ID does not exist" containerID="189a51100b48a300fd023e86d583b5391630ac62100059f7fce002d928d121bf" Dec 03 00:47:23 crc kubenswrapper[4903]: I1203 00:47:23.064226 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"189a51100b48a300fd023e86d583b5391630ac62100059f7fce002d928d121bf"} err="failed to get container status \"189a51100b48a300fd023e86d583b5391630ac62100059f7fce002d928d121bf\": rpc error: code = NotFound desc = could not find container \"189a51100b48a300fd023e86d583b5391630ac62100059f7fce002d928d121bf\": container with ID starting with 189a51100b48a300fd023e86d583b5391630ac62100059f7fce002d928d121bf not found: ID does not exist" Dec 03 00:47:23 crc kubenswrapper[4903]: I1203 00:47:23.629552 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="639bd82b-de37-46da-a99c-396f9a5d5b2e" path="/var/lib/kubelet/pods/639bd82b-de37-46da-a99c-396f9a5d5b2e/volumes" Dec 03 00:47:28 crc kubenswrapper[4903]: I1203 00:47:28.612908 4903 scope.go:117] "RemoveContainer" containerID="b09207d5035946cdfbdbe4c7a2f34bdcfece73162faa7631afa02b7b1a1f24e3" Dec 03 00:47:28 crc kubenswrapper[4903]: E1203 00:47:28.613781 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:47:37 crc kubenswrapper[4903]: I1203 00:47:37.370770 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xxvss"] Dec 03 00:47:37 crc kubenswrapper[4903]: E1203 00:47:37.371794 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="639bd82b-de37-46da-a99c-396f9a5d5b2e" containerName="registry-server" Dec 03 00:47:37 crc kubenswrapper[4903]: I1203 00:47:37.371812 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="639bd82b-de37-46da-a99c-396f9a5d5b2e" containerName="registry-server" Dec 03 00:47:37 crc kubenswrapper[4903]: E1203 00:47:37.371839 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="639bd82b-de37-46da-a99c-396f9a5d5b2e" containerName="extract-utilities" Dec 03 00:47:37 crc kubenswrapper[4903]: I1203 00:47:37.371847 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="639bd82b-de37-46da-a99c-396f9a5d5b2e" containerName="extract-utilities" Dec 03 00:47:37 crc kubenswrapper[4903]: E1203 00:47:37.371881 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="639bd82b-de37-46da-a99c-396f9a5d5b2e" containerName="extract-content" Dec 03 00:47:37 crc kubenswrapper[4903]: I1203 00:47:37.371889 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="639bd82b-de37-46da-a99c-396f9a5d5b2e" containerName="extract-content" Dec 03 00:47:37 crc kubenswrapper[4903]: I1203 00:47:37.372146 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="639bd82b-de37-46da-a99c-396f9a5d5b2e" containerName="registry-server" Dec 03 00:47:37 crc kubenswrapper[4903]: I1203 00:47:37.374021 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xxvss" Dec 03 00:47:37 crc kubenswrapper[4903]: I1203 00:47:37.398315 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xxvss"] Dec 03 00:47:37 crc kubenswrapper[4903]: I1203 00:47:37.502080 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znkjx\" (UniqueName: \"kubernetes.io/projected/861fedaf-0e79-4d08-8aaf-70a49e243897-kube-api-access-znkjx\") pod \"redhat-marketplace-xxvss\" (UID: \"861fedaf-0e79-4d08-8aaf-70a49e243897\") " pod="openshift-marketplace/redhat-marketplace-xxvss" Dec 03 00:47:37 crc kubenswrapper[4903]: I1203 00:47:37.502167 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/861fedaf-0e79-4d08-8aaf-70a49e243897-utilities\") pod \"redhat-marketplace-xxvss\" (UID: \"861fedaf-0e79-4d08-8aaf-70a49e243897\") " pod="openshift-marketplace/redhat-marketplace-xxvss" Dec 03 00:47:37 crc kubenswrapper[4903]: I1203 00:47:37.502526 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/861fedaf-0e79-4d08-8aaf-70a49e243897-catalog-content\") pod \"redhat-marketplace-xxvss\" (UID: \"861fedaf-0e79-4d08-8aaf-70a49e243897\") " pod="openshift-marketplace/redhat-marketplace-xxvss" Dec 03 00:47:37 crc kubenswrapper[4903]: I1203 00:47:37.605122 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/861fedaf-0e79-4d08-8aaf-70a49e243897-catalog-content\") pod \"redhat-marketplace-xxvss\" (UID: \"861fedaf-0e79-4d08-8aaf-70a49e243897\") " pod="openshift-marketplace/redhat-marketplace-xxvss" Dec 03 00:47:37 crc kubenswrapper[4903]: I1203 00:47:37.605376 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znkjx\" (UniqueName: \"kubernetes.io/projected/861fedaf-0e79-4d08-8aaf-70a49e243897-kube-api-access-znkjx\") pod \"redhat-marketplace-xxvss\" (UID: \"861fedaf-0e79-4d08-8aaf-70a49e243897\") " pod="openshift-marketplace/redhat-marketplace-xxvss" Dec 03 00:47:37 crc kubenswrapper[4903]: I1203 00:47:37.605467 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/861fedaf-0e79-4d08-8aaf-70a49e243897-utilities\") pod \"redhat-marketplace-xxvss\" (UID: \"861fedaf-0e79-4d08-8aaf-70a49e243897\") " pod="openshift-marketplace/redhat-marketplace-xxvss" Dec 03 00:47:37 crc kubenswrapper[4903]: I1203 00:47:37.605800 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/861fedaf-0e79-4d08-8aaf-70a49e243897-catalog-content\") pod \"redhat-marketplace-xxvss\" (UID: \"861fedaf-0e79-4d08-8aaf-70a49e243897\") " pod="openshift-marketplace/redhat-marketplace-xxvss" Dec 03 00:47:37 crc kubenswrapper[4903]: I1203 00:47:37.606152 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/861fedaf-0e79-4d08-8aaf-70a49e243897-utilities\") pod \"redhat-marketplace-xxvss\" (UID: \"861fedaf-0e79-4d08-8aaf-70a49e243897\") " pod="openshift-marketplace/redhat-marketplace-xxvss" Dec 03 00:47:37 crc kubenswrapper[4903]: I1203 00:47:37.628184 4903 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-znkjx\" (UniqueName: \"kubernetes.io/projected/861fedaf-0e79-4d08-8aaf-70a49e243897-kube-api-access-znkjx\") pod \"redhat-marketplace-xxvss\" (UID: \"861fedaf-0e79-4d08-8aaf-70a49e243897\") " pod="openshift-marketplace/redhat-marketplace-xxvss" Dec 03 00:47:37 crc kubenswrapper[4903]: I1203 00:47:37.704251 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xxvss" Dec 03 00:47:38 crc kubenswrapper[4903]: I1203 00:47:38.273100 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xxvss"] Dec 03 00:47:39 crc kubenswrapper[4903]: I1203 00:47:39.176536 4903 generic.go:334] "Generic (PLEG): container finished" podID="861fedaf-0e79-4d08-8aaf-70a49e243897" containerID="12b365f82e9a601c7b1d3e02d42e27ad97076a7f2abf6d3f437848c60c9c29a1" exitCode=0 Dec 03 00:47:39 crc kubenswrapper[4903]: I1203 00:47:39.176587 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xxvss" event={"ID":"861fedaf-0e79-4d08-8aaf-70a49e243897","Type":"ContainerDied","Data":"12b365f82e9a601c7b1d3e02d42e27ad97076a7f2abf6d3f437848c60c9c29a1"} Dec 03 00:47:39 crc kubenswrapper[4903]: I1203 00:47:39.176991 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xxvss" event={"ID":"861fedaf-0e79-4d08-8aaf-70a49e243897","Type":"ContainerStarted","Data":"ce53d5b4a1ccb5a4397e387447bf35777cfbe05c27e7b5b69540a8f0f573139b"} Dec 03 00:47:40 crc kubenswrapper[4903]: I1203 00:47:40.192355 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xxvss" event={"ID":"861fedaf-0e79-4d08-8aaf-70a49e243897","Type":"ContainerStarted","Data":"78b6ec25d35f88a83b8592b4ee734b4955da4ef0c11f4e9d92cb189b31a70d66"} Dec 03 00:47:41 crc kubenswrapper[4903]: I1203 00:47:41.203730 4903 generic.go:334] "Generic (PLEG): container finished" podID="861fedaf-0e79-4d08-8aaf-70a49e243897" containerID="78b6ec25d35f88a83b8592b4ee734b4955da4ef0c11f4e9d92cb189b31a70d66" exitCode=0 Dec 03 00:47:41 crc kubenswrapper[4903]: I1203 00:47:41.203844 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xxvss" event={"ID":"861fedaf-0e79-4d08-8aaf-70a49e243897","Type":"ContainerDied","Data":"78b6ec25d35f88a83b8592b4ee734b4955da4ef0c11f4e9d92cb189b31a70d66"} Dec 03 00:47:42 crc kubenswrapper[4903]: I1203 00:47:42.217983 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xxvss" event={"ID":"861fedaf-0e79-4d08-8aaf-70a49e243897","Type":"ContainerStarted","Data":"636ad0403fb60a81393b9e87c80c01294c0fbe5d83e9bf1f91a91c0aa9e1794d"} Dec 03 00:47:42 crc kubenswrapper[4903]: I1203 00:47:42.271213 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xxvss" podStartSLOduration=2.817539087 podStartE2EDuration="5.27118288s" podCreationTimestamp="2025-12-03 00:47:37 +0000 UTC" firstStartedPulling="2025-12-03 00:47:39.178824293 +0000 UTC m=+6597.887378606" lastFinishedPulling="2025-12-03 00:47:41.632468076 +0000 UTC m=+6600.341022399" observedRunningTime="2025-12-03 00:47:42.252338955 +0000 UTC m=+6600.960893248" watchObservedRunningTime="2025-12-03 00:47:42.27118288 +0000 UTC m=+6600.979737173" Dec 03 00:47:43 crc kubenswrapper[4903]: I1203 00:47:43.612582 4903 scope.go:117] "RemoveContainer" 
containerID="b09207d5035946cdfbdbe4c7a2f34bdcfece73162faa7631afa02b7b1a1f24e3" Dec 03 00:47:43 crc kubenswrapper[4903]: E1203 00:47:43.613222 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:47:47 crc kubenswrapper[4903]: I1203 00:47:47.704473 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xxvss" Dec 03 00:47:47 crc kubenswrapper[4903]: I1203 00:47:47.705554 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xxvss" Dec 03 00:47:47 crc kubenswrapper[4903]: I1203 00:47:47.777069 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xxvss" Dec 03 00:47:48 crc kubenswrapper[4903]: I1203 00:47:48.370936 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xxvss" Dec 03 00:47:48 crc kubenswrapper[4903]: I1203 00:47:48.458568 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xxvss"] Dec 03 00:47:50 crc kubenswrapper[4903]: I1203 00:47:50.309601 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xxvss" podUID="861fedaf-0e79-4d08-8aaf-70a49e243897" containerName="registry-server" containerID="cri-o://636ad0403fb60a81393b9e87c80c01294c0fbe5d83e9bf1f91a91c0aa9e1794d" gracePeriod=2 Dec 03 00:47:50 crc kubenswrapper[4903]: I1203 00:47:50.819918 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xxvss" Dec 03 00:47:50 crc kubenswrapper[4903]: I1203 00:47:50.913988 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znkjx\" (UniqueName: \"kubernetes.io/projected/861fedaf-0e79-4d08-8aaf-70a49e243897-kube-api-access-znkjx\") pod \"861fedaf-0e79-4d08-8aaf-70a49e243897\" (UID: \"861fedaf-0e79-4d08-8aaf-70a49e243897\") " Dec 03 00:47:50 crc kubenswrapper[4903]: I1203 00:47:50.914483 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/861fedaf-0e79-4d08-8aaf-70a49e243897-utilities\") pod \"861fedaf-0e79-4d08-8aaf-70a49e243897\" (UID: \"861fedaf-0e79-4d08-8aaf-70a49e243897\") " Dec 03 00:47:50 crc kubenswrapper[4903]: I1203 00:47:50.914542 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/861fedaf-0e79-4d08-8aaf-70a49e243897-catalog-content\") pod \"861fedaf-0e79-4d08-8aaf-70a49e243897\" (UID: \"861fedaf-0e79-4d08-8aaf-70a49e243897\") " Dec 03 00:47:50 crc kubenswrapper[4903]: I1203 00:47:50.915480 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/861fedaf-0e79-4d08-8aaf-70a49e243897-utilities" (OuterVolumeSpecName: "utilities") pod "861fedaf-0e79-4d08-8aaf-70a49e243897" (UID: "861fedaf-0e79-4d08-8aaf-70a49e243897"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:47:50 crc kubenswrapper[4903]: I1203 00:47:50.928874 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/861fedaf-0e79-4d08-8aaf-70a49e243897-kube-api-access-znkjx" (OuterVolumeSpecName: "kube-api-access-znkjx") pod "861fedaf-0e79-4d08-8aaf-70a49e243897" (UID: "861fedaf-0e79-4d08-8aaf-70a49e243897"). InnerVolumeSpecName "kube-api-access-znkjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 00:47:50 crc kubenswrapper[4903]: I1203 00:47:50.935358 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/861fedaf-0e79-4d08-8aaf-70a49e243897-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "861fedaf-0e79-4d08-8aaf-70a49e243897" (UID: "861fedaf-0e79-4d08-8aaf-70a49e243897"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 00:47:51 crc kubenswrapper[4903]: I1203 00:47:51.015923 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/861fedaf-0e79-4d08-8aaf-70a49e243897-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 00:47:51 crc kubenswrapper[4903]: I1203 00:47:51.015959 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/861fedaf-0e79-4d08-8aaf-70a49e243897-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 00:47:51 crc kubenswrapper[4903]: I1203 00:47:51.015974 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znkjx\" (UniqueName: \"kubernetes.io/projected/861fedaf-0e79-4d08-8aaf-70a49e243897-kube-api-access-znkjx\") on node \"crc\" DevicePath \"\"" Dec 03 00:47:51 crc kubenswrapper[4903]: I1203 00:47:51.325985 4903 generic.go:334] "Generic (PLEG): container finished" podID="861fedaf-0e79-4d08-8aaf-70a49e243897" containerID="636ad0403fb60a81393b9e87c80c01294c0fbe5d83e9bf1f91a91c0aa9e1794d" exitCode=0 Dec 03 00:47:51 crc kubenswrapper[4903]: I1203 00:47:51.326058 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xxvss" Dec 03 00:47:51 crc kubenswrapper[4903]: I1203 00:47:51.326138 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xxvss" event={"ID":"861fedaf-0e79-4d08-8aaf-70a49e243897","Type":"ContainerDied","Data":"636ad0403fb60a81393b9e87c80c01294c0fbe5d83e9bf1f91a91c0aa9e1794d"} Dec 03 00:47:51 crc kubenswrapper[4903]: I1203 00:47:51.326217 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xxvss" event={"ID":"861fedaf-0e79-4d08-8aaf-70a49e243897","Type":"ContainerDied","Data":"ce53d5b4a1ccb5a4397e387447bf35777cfbe05c27e7b5b69540a8f0f573139b"} Dec 03 00:47:51 crc kubenswrapper[4903]: I1203 00:47:51.326262 4903 scope.go:117] "RemoveContainer" containerID="636ad0403fb60a81393b9e87c80c01294c0fbe5d83e9bf1f91a91c0aa9e1794d" Dec 03 00:47:51 crc kubenswrapper[4903]: I1203 00:47:51.370976 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xxvss"] Dec 03 00:47:51 crc kubenswrapper[4903]: I1203 00:47:51.371870 4903 scope.go:117] "RemoveContainer" containerID="78b6ec25d35f88a83b8592b4ee734b4955da4ef0c11f4e9d92cb189b31a70d66" Dec 03 00:47:51 crc kubenswrapper[4903]: I1203 00:47:51.442011 4903 scope.go:117] "RemoveContainer" containerID="12b365f82e9a601c7b1d3e02d42e27ad97076a7f2abf6d3f437848c60c9c29a1" Dec 03 00:47:51 crc kubenswrapper[4903]: I1203 00:47:51.442547 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xxvss"] Dec 03 00:47:51 crc kubenswrapper[4903]: I1203 00:47:51.497012 4903 scope.go:117] "RemoveContainer" containerID="636ad0403fb60a81393b9e87c80c01294c0fbe5d83e9bf1f91a91c0aa9e1794d" Dec 03 00:47:51 crc kubenswrapper[4903]: E1203 00:47:51.497595 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"636ad0403fb60a81393b9e87c80c01294c0fbe5d83e9bf1f91a91c0aa9e1794d\": container with ID starting with 636ad0403fb60a81393b9e87c80c01294c0fbe5d83e9bf1f91a91c0aa9e1794d not found: ID does not exist" containerID="636ad0403fb60a81393b9e87c80c01294c0fbe5d83e9bf1f91a91c0aa9e1794d" Dec 03 00:47:51 crc kubenswrapper[4903]: I1203 00:47:51.497645 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"636ad0403fb60a81393b9e87c80c01294c0fbe5d83e9bf1f91a91c0aa9e1794d"} err="failed to get container status \"636ad0403fb60a81393b9e87c80c01294c0fbe5d83e9bf1f91a91c0aa9e1794d\": rpc error: code = NotFound desc = could not find container \"636ad0403fb60a81393b9e87c80c01294c0fbe5d83e9bf1f91a91c0aa9e1794d\": container with ID starting with 636ad0403fb60a81393b9e87c80c01294c0fbe5d83e9bf1f91a91c0aa9e1794d not found: ID does not exist" Dec 03 00:47:51 crc kubenswrapper[4903]: I1203 00:47:51.497691 4903 scope.go:117] "RemoveContainer" containerID="78b6ec25d35f88a83b8592b4ee734b4955da4ef0c11f4e9d92cb189b31a70d66" Dec 03 00:47:51 crc kubenswrapper[4903]: E1203 00:47:51.498134 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78b6ec25d35f88a83b8592b4ee734b4955da4ef0c11f4e9d92cb189b31a70d66\": container with ID starting with 78b6ec25d35f88a83b8592b4ee734b4955da4ef0c11f4e9d92cb189b31a70d66 not found: ID does not exist" containerID="78b6ec25d35f88a83b8592b4ee734b4955da4ef0c11f4e9d92cb189b31a70d66" Dec 03 00:47:51 crc kubenswrapper[4903]: I1203 00:47:51.498161 4903 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78b6ec25d35f88a83b8592b4ee734b4955da4ef0c11f4e9d92cb189b31a70d66"} err="failed to get container status \"78b6ec25d35f88a83b8592b4ee734b4955da4ef0c11f4e9d92cb189b31a70d66\": rpc error: code = NotFound desc = could not find container \"78b6ec25d35f88a83b8592b4ee734b4955da4ef0c11f4e9d92cb189b31a70d66\": container with ID starting with 78b6ec25d35f88a83b8592b4ee734b4955da4ef0c11f4e9d92cb189b31a70d66 not found: ID does not exist" Dec 03 00:47:51 crc kubenswrapper[4903]: I1203 00:47:51.498180 4903 scope.go:117] "RemoveContainer" containerID="12b365f82e9a601c7b1d3e02d42e27ad97076a7f2abf6d3f437848c60c9c29a1" Dec 03 00:47:51 crc kubenswrapper[4903]: E1203 00:47:51.498525 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12b365f82e9a601c7b1d3e02d42e27ad97076a7f2abf6d3f437848c60c9c29a1\": container with ID starting with 12b365f82e9a601c7b1d3e02d42e27ad97076a7f2abf6d3f437848c60c9c29a1 not found: ID does not exist" containerID="12b365f82e9a601c7b1d3e02d42e27ad97076a7f2abf6d3f437848c60c9c29a1" Dec 03 00:47:51 crc kubenswrapper[4903]: I1203 00:47:51.498549 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12b365f82e9a601c7b1d3e02d42e27ad97076a7f2abf6d3f437848c60c9c29a1"} err="failed to get container status \"12b365f82e9a601c7b1d3e02d42e27ad97076a7f2abf6d3f437848c60c9c29a1\": rpc error: code = NotFound desc = could not find container \"12b365f82e9a601c7b1d3e02d42e27ad97076a7f2abf6d3f437848c60c9c29a1\": container with ID starting with 12b365f82e9a601c7b1d3e02d42e27ad97076a7f2abf6d3f437848c60c9c29a1 not found: ID does not exist" Dec 03 00:47:51 crc kubenswrapper[4903]: I1203 00:47:51.631393 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="861fedaf-0e79-4d08-8aaf-70a49e243897" path="/var/lib/kubelet/pods/861fedaf-0e79-4d08-8aaf-70a49e243897/volumes" Dec 03 00:47:58 crc kubenswrapper[4903]: I1203 00:47:58.613560 4903 scope.go:117] "RemoveContainer" containerID="b09207d5035946cdfbdbe4c7a2f34bdcfece73162faa7631afa02b7b1a1f24e3" Dec 03 00:47:58 crc kubenswrapper[4903]: E1203 00:47:58.614621 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:48:12 crc kubenswrapper[4903]: I1203 00:48:12.613081 4903 scope.go:117] "RemoveContainer" containerID="b09207d5035946cdfbdbe4c7a2f34bdcfece73162faa7631afa02b7b1a1f24e3" Dec 03 00:48:12 crc kubenswrapper[4903]: E1203 00:48:12.613819 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:48:27 crc kubenswrapper[4903]: I1203 00:48:27.614867 4903 scope.go:117] "RemoveContainer" containerID="b09207d5035946cdfbdbe4c7a2f34bdcfece73162faa7631afa02b7b1a1f24e3" Dec 03 00:48:27 crc 
kubenswrapper[4903]: E1203 00:48:27.616281 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:48:39 crc kubenswrapper[4903]: I1203 00:48:39.613302 4903 scope.go:117] "RemoveContainer" containerID="b09207d5035946cdfbdbe4c7a2f34bdcfece73162faa7631afa02b7b1a1f24e3" Dec 03 00:48:39 crc kubenswrapper[4903]: E1203 00:48:39.614389 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-snl4q_openshift-machine-config-operator(3ef11e3b-7757-4286-9684-6d4cd3bf924f)\"" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" podUID="3ef11e3b-7757-4286-9684-6d4cd3bf924f" Dec 03 00:48:53 crc kubenswrapper[4903]: I1203 00:48:53.612848 4903 scope.go:117] "RemoveContainer" containerID="b09207d5035946cdfbdbe4c7a2f34bdcfece73162faa7631afa02b7b1a1f24e3" Dec 03 00:48:54 crc kubenswrapper[4903]: I1203 00:48:54.104673 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-snl4q" event={"ID":"3ef11e3b-7757-4286-9684-6d4cd3bf924f","Type":"ContainerStarted","Data":"4a762e23ad8d6f19d147583f0fdf7e165b8bca0f1db491a983a99322d2f865bc"}